Speakers
Program
TBD
Special Events
TBD
Map of the Venue
TBD
Talks

Abbas
Google Quantum AI, California, USA
History, status and future of quantum algorithms and heuristics
I provide a history, the current status, and the expected future of quantum algorithms and quantum heuristics (such as quantum annealing and variational quantum algorithms), including the potential for quantum advantage and the quantum-hardware requirements for achieving it.
Surviving as a Classical Learner in a Quantum World
Computational learning theory, and particularly the framework of Probably Approximately Correct (PAC) learning, provides a rigorous mathematical foundation for understanding how algorithms can generalize from finite training data. In PAC learning, we seek to determine whether, with high probability, a learning algorithm can produce a hypothesis that is close (within some error bound) to the true concept being learned. By analyzing these conditions — such as the size of the training sample, the complexity of the hypothesis space, and the error tolerance — we can derive theoretical guarantees on the feasibility and efficiency of learning tasks. This guides the design of more robust learning algorithms and also sheds light on why some problems are inherently harder to learn than others.
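For orientation, the standard sample-complexity bound for a finite hypothesis class in the realizable PAC setting (a textbook result, included here only as an illustration, not a claim about the tutorial's content) states that any learner returning a hypothesis consistent with

m \ge \frac{1}{\varepsilon}\left(\ln|\mathcal{H}| + \ln\frac{1}{\delta}\right)

training examples outputs, with probability at least 1 - \delta, a hypothesis whose true error is at most \varepsilon.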
Quantum PAC learning extends the classical PAC framework by allowing the learner to receive or manipulate data in quantum form (e.g., using quantum examples or quantum queries). By taking advantage of the properties of quantum information, quantum PAC learning can, in some cases, reduce the sample or time complexity of certain learning problems compared to their classical counterparts. Quantum PAC learning studies both whether and how quantum resources can provide speedups or sample-efficiency improvements for standard learning tasks, providing insights into the power and limitations of quantum models for generalization. In my tutorial I will provide a gentle introduction to the basics of classical learning theory and its quantum generalization, and survey some of the most recent advances in this field.
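For readers unfamiliar with the model, a quantum example for a concept c over a distribution D is conventionally defined as the superposition

\sum_{x} \sqrt{D(x)}\, |x,\, c(x)\rangle ,

and measuring it in the computational basis reproduces an ordinary classical example (x, c(x)) drawn from D, while a quantum learner may instead process such states coherently.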
In Search of Quantum Neural Networks
Biological neural networks underpin our cognition, creativity, and ability to find meaning. Artificial neural networks, inspired by their biological counterparts, have transformed computing, extending human capabilities in ways we are only beginning to understand. Quantum neural networks (QNNs) emerge at the intersection of quantum mechanics and machine learning, prompting us to ask whether they can push these boundaries even further.
While classical neural networks dominate machine learning, their foundations are not without limitations. As quantum computing advances, the search for quantum neural networks has sparked theoretical and experimental efforts. Can QNNs offer a fundamental advantage over classical models, or are they merely a reformulation in disguise? What principles should govern their design? And most crucially—what problems could they solve that classical AI cannot?
This lecture will explore the motivations behind QNNs, examine current proposals, and discuss the challenges of constructing meaningful quantum learning architectures. Along the way, we will revisit the original inspiration behind neural networks and question whether quantum mechanics offers a fundamentally new paradigm for intelligence.
Classical data in quantum computers: methods and applications
Loading and representing classical data on a quantum computer is a critical subroutine in writing quantum algorithms with provable speedups over classical ones.
We begin by examining key methods for encoding classical data into quantum form, including binary, amplitude, block, angle, and graph encodings. Next, we introduce a formal model of quantum computation that incorporates a quantum memory device—capable of querying classical data in superposition—allowing a precise definition of quantum runtime.
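As a minimal illustration of two of these encodings (a plain-NumPy sketch written for this summary, not code from the lecture): amplitude encoding stores a normalized data vector directly in the amplitudes of an n-qubit state, while angle encoding maps each feature to a single-qubit rotation.

    import numpy as np

    def amplitude_encode(x):
        """Map a length-2^n real vector to the amplitudes of an n-qubit state."""
        x = np.asarray(x, dtype=float)
        norm = np.linalg.norm(x)
        if norm == 0:
            raise ValueError("cannot encode the zero vector")
        return x / norm  # state vector with i-th amplitude proportional to x_i

    def angle_encode(x):
        """Map each feature x_j to a single-qubit state cos(x_j)|0> + sin(x_j)|1>;
        the full register is the tensor product of these qubits."""
        qubits = [np.array([np.cos(xj), np.sin(xj)]) for xj in x]
        state = qubits[0]
        for q in qubits[1:]:
            state = np.kron(state, q)
        return state

    # A 4-dimensional data point needs only 2 qubits under amplitude encoding
    print(amplitude_encode([3.0, 0.0, 4.0, 0.0]))   # [0.6 0.  0.8 0. ]
    print(angle_encode([0.0, np.pi / 2]).round(3))  # [0. 1. 0. 0.], i.e. the state |01>

The trade-off the two sketches hint at is standard: amplitude encoding is qubit-efficient (n qubits for 2^n features) but its preparation circuit can be costly, whereas angle encoding uses one qubit per feature with shallow circuits.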
We then delve into circuit implementations of quantum memory devices, such as bucket-brigade QRAM, state preparation circuits, and others, discussing their properties and trade-offs.
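As a point of reference, the quantum memory device above is commonly abstracted as a unitary query oracle acting in superposition (the bucket-brigade construction is one proposed circuit realization of it):

O_D : \sum_i \alpha_i\, |i\rangle |0\rangle \;\mapsto\; \sum_i \alpha_i\, |i\rangle |x_i\rangle ,

where x_i denotes the i-th entry of the classical data set; quantum runtime can then be counted in terms of such queries together with elementary gates.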
Finally, we conclude with an overview of quantum machine learning algorithms that leverage this computational model and highlight open research challenges.
From Quantum Information to Machine Learning: A Quantum-Inspired Approach – Bridging Theory and Applications
In this lecture, we will illustrate how quantum information theory can benefit machine learning—even without the need for quantum computers. In particular, we will discuss what is known as Quantum-inspired Machine Learning (QiML). While quantum algorithms can significantly accelerate computation, the advantage of the QiML approach lies not in reducing computing time but in improving result accuracy. We will focus specifically on the problem of supervised classification, demonstrating how drawing inspiration from quantum state discrimination can lead to excellent performance in both binary and multiclass scenarios. After an initial theoretical section, we will explore various practical applications. First, we will present an application of a quantum-inspired classifier in the biomedical field; next, we will show how, in principle, the quantum-inspired approach can be implemented on a quantum computer—combining two benefits: enhanced classification accuracy and accelerated algorithmic processing. Finally, we will demonstrate that the quantum-inspired approach is also extremely promising for classifying quantum properties, such as entanglement, separability, and non-locality.
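To make the idea concrete, here is a minimal sketch (my own illustration of a nearest-centroid variant of the quantum-inspired approach; the lecture's actual method may differ): each feature vector is encoded as a pure-state density matrix, class centroids are averages of these matrices, and a new point is assigned to the class whose centroid has the largest overlap with it.

    import numpy as np

    def to_density_matrix(x):
        """Encode a feature vector as the density matrix |psi><psi| of its normalized state."""
        psi = np.asarray(x, dtype=float)
        psi = psi / np.linalg.norm(psi)
        return np.outer(psi, psi)

    def fit_centroids(X, y):
        """Average the density matrices of each class into a mixed-state 'centroid'."""
        centroids = {}
        for label in np.unique(y):
            mats = [to_density_matrix(x) for x, lab in zip(X, y) if lab == label]
            centroids[label] = np.mean(mats, axis=0)
        return centroids

    def predict(x, centroids):
        """Assign the class whose centroid maximizes the overlap Tr(rho_class @ |psi><psi|)."""
        rho_x = to_density_matrix(x)
        scores = {label: np.trace(rho_c @ rho_x) for label, rho_c in centroids.items()}
        return max(scores, key=scores.get)

    # Toy usage: two well-separated classes in 2D
    X = np.array([[1.0, 0.1], [0.9, 0.2], [0.1, 1.0], [0.2, 0.9]])
    y = np.array([0, 0, 1, 1])
    centroids = fit_centroids(X, y)
    print(predict(np.array([1.0, 0.0]), centroids))  # expected: 0
    print(predict(np.array([0.0, 1.0]), centroids))  # expected: 1

The trace overlap plays the role of a fidelity-like score between the test point and each class centroid; quantum-inspired classifiers in the literature refine this basic idea with more sophisticated encodings and state-discrimination measures.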
Quantum Approximate Optimization
After a short introduction to the subject, we discuss known results on QAOA theory and numerics from numerous groups around the world and review the most recent modifications to this heuristic variational algorithm that show promising performance.
We discuss implementation challenges, benchmarking, and ideas for scaling both pre- and post-fault-tolerance.
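For readers new to the topic, the variational algorithm referred to above prepares, at depth p, the parameterized state (standard definition, included only for orientation)

|\boldsymbol{\gamma}, \boldsymbol{\beta}\rangle \;=\; e^{-i\beta_p H_M} e^{-i\gamma_p H_C} \cdots e^{-i\beta_1 H_M} e^{-i\gamma_1 H_C}\, |+\rangle^{\otimes n} ,

where H_C encodes the cost function of the optimization problem and H_M = \sum_j X_j is the standard mixing Hamiltonian; the 2p angles are tuned by a classical outer loop to extremize the expectation \langle\boldsymbol{\gamma},\boldsymbol{\beta}|\, H_C \,|\boldsymbol{\gamma},\boldsymbol{\beta}\rangle.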