Speakers

Talks

Barry Sanders
University of Calgary, Canada

History, status and future of quantum algorithms and heuristics

This talk presents a historical overview, current developments, and future directions for quantum algorithms and quantum heuristics, including quantum annealing and variational quantum algorithms. The discussion covers prospects for quantum advantage and outlines the hardware requirements quantum computers will need to achieve it.

Sergii Strelchuk
University of Oxford, England

Surviving as a Classical Learner in a Quantum World

Computational learning theory—and, in particular, the Probably Approximately Correct (PAC) model—supplies a rigorous mathematical framework for assessing how well algorithms can generalise from finite data. PAC theory asks whether a learner can, with high probability, return a hypothesis whose error falls within a specified tolerance of the target concept. These guarantees inform the design of more reliable, data-efficient algorithms and delineate the fundamental limits of learnability. In this tutorial, we’ll take an intuitive, ground-up tour of classical PAC learning—beginning with its real-world motivations and ending with its core definitions and guarantees. Then, we’ll investigate how quantum PAC learning pushes the framework into a regime where quantum resources may slash sample or runtime requirements. Along the way, we’ll discuss where quantum speed-ups are plausible and where classical methods are the best we can do, building up to the next tutorial, which will introduce the rich world of quantum computational learning tasks and the pursuit of quantum advantage.
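
For readers who want the formal statement before the tutorial, one standard (realizable) formulation of the PAC guarantee is sketched below in LaTeX; the notation is illustrative and is not taken from the talk.

% A concept class C over domain X is PAC-learnable if there is a learner that, for every
% target concept c in C, every distribution D on X, and every epsilon, delta in (0,1),
% outputs after polynomially many labelled samples a hypothesis h satisfying
\[
  \Pr\bigl[\operatorname{err}_D(h) \le \varepsilon\bigr] \ge 1 - \delta,
  \qquad
  \operatorname{err}_D(h) := \Pr_{x \sim D}\bigl[h(x) \neq c(x)\bigr].
\]
% For a finite hypothesis class H containing c, drawing
% m >= (1/epsilon) (ln|H| + ln(1/delta))
% i.i.d. examples already suffices for empirical risk minimization to meet this guarantee.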

Amira Abbas
Google Quantum AI, California, USA

The Importance of Quantum Learning Theory

As motivated in the previous tutorial (see abstract by Sergii Strelchuk), the Probably Approximately Correct (PAC) learning framework provides a rigorous theoretical foundation for understanding the learnability of concepts from data. In fact, much of the existing theory for machine learning rests on a subtle extension of this fundamental model, called agnostic learning, which removes some of the unrealistic assumptions imposed by the PAC framework. But designing an analogous quantum agnostic/PAC framework is somewhat nuanced and could vastly influence any learning advantage attainable by a quantum algorithm. In this tutorial, we will review various proposals for quantum learning tasks. In doing so, we will go through examples of known learning separations and explore open research directions in quantum learning theory.
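
For concreteness, the agnostic relaxation mentioned above can be sketched as follows; again the notation is chosen here for illustration rather than taken from the tutorial.

% In the agnostic model no target concept is assumed: examples (x, y) are drawn from an
% arbitrary joint distribution D, and the learner must compete with the best hypothesis
% in its class H, i.e. output h such that
\[
  \Pr\Bigl[\operatorname{err}_D(h) \le \min_{h' \in H} \operatorname{err}_D(h') + \varepsilon\Bigr] \ge 1 - \delta,
  \qquad
  \operatorname{err}_D(h) := \Pr_{(x,y) \sim D}\bigl[h(x) \neq y\bigr].
\]
% The quantum variants discussed in this tutorial change what a "sample" is (e.g. copies of a
% quantum example state) and how the learner may access it, which is where the subtleties arise.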

Davide Venturelli
USRA Research Institute for Advanced Computer Science, USA

Quantum Approximate Optimization

After a short introduction to the subject, we discuss known results on QAOA theory and numerics from numerous groups around the world and review the most recent modifications to this heuristic variational algorithm that show promising performance.
We also discuss implementation challenges, benchmarking, and ideas for scaling in both the pre- and post-fault-tolerance eras.
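
As a quick reminder of the object under study, the level-p QAOA ansatz for a cost Hamiltonian can be written as follows; this is the standard form from the literature, with notation chosen here for illustration.

% Level-p QAOA: starting from the uniform superposition, alternate the cost Hamiltonian H_C
% and a mixer H_M (typically the sum of single-qubit X operators), then optimize the 2p angles
% in a classical outer loop that only queries the quantum device for expectation values.
\[
  |\gamma, \beta\rangle
    = \prod_{k=1}^{p} e^{-i \beta_k H_M} \, e^{-i \gamma_k H_C} \; |{+}\rangle^{\otimes n},
  \qquad
  (\gamma^{*}, \beta^{*}) = \arg\max_{\gamma, \beta} \, \langle \gamma, \beta | H_C | \gamma, \beta \rangle .
\]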

Nana Liu
Shanghai Jiao Tong University, China

Interplay between differential equations and quantum machine learning

More details coming soon.

Giuseppe Sergioli
University of Cagliari, Italy

From Quantum Information to Machine Learning: A Quantum-Inspired Approach – Bridging Theory and Applications

In this lecture, we will illustrate how quantum information theory can benefit machine learning—even without the need for quantum computers. In particular, we will discuss what is known as Quantum-inspired Machine Learning (QiML). While quantum algorithms can significantly accelerate computation, the advantage of the QiML approach lies not in reducing computing time but in improving result accuracy. We will focus specifically on the problem of supervised classification, demonstrating how drawing inspiration from quantum state discrimination can lead to excellent performance in both binary and multiclass scenarios. After an initial theoretical section, we will explore various practical applications. First, we will present an application of a quantum-inspired classifier in the biomedical field; next, we will show how, in principle, the quantum-inspired approach can be implemented on a quantum computer—combining two benefits: enhanced classification accuracy and accelerated algorithmic processing. Finally, we will demonstrate that the quantum-inspired approach is also extremely promising for classifying quantum properties, such as entanglement, separability, and non-locality.
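
The state-discrimination result that quantum-inspired classifiers of this kind typically build on is the Helstrom bound; a standard statement (not taken verbatim from the lecture) is sketched below.

% Helstrom bound: when a state is promised to be rho_0 or rho_1 with prior probabilities
% p_0 and p_1, the minimum achievable error probability over all measurements is
\[
  P_{\mathrm{err}}^{\min} = \tfrac{1}{2}\Bigl(1 - \bigl\| p_0 \rho_0 - p_1 \rho_1 \bigr\|_1\Bigr),
\]
% where \|\cdot\|_1 is the trace norm. A binary quantum-inspired classifier can encode each
% class as a density matrix built from the training data and use the optimal (Helstrom)
% measurement as its decision rule.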

Lirandë Pira
Centre for Quantum Technologies, Singapore

In Search of Quantum Neural Networks

Biological neural networks underpin our cognition, creativity, and ability to find meaning. Artificial neural networks, inspired by their biological counterparts, have transformed computing, extending human capabilities in ways we are only beginning to understand. Quantum neural networks (QNNs) emerge at the intersection of quantum mechanics and machine learning, prompting us to ask whether they can push these boundaries even further.
While classical neural networks dominate machine learning, their foundations are not without limitations. As quantum computing advances, the search for quantum neural networks has sparked theoretical and experimental efforts. Can QNNs offer a fundamental advantage over classical models, or are they merely a reformulation in disguise? What principles should govern their design? And most crucially—what problems could they solve that classical AI cannot?
This lecture will explore the motivations behind QNNs, examine current proposals, and discuss the challenges of constructing meaningful quantum learning architectures. Along the way, we will revisit the original inspiration behind neural networks and question whether quantum mechanics offers a fundamentally new paradigm for intelligence.

Alessandro Luongo
Centre for Quantum Technologies, Singapore

Classical data in quantum computers: methods and applications

Loading and representing classical data on a quantum computer is a critical subroutine in quantum algorithms that aim for provable speedups over their classical counterparts.
We begin by examining key methods for encoding classical data into quantum form, including binary, amplitude, block, angle, and graph encodings. Next, we introduce a formal model of quantum computation that incorporates a quantum memory device—capable of querying classical data in superposition—allowing a precise definition of quantum runtime.
We then delve into circuit implementations of quantum memory devices, such as bucket-brigade QRAM, state preparation circuits, and others, discussing their properties and trade-offs.
Finally, we conclude with an overview of quantum machine learning algorithms that leverage this computational model and highlight open research challenges.
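
As a small, self-contained illustration of one of the encodings listed above, the Python sketch below maps a classical vector to the amplitude vector of an n-qubit state (amplitude encoding). It is a minimal NumPy illustration written for this page, not code from the lecture.

import numpy as np

def amplitude_encode(x):
    # Amplitude encoding: pad x to the next power of two and normalise it, so the
    # result can serve as the amplitudes of a state on n = ceil(log2(len(x))) qubits,
    # |psi_x> = (1/||x||) * sum_i x_i |i>.
    x = np.asarray(x, dtype=float)
    dim = 1 << max(1, int(np.ceil(np.log2(len(x)))))
    padded = np.zeros(dim)
    padded[:len(x)] = x
    norm = np.linalg.norm(padded)
    if norm == 0:
        raise ValueError("cannot encode the all-zero vector")
    return padded / norm

# Example: a 3-dimensional data point becomes a 2-qubit amplitude vector.
print(amplitude_encode([3.0, 4.0, 0.0]))  # -> [0.6 0.8 0.  0. ]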

Program

TBD

Special Events

TBD

Map of the Venue

TBD