QTML Keynote and Invited Talks



Keynote Speakers

Jacob Biamonte
Keynote Speaker

Skolkovo Institute of Science and Technology
Russian Federation

Sankar Das Sarma
Keynote Speaker

University of Maryland
USA

Gitta Kutyniok
Keynote Speaker

Ludwig Maximilian University of Munich
Germany

Seth Lloyd
Keynote Speaker

Massachusetts Institute of Technology
USA

Hartmut Neven
Keynote Speaker

Google Quantum AI
USA




Invited Speakers

Shahnawaz Ahmed
Invited Speaker

Chalmers University of Technology
Sweden

Patrick Coles
Invited Speaker

The University of New Mexico
USA

Marcello Dalmonte
Invited Speaker

International Centre for Theoretical Physics
Italy


Talk Titles and Abstracts

Title: Trainability and generalization of quantum machine learning models
Abstract: Quantum machine learning (QML) is a leading proposal for near-term quantum advantage. However, recent progress in understanding the training landscapes of QML models paints a concerning picture. Exponentially vanishing gradients, known as barren plateaus, occur for circuits that are deep, noisy, or highly entangling. On the flip side, some QML architectures are immune to barren plateaus. The dynamical Lie algebra has been connected to both barren plateaus and overparameterization, suggesting that we could use algebraic properties to engineer favorable training landscapes. Training is only half of the story for QML, as good generalization to testing data is needed as well. We have found surprisingly good generalization properties for QML models in two ways: (1) in-distribution generalization is guaranteed when the training data size is roughly equal to the number of model parameters, and (2) out-of-distribution generalization is guaranteed for locally scrambled ensembles, allowing for product-state to entangled-state generalization. In this talk, I will give an overview of our current understanding of both the trainability and generalization of QML models.
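The barren-plateau phenomenon mentioned in the abstract rests on an exponential concentration effect. A minimal numpy sketch (an illustration, not code from the talk): for Haar-random n-qubit states, the variance of a local observable's expectation value decays exponentially with n, so cost-function gradients for sufficiently random circuits are exponentially small.

```python
import numpy as np

def haar_state(dim, rng):
    """Sample a Haar-random pure state of the given dimension."""
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

def z0_expectation(psi, n):
    """<psi| Z_0 |psi> for an n-qubit state, with Z acting on the first qubit."""
    signs = np.where(np.arange(2 ** n) < 2 ** (n - 1), 1.0, -1.0)
    return np.sum(signs * np.abs(psi) ** 2)

rng = np.random.default_rng(42)
variances = {}
for n in (2, 4, 6, 8):
    samples = [z0_expectation(haar_state(2 ** n, rng), n) for _ in range(2000)]
    variances[n] = np.var(samples)
    print(n, variances[n])  # decays roughly like 2^{-n}
```

The sampled variance shrinks roughly as 1/2^n, mirroring how gradients concentrate around zero for deep random parameterized circuits.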
Title: Information scrambling for navigating the learning landscape of a quantum machine learning model
Abstract: In this talk, I will focus on quantum machine learning, particularly the Restricted Boltzmann Machine (RBM), which has emerged as a promising alternative approach leveraging the power of quantum computers. The workhorse of our technique is a shallow neural network encoding the desired state of the system, with the amplitude computed by sampling the Gibbs-Boltzmann distribution using a quantum circuit and the phase information obtained classically from the nonlinear activation of a separate set of neurons. In addition to presenting successful applications to the electronic structure of two-dimensional materials, I will discuss and illustrate how the imaginary components of out-of-time-ordered correlators can be related to conventional measures of correlation such as mutual information. Such an analysis offers important insights into the training dynamics by unraveling how quantum information is scrambled through the network, introducing correlations among its constituent subsystems. This approach not only demystifies the training of quantum machine learning models but can also explicate the capacity of the model.
Title: Combining Gradient Ascent and Feedback Control
Abstract: Optimal control algorithms are essential for improving modern quantum devices. While model-based gradient techniques like GRAPE (gradient-ascent pulse engineering) are powerful tools for efficiently finding control pulses, they are not applicable to feedback scenarios, where the control must depend on measurement results. Conversely, modern model-free reinforcement learning techniques can easily deal with feedback, but they are not very efficient, since they do not make use of our knowledge of the underlying physics model. In this talk, I will present our new approach, termed feedback-GRAPE, which enables us to combine model-based techniques with quantum feedback. I will give examples of several tasks that can be efficiently solved using this new approach.
Title: Introduction to quantum (statistical) learning theory
Abstract: Given the key position of Machine Learning in everyday life, it is crucial to (formally) understand what can be efficiently learned or not. Learning theory is thus the field that studies learning problems from a complexity theory point of view, where we define formal models of learning and prove (in)feasibility results.
With the advent of quantum computing, quantum speedups on learning tasks are a sought-after application. As in the classical setting, we are interested in having a clear picture of what can and cannot be learned efficiently, especially when a quantum advantage in such a setting is achieved. This is the goal of quantum learning theory.
In this tutorial, I will present the field of learning theory, describing the models and main results. In particular, I will focus on quantum statistical learning theory, a model where data is accessed by its statistics.
No previous knowledge of learning theory is expected.
Title: Data mining the output of quantum simulators - from critical behavior to algorithmic complexity
Abstract: Recent experiments with quantum simulators and noisy intermediate-scale quantum devices have demonstrated unparalleled capabilities for probing many-body wave functions, by directly sampling them at the single-quantum level via projective measurements. However, very little is known about how to interpret and analyse such huge datasets. In this talk, I will show how it is possible to characterise many-body quantum hardware via direct, assumption-free data mining. The core idea of this programme is that the output of quantum simulators and computers can be construed as a very high-dimensional manifold. Such a manifold can be characterised via basic topological concepts, in particular its intrinsic dimension. Exploiting state-of-the-art tools in non-parametric learning, I will discuss theoretical results for both classical and quantum many-body spin systems that illustrate how data structures undergo structural transitions whenever the underlying physical system does, and display universal (critical) behavior in both the classical and quantum mechanical cases. I will conclude with remarks on the applicability of our theoretical framework to synthetic quantum systems (quantum simulators and quantum computers), and emphasize its potential to provide a direct, scalable measure of the Kolmogorov complexity of output states.
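One standard non-parametric tool for the intrinsic dimension mentioned in the abstract is the two-nearest-neighbour (TwoNN) estimator; whether the talk uses exactly this estimator is an assumption here. A toy sketch: each point contributes the ratio mu = r2/r1 of its second- to first-nearest-neighbour distances, and the maximum-likelihood dimension estimate is N / sum(log mu).

```python
import numpy as np

def two_nn_dimension(points):
    """TwoNN intrinsic-dimension estimate (Facco et al. style) from a point cloud."""
    sq = (points ** 2).sum(axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2 * points @ points.T, 0.0)
    np.fill_diagonal(d2, np.inf)
    d2.sort(axis=1)
    mu = np.sqrt(d2[:, 1] / d2[:, 0])  # ratio of 2nd to 1st neighbour distance
    return len(points) / np.sum(np.log(mu))

rng = np.random.default_rng(1)
# intrinsically 3-dimensional data embedded in a 10-dimensional ambient space
low = rng.uniform(size=(1500, 3))
data = np.concatenate([low, np.zeros((1500, 7))], axis=1)
estimate = two_nn_dimension(data)
print(estimate)  # close to 3, far below the ambient dimension 10
```

The estimator recovers the manifold dimension rather than the ambient one, which is what makes it useful for high-dimensional measurement-outcome datasets.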
Title: Exploiting machine learning for quantum dynamics and vice versa
Abstract: This talk presents recent theoretical research by our group on the reciprocal link between machine learning and the dynamics of quantum systems. First, we will review how artificial neural networks can be used as variational trial wavefunctions to simulate closed and open quantum systems. Then, we will show how the dynamics of quantum hardware can be exploited to create kernel machines that perform advanced tasks.
Title: Machine learning quantum states and operations: from neural networks to optimization on manifolds
Abstract: Machine learning techniques have found recent applications in quantum tomography. The underlying idea is to use an efficient ansatz to represent a quantum state or process and learn it from data. Neural network architectures from Restricted Boltzmann Machines to Recurrent Neural Networks have been proposed as ansatzes for quantum states. Such ansatzes can be trained using standard gradient-based optimization to directly estimate a quantum state's density matrix or to allow efficient sampling of measurement outcomes. Similar ideas can be applied to learn a quantum process. In this talk, we will discuss several such machine learning methods for quantum state and process tomography. We will elucidate the necessary ingredients for applying machine learning to the tomography problem, from physics-based constraints on the ansatzes to constraints on the training itself, such as gradient descent on a manifold. We will also compare machine learning to existing standard techniques such as maximum likelihood estimation, compressed sensing, and projection-based algorithms, to show how ideas from machine learning can enhance the set of tools for quantum characterization.
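One simple way to realize the physics-based constraints mentioned above is to parametrize the state as rho = A A†/Tr(A A†), which is positive semidefinite with unit trace by construction, so unconstrained gradient descent on A never leaves the set of valid density matrices. A minimal single-qubit numpy sketch (my own illustration, not code from the talk), fitting measured Pauli expectation values by finite-difference gradient descent:

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = (X, Y, Z)

def rho_of(theta):
    """Map 8 real parameters to a valid density matrix rho = A A^dag / Tr."""
    A = (theta[:4] + 1j * theta[4:]).reshape(2, 2)
    M = A @ A.conj().T
    return M / np.trace(M).real

rng = np.random.default_rng(7)
rho_true = rho_of(rng.normal(size=8))                    # hidden mixed state
targets = [np.trace(rho_true @ P).real for P in PAULIS]  # "measured" <X>,<Y>,<Z>

def loss(theta):
    r = rho_of(theta)
    return sum((np.trace(r @ P).real - t) ** 2 for P, t in zip(PAULIS, targets))

theta = rng.normal(size=8)
eps, lr = 1e-6, 0.05
for _ in range(5000):
    grad = np.array([(loss(theta + eps * e) - loss(theta - eps * e)) / (2 * eps)
                     for e in np.eye(8)])
    theta -= lr * grad

rho_hat = rho_of(theta)
print(loss(theta))  # near zero; rho_hat matches rho_true
```

Since a qubit state is uniquely fixed by its three Pauli expectations, driving the loss to zero recovers the hidden state; larger systems would swap the finite-difference loop for autodiff and informationally complete measurements.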
Title: Towards an Artificial Muse for new Ideas in Quantum Physics
Abstract: Artificial intelligence (AI) is a potentially disruptive tool for physics and science in general. One crucial question is how this technology can contribute at a conceptual level to help acquire new scientific understanding or inspire new surprising ideas. I will talk about how AI can be used as an artificial muse in quantum physics, suggesting surprising and unconventional ideas and techniques that the human scientist can interpret, understand and generalize.
[1] Krenn, Kottmann, Tischler, Aspuru-Guzik, Conceptual understanding through efficient automated design of quantum optical experiments. Physical Review X 11(3), 031044 (2021).
[2] Krenn, Pollice, Guo, Aldeghi, Cervera-Lierta, Friederich, Gomes, Hase, Jinich, Nigam, Yao, Aspuru-Guzik, On scientific understanding with artificial intelligence. arXiv:2204.01467 (2022).
[3] Krenn, Zeilinger, Predicting research trends with semantic and neural networks with an application in quantum physics. PNAS 117(4), 1910-1916 (2020).
Title: Attractor Neural Networks: storage capacity and learning
Abstract: One way to understand quantum neural networks is to adapt classical architectures to the quantum regime. Attractor neural networks retrieve stored configurations when applied repeatedly, associating each initial state with the closest stable configuration of the network. The quantum case is obtained by studying which completely positive trace-preserving (CPTP) maps admit the largest number of stationary states. I will show that, in this setting, the attractor associated with an arbitrary input state is the one minimizing the relative entropy to it. We study the storage capacity of the resulting quantum neural networks and discuss why they outperform their classical counterparts.
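The classical starting point of the talk can be sketched in a few lines (a standard Hopfield network; illustration only, not the speaker's code): patterns stored by the Hebbian rule become fixed points, and iterating the network maps a corrupted input back to the nearest stored configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 120, 3                          # neurons, stored patterns (P << N)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian storage: W = (1/N) sum_mu xi_mu xi_mu^T, with no self-connections
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

def retrieve(state, steps=20):
    """Iterate the deterministic update s <- sign(W s) toward a fixed point."""
    for _ in range(steps):
        state = np.where(W @ state >= 0, 1, -1)
    return state

# corrupt 15 bits of the first pattern, then let the network clean it up
corrupted = patterns[0].copy()
flip = rng.choice(N, size=15, replace=False)
corrupted[flip] *= -1
recovered = retrieve(corrupted)
print(np.array_equal(recovered, patterns[0]))  # True
```

Well below the classical capacity limit (about 0.14 N patterns), retrieval is essentially perfect; the quantum generalization in the talk replaces this sign-update dynamics with CPTP maps and Hamming distance with relative entropy.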