Speaker
Mr. Guillaume Verdon (Institute for Quantum Computing)
Description
In recent months, the field of Quantum Machine Learning (QML) has seen
numerous advances and rapidly growing interest from academia and
industry alike. Recent works have focused on a particular class of QML
algorithms, the so-called quantum variational algorithms (often called
quantum neural networks), where an optimization over a set of
parametrized quantum circuit ansätze is performed to learn
certain quantum states or quantum transformations. The explicit
connection between these quantum parametric circuits and neural
networks from classical deep learning has so far remained elusive. In
this talk, we will establish how to port classical neural networks
to quantum parametric circuits, and we will further introduce
a quantum-native backpropagation principle, sketched below, which can be
leveraged to train any quantum parametric network. We will present two main quantum
optimizers leveraging this quantum backpropagation principle: Quantum
Dynamical Descent (QDD), which uses quantum-coherent dynamics to
optimize network parameters, and Momentum Measurement Gradient Descent
(MoMGrad), which is a quantum-classical analogue of QDD. We will
briefly cover several applications of QDD/MoMGrad to problems in
quantum information learning, and show how to use these optimizers to
train classical neural networks in a quantum fashion. Furthermore, we
will show how to efficiently train hybrid networks composed of
classical neural networks and quantum parametric circuits, running on
classical and quantum processing units, respectively (a toy hybrid
training loop is sketched below).
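
As a rough sketch of the quantum backpropagation principle described
above, based on our reading of arXiv:1806.09729: suppose the network
parameters are stored in a continuous quantum register \hat{\Phi} with
conjugate momenta \hat{\Pi} (so [\hat{\Phi}_j, \hat{\Pi}_k] =
i\delta_{jk}), and let J denote the cost function and \eta a kick
strength; this notation is our own shorthand, not necessarily the
talk's. A phase kick generated by the cost function then shifts each
momentum by the corresponding gradient component,

    e^{+i\eta J(\hat{\Phi})} \, \hat{\Pi}_j \, e^{-i\eta J(\hat{\Phi})}
        = \hat{\Pi}_j - \eta \, \partial_{\Phi_j} J(\hat{\Phi}),

exactly, since the commutator series terminates after the first term.
Measuring the kicked momenta yields a classical gradient estimate (the
quantum-classical route of MoMGrad), while alternating such kicks with
kinetic evolution keeps the parameters coherent (the route of QDD).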
Talk based on [\arXiv{1806.09729}].
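
For readers new to the variational setup described above, here is a
minimal, self-contained sketch of a hybrid quantum-classical training
loop in Python. The one-parameter circuit RY(theta)|0>, the cost <Z>,
and the parameter-shift rule used here are illustrative assumptions,
deliberately simpler than the QDD/MoMGrad optimizers of the talk; in a
real hybrid network the expectation value would be estimated on a
quantum processing unit while the parameter update runs classically.

    import numpy as np

    # Toy variational circuit: |psi(theta)> = RY(theta)|0>.
    # The cost is the expectation <psi|Z|psi> = cos(theta), minimized at theta = pi.

    def expectation_z(theta: float) -> float:
        """Cost: <psi(theta)|Z|psi(theta)> for |psi(theta)> = RY(theta)|0>."""
        return np.cos(theta)

    def parameter_shift_grad(theta: float) -> float:
        """Exact gradient from two shifted circuit evaluations (parameter-shift rule)."""
        s = np.pi / 2
        return 0.5 * (expectation_z(theta + s) - expectation_z(theta - s))

    theta, lr = 0.1, 0.4     # initial parameter and learning rate
    for _ in range(50):      # classical gradient-descent outer loop
        theta -= lr * parameter_shift_grad(theta)

    print(f"theta = {theta:.4f}, cost <Z> = {expectation_z(theta):.4f}")
    # Converges to theta ~ pi with <Z> ~ -1.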