Near-term Applications of Quantum Computing
From Wednesday, 6 December 2017 (08:00) to Thursday, 7 December 2017 (19:30)
Wednesday, 6 December 2017
08:40 - 09:00
Welcome and Introduction
Marcela Carena (Fermilab), Joseph Lykken (Fermilab)
Room: Curia II
09:00 - 10:00
Quantum Computing Testbed Approaches
James Amundson (Fermilab)
Room: Curia II
Until recently, the term “applied quantum computing” was best used as an answer to the question “What is a good example of an oxymoron?” Now, however, quantum computing hardware with significant capabilities is on the very near horizon. I describe how we at Fermilab are taking a testbed approach to applied quantum computing. Even though the killer application for quantum computing in high energy physics has yet to be developed, I describe the steps we are taking toward identifying and implementing quantum solutions to high energy physics problems.
10:00 - 11:00
Machine Learning of a Higgs Decay Classifier via Quantum Annealing
Joshua Job (University of Southern California)
Room: Curia II
In this talk, we describe how we mapped a Higgs-signal-versus-background machine-learning optimization problem to the problem of finding the ground state of a corresponding Ising spin model, and solved it with quantum and classical annealing (probabilistic techniques for approximating the global minimum of a given function). We build a set of weak classifiers based on the kinematic observables of the Higgs decay photons, which we then use to construct a strong classifier. This strong classifier is highly resilient against overtraining and against errors in the correlations of the physical observables in the training data, which may result from the use of event generators in high-energy physics. We show that the resulting quantum and classical annealing-based classifier systems perform comparably to the state-of-the-art machine learning methods that are currently used in particle physics for this test case. However, in contrast to those methods, the annealing-based classifiers are simple functions of directly interpretable experimental parameters with clear physical meaning. The annealer-trained classifiers demonstrate some advantage over traditional machine learning methods for small training datasets. Given the relative simplicity of the algorithm and its robustness to error, this technique may find application in other areas of experimental particle physics, such as real-time decision making in event-selection problems. It also provides a proof of principle for future work on machine-learning applications of quantum and digital annealing machines.
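The mapping described in the abstract can be sketched end to end with classical simulated annealing standing in for the quantum hardware. This is a minimal illustration, not the speakers' code: the toy data, the choice of Ising fields and couplings, and the cooling schedule are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n_events labeled signal (+1) or background (-1), and n_weak
# weak classifiers that each output +/-1 per event (noisy label copies).
n_events, n_weak = 200, 8
labels = rng.choice([-1, 1], size=n_events)
weak = np.array([labels * rng.choice([1, -1], size=n_events, p=[0.7, 0.3])
                 for _ in range(n_weak)])

# Ising encoding: spin s_i = +1 means "include weak classifier i".
# Fields h reward correlation with the label; couplings J penalize
# selecting redundant (mutually correlated) classifiers.
h = -weak @ labels / n_events
J = (weak @ weak.T) / n_events
np.fill_diagonal(J, 0.0)

# Simulated annealing: single-spin-flip Metropolis with a cooling schedule.
s = rng.choice([-1, 1], size=n_weak)
for T in np.geomspace(2.0, 0.01, 2000):
    i = rng.integers(n_weak)
    dE = -2 * s[i] * (h[i] + J[i] @ s)   # energy change from flipping s_i
    if dE < 0 or rng.random() < np.exp(-dE / T):
        s[i] = -s[i]

# Strong classifier: majority vote over the selected weak classifiers.
chosen = weak[s > 0]
strong = np.sign(chosen.sum(axis=0)) if len(chosen) else np.zeros(n_events)
accuracy = float(np.mean(strong == labels))
print(f"selected {int((s > 0).sum())} of {n_weak}, accuracy {accuracy:.2f}")
```

The same energy function could be handed to a quantum annealer; only the minimization step changes.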
11:00 - 11:15
Break
Room: Art Gallery
11:15 - 12:15
Statistical Analysis of Quantum Computing Experiments
Yazhen Wang (University of Wisconsin, Madison)
Room: Curia II
12:15 - 13:30
Lunch
Room: Fermilab Cafeteria
13:30 - 14:30
Systems and Software for Scientific Discovery with Quantum Computing
Travis Humble (Oak Ridge National Laboratory)
Room: Curia II
We discuss a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while also maintaining existing computing infrastructure. We then elaborate on the role of the host operating system to manage these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for controlling quantum devices.
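The offload pattern described in the abstract can be sketched in a few lines. Everything here (the `Runtime` class, the backend name, the stand-in simulator) is invented for illustration and is not ORNL's actual framework or API.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class Runtime:
    """Host-side runtime that routes tagged kernels to accelerator backends."""
    backends: Dict[str, Callable] = field(default_factory=dict)

    def register(self, name: str, fn: Callable) -> None:
        self.backends[name] = fn

    def offload(self, backend: str, kernel: Callable, *args):
        # A real system would serialize the kernel, schedule it on the
        # device, and return an asynchronous handle; here we just call it.
        return self.backends[backend](kernel, *args)

def toy_simulator(kernel, *args):
    # Stand-in for a quantum device: evaluate the kernel classically.
    return kernel(*args)

rt = Runtime()
rt.register("sim", toy_simulator)
# Classical host code continues as usual; only this call is offloaded.
result = rt.offload("sim", lambda theta: theta ** 2, 0.5)
print(result)  # 0.25
```

The point of the pattern is that the host program and its existing infrastructure are untouched; swapping the simulator for real hardware only changes the registered backend.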
14:30 - 15:30
Software for Large-Scale and Near-Term Quantum Computing
Fred Chong (University of Chicago)
Room: Curia II
In this talk, I will discuss our experiences with developing an open-source tool chain for large-scale quantum computing, and our plans for re-targeting these tools for near-term, small-scale physical prototypes. The Scaffold tools are an extensive set of compilation and resource estimation tools for large-scale quantum computing. Scaffold leverages the LLVM compiler framework, as well as parallel mapping and quantum rotation generation tools. Scaffold was designed for scalability, targeting quantum machines with up to millions of quantum bits. Scaffold has allowed us to explore a range of architectural and compiler issues and has contributed to many other research projects across the world. Our future efforts, however, will focus on specializing Scaffold to target smaller-scale machines. Specifically, we plan to expose more machine features and use deep optimization to help close the gap between practical algorithms and prototype machines.
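To give a flavor of what resource estimation involves, here is a toy gate-count estimator over an invented list-based circuit IR. Scaffold's actual LLVM-based representation and its estimators are far more sophisticated; nothing below is Scaffold's API.

```python
from collections import Counter

# Invented IR: a circuit as a list of (gate, qubit-tuple) pairs.
circuit = [
    ("h", (0,)),
    ("cnot", (0, 1)),
    ("t", (1,)),
    ("cnot", (1, 2)),
    ("h", (2,)),
]

def estimate(circuit):
    gate_counts = Counter(gate for gate, _ in circuit)
    qubits = {q for _, qs in circuit for q in qs}
    # Serial depth as a crude upper bound; a real tool schedules
    # independent gates in parallel and accounts for mapping overhead.
    return {"gates": dict(gate_counts), "qubits": len(qubits),
            "depth": len(circuit)}

print(estimate(circuit))
```

Counts like these (especially of expensive gates such as T) are what determine whether an algorithm fits on a machine with millions of qubits, or on a near-term prototype.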
15:30 - 16:00
Break
Room: Art Gallery
16:00 - 17:00
Colloquium: Adventures in quantum optimization with noisy qubits
Daniel Lidar (University of Southern California)
Room: WH 1 West
Quantum information processing holds great promise, yet large-scale, general purpose “universal” quantum computers capable of solving hard problems are not yet available despite 20+ years of immense worldwide effort. However, special purpose quantum information processors, such as the quantum simulators originally envisioned by Feynman, appear to be within reach. Another type of special purpose quantum information processor is a quantum annealer, designed to speed up the solution to classical optimization problems. In October 2011 USC and Lockheed-Martin jointly founded a quantum computing center housing a commercial quantum annealer built by D-Wave Systems. Starting with 108 qubits, two generations later the current processor at USC has 1098 qubits, and the latest generation deployed elsewhere already has close to 2048 qubits. These processors use superconducting flux qubits to try to find the ground states of Ising spin-glass problems with as many spins as qubits, an NP-hard problem with numerous applications. There has been much controversy surrounding the D-Wave processors, concerning whether they are sufficiently quantum to offer any advantage over classical computing. After introducing quantum annealing I will survey the work we have done to test the D-Wave processors for quantum effects, to test for quantum enhancements by benchmarking against highly optimized classical algorithms, and to perform error correction.
Thursday, 7 December 2017
09:00 - 10:00
Evidence for a Scaling Advantage on a Quantum Annealer
Daniel Lidar (University of Southern California)
Room: Curia II
The observation of an unequivocal quantum speedup remains an elusive objective for quantum computing. In this talk I will present the first, and so far only, example of a scaling advantage for an experimental quantum annealer. We find that the D-Wave 2000Q processor exhibits certifiably better scaling than two classical annealing algorithms, simulated annealing and spin-vector Monte Carlo. However, we do not find evidence for a quantum speedup: simulated quantum annealing (a variant of quantum Monte Carlo) exhibits the best scaling by a significant margin. Our construction of instance classes exhibiting this behavior opens up the possibility of generating many new such classes, and of further definitive assessments of scaling advantages using current and future quantum annealing devices.
10:00 - 11:00
Quantum Information for Fundamental Physics
Daniel Carney (NIST / University of Maryland)
Room: Curia II
The tried-and-true method for probing fundamental physics is to measure scattering probabilities with colliders. Recent advances in quantum information-based theory and experimental technologies suggest new methods for understanding elementary physics. In this vein, I will discuss some results on the quantum structure of scattering states, and sketch some preliminary ideas about trying to use novel information-theoretic observables and techniques to explore fundamental theories at energies accessible in labs today.
11:00 - 11:15
Break
Room: Curia II
11:15 - 12:15
Simulating Quantum Field Theories on Quantum Computers
Stephen Jordan (NIST / University of Maryland)
Room: Curia II
In some regimes, such as strong coupling, quantum field theory dynamics are difficult to simulate using conventional techniques. In this talk I will describe my joint work with John Preskill and Keith Lee developing quantum algorithms for simulating quantum field theories. I will also comment on potential applications of near-term "pre-threshold" quantum computers to quantum field theory problems.
12:15 - 13:30
Lunch
Room: Fermilab Cafeteria
13:30 - 14:30
Quantum Simulations of Abelian and non-Abelian Gauge Theories
Uwe-Jens Wiese (University of Bern)
Room: Curia II
Besides lattice QCD in particle physics, strongly coupled gauge theories arise, for example, in the condensed matter physics of spin liquids, or in the quantum information theory of Kitaev's toric code, which is a Z(2) lattice gauge theory. Numerical simulations of gauge theories on classical computers, in particular at high fermion density or in out-of-equilibrium situations, suffer from severe sign problems that prevent the importance sampling underlying Monte Carlo calculations. Quantum simulators are accurately controllable quantum devices that mimic other quantum systems. They do not suffer from sign problems, because their hardware is intrinsically quantum mechanical. Recently, trapped ions, following a laser-driven stroboscopic discrete time evolution through a sequence of quantum gate operations, have been used as a digital quantum simulator for particle-antiparticle pair creation in the Schwinger model. Analog quantum simulators, on the other hand, follow the continuous time-evolution of a tunable model Hamiltonian. Using ultra-cold atoms in optical lattices, analog quantum simulators have been designed for Abelian and non-Abelian lattice gauge theories. Their experimental realization is a challenge for the foreseeable future, which holds the promise to access the real-time dynamics of string breaking, the out-of-equilibrium decay of a false vacuum, or the evolution of a chiral condensate after a quench, from first principles. Quantum link models, which realize gauge theories, including QCD, not with classical fields but with discrete quantum degrees of freedom, are ideally suited for implementation in quantum matter. For example, alkaline-earth atoms, whose nuclear spin represents an SU(N) degree of freedom, naturally embody fermionic rishon constituents of gluons. CP(N-1) models, which are toy models for QCD, can be quantum simulated in a similar way via SU(N) quantum spin ladders.
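The Z(2) remark can be made concrete with a few lines of linear algebra: in the toric code, star operators (products of Pauli X) and plaquette operators (products of Pauli Z) commute because any star and plaquette overlap on an even number of edges. The 4-qubit check below is a minimal illustration of that algebra, not a simulation of the full model.

```python
import numpy as np
from functools import reduce

I2 = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])

def kron_all(ops):
    # Tensor product of a list of single-qubit operators.
    return reduce(np.kron, ops)

# Four qubits standing for the four edges shared by one star and one plaquette.
star = kron_all([X, X, X, X])   # A_s: product of X on the edges
plaq = kron_all([Z, Z, Z, Z])   # B_p: product of Z on the edges

# Each shared qubit contributes a factor of -1 when exchanging X and Z,
# so an even overlap (here 4) makes the stabilizers commute...
assert np.allclose(star @ plaq, plaq @ star)

# ...while a single shared qubit makes the operators anticommute.
a = kron_all([X, I2, I2, I2])
b = kron_all([Z, I2, I2, I2])
assert np.allclose(a @ b, -(b @ a))
print("stabilizer commutation checks passed")
```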
14:30 - 15:30
Quantum Simulating Lattice Gauge Theories with Optical Lattices
Yannick Meurice (U. of Iowa)
Room: Curia II
Optical lattices have been used successfully to quantum simulate the Bose-Hubbard model. We briefly review recent proposals to use similar procedures for lattice gauge theories. The long-term objectives are to handle sign problems and real-time evolution, which are not tractable in classical computations. We introduce a gauge-invariant formulation of the Abelian Higgs model in 1+1 dimensions obtained with the tensor renormalization group method. We propose an approximate realization using cold atoms in an optical lattice with a ladder structure. Recently developed Rydberg-atom manipulation techniques make it possible to create nearest-neighbor interactions with the desired strength. An experimental proof of principle would be to first try simpler examples, such as the Ising and O(2) models. We report on recent progress in this direction.
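As a classical warm-up for the Ising proof of principle mentioned in the abstract, the 1D Ising partition function can be computed exactly with a transfer matrix, the simplest relative of the tensor contractions used by the tensor renormalization group. The check below is illustrative only and is not the speaker's method.

```python
import numpy as np
from itertools import product

def Z_transfer(N, beta, J=1.0):
    # Transfer matrix T[s, s'] = exp(beta * J * s * s'), s, s' in {+1, -1};
    # for a periodic chain of N spins, Z = Tr(T^N).
    T = np.array([[np.exp(beta * J), np.exp(-beta * J)],
                  [np.exp(-beta * J), np.exp(beta * J)]])
    return float(np.trace(np.linalg.matrix_power(T, N)))

def Z_brute(N, beta, J=1.0):
    # Direct sum over all 2^N spin configurations, for cross-checking.
    total = 0.0
    for spins in product([1, -1], repeat=N):
        E = -J * sum(spins[i] * spins[(i + 1) % N] for i in range(N))
        total += np.exp(-beta * E)
    return total

print(Z_transfer(8, 0.5), Z_brute(8, 0.5))
```

The eigenvalues of T are 2cosh(beta J) and 2sinh(beta J), so the trace reproduces the textbook result Z = (2cosh(beta J))^N + (2sinh(beta J))^N.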
15:30 - 18:30
Wrap-up / Fermilab tours
Room: Curia II