Speaker
Isaac Kim
(Stanford University)
Description
I will argue that even a medium-scale (50 to \ensuremath{\sim}100 qubits) quantum
computer can significantly speed up existing tensor network
calculations. This is because classical tensor network contraction
algorithms have hit a plateau, and because the contraction time on a
quantum computer scales much more favorably than that of the classical
methods. What makes this proposal realistic is that the method is
noise-resilient: under the standard noise model, the effect of noise on
low-point correlation functions remains controlled even in the
large-system limit. I expect this method primarily to help us
understand challenging quantum many-body systems, but we will also
muse on other, more speculative possibilities (\emph{e.g.}, machine
learning).
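For readers who want a concrete sense of the classical baseline being discussed, below is a minimal, self-contained numpy sketch (not the speaker's method; the function names are hypothetical) of the kind of tensor network contraction at issue: computing a local expectation value from a matrix-product state. The per-site contraction cost grows roughly as $O(d^2\chi^3)$ in the bond dimension $\chi$, which is the sort of classical scaling the abstract contrasts with a quantum-computer-based contraction.

```python
# Illustrative sketch only (not the method proposed in the talk): a plain
# numpy contraction of a matrix-product state (MPS), the kind of classical
# tensor network calculation whose cost grows with the bond dimension chi.
import numpy as np


def random_mps(n_sites, chi, d=2, seed=0):
    """Random MPS as a list of tensors shaped (left bond, physical, right bond)."""
    rng = np.random.default_rng(seed)
    dims = [1] + [chi] * (n_sites - 1) + [1]
    return [rng.standard_normal((dims[i], d, dims[i + 1])) for i in range(n_sites)]


def expectation(mps, op, site):
    """<psi| op_site |psi> (unnormalized), computed by sweeping a left environment.

    Each step costs roughly O(d^2 chi^3); this chi-dependence is the
    classical bottleneck the abstract alludes to.
    """
    d = mps[0].shape[1]
    env = np.ones((1, 1))  # env[bra bond, ket bond]
    for i, a in enumerate(mps):
        o = op if i == site else np.eye(d)
        env = np.einsum('ab,asc,st,btd->cd', env, np.conj(a), o, a, optimize=True)
    return env[0, 0]


if __name__ == '__main__':
    sz = np.diag([1.0, -1.0])            # Pauli-Z as a sample local observable
    mps = random_mps(n_sites=20, chi=32)
    val = expectation(mps, sz, site=10)
    norm = expectation(mps, np.eye(2), site=10)
    print('<Z_10> =', val / norm)
```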