Variational Neural Annealing

Abstract:

Many important challenges in science and technology can be cast as optimization problems. When viewed in a statistical physics framework, these can be tackled by simulated annealing, where a gradual cooling procedure helps search for ground-state solutions of a target Hamiltonian. While powerful, simulated annealing is known to have prohibitively slow sampling dynamics when the optimization landscape is rough or glassy. In this talk I will show that by generalizing the target distribution with a parameterized model, an analogous annealing framework based on the variational principle can be used to search for ground-state solutions. Autoregressive models such as recurrent neural networks provide ideal parameterizations, since they can be sampled exactly, without slow dynamics, even when the model encodes a rough landscape. We implement this procedure in both the classical and quantum settings on several prototypical spin glass Hamiltonians, and find that it significantly outperforms traditional simulated annealing in the asymptotic limit of long annealing times, illustrating the potential power of this as-yet unexplored route to optimization.
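
For readers curious what such a variational annealing loop can look like in practice, below is a minimal sketch, not code from the talk: it anneals the variational free energy F = ⟨H⟩ − TS of a toy random-coupling Ising Hamiltonian, using a small GRU-based autoregressive model whose samples and log-probabilities are exact. The Hamiltonian, hyperparameters, annealing schedule, and the REINFORCE-style gradient estimator are all illustrative assumptions.

```python
# Minimal sketch of variational classical annealing (illustrative, not the
# speaker's implementation): anneal F = <H> + T<log p> toward T = 0.
import torch
import torch.nn as nn

N = 20                                  # number of spins (toy size)
torch.manual_seed(0)
J = torch.triu(torch.randn(N, N) / N**0.5, diagonal=1)  # toy random couplings

def energy(s):
    """Ising energy H(s) = -sum_{i<j} J_ij s_i s_j for spins s in {-1,+1}."""
    return -torch.einsum('bi,ij,bj->b', s, J, s)

class AutoregressiveRNN(nn.Module):
    """GRU emitting the conditional probability of each spin given the
    previous ones, so samples and log-probabilities are exact (no MCMC)."""
    def __init__(self, hidden=32):
        super().__init__()
        self.cell = nn.GRUCell(1, hidden)
        self.head = nn.Linear(hidden, 1)
        self.hidden = hidden

    def sample(self, batch):
        h = torch.zeros(batch, self.hidden)
        x = torch.zeros(batch, 1)       # start token
        spins, logp = [], 0.0
        for _ in range(N):
            h = self.cell(x, h)
            p = torch.sigmoid(self.head(h)).squeeze(-1)   # P(s_i=+1 | s_<i)
            s = torch.bernoulli(p)
            logp = logp + torch.log(torch.where(s > 0, p, 1 - p) + 1e-30)
            x = (2 * s - 1).unsqueeze(-1)
            spins.append(2 * s - 1)
        return torch.stack(spins, dim=1), logp

model = AutoregressiveRNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

T0, steps, batch = 2.0, 2000, 128
for step in range(steps):
    T = T0 * (1 - step / steps)         # linear cooling schedule (assumed)
    s, logp = model.sample(batch)
    with torch.no_grad():
        F_loc = energy(s) + T * logp    # per-sample free-energy estimator
        baseline = F_loc.mean()         # variance-reduction baseline
    # REINFORCE gradient of the variational free energy F = <H> - T S
    loss = ((F_loc - baseline) * logp).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 500 == 0:
        print(f"T={T:.2f}  <H>={energy(s).mean().item():.3f}")
```

Because the model is autoregressive, each sample is drawn in a single forward pass regardless of how rough the learned landscape is, which is the property the talk contrasts with the slow Markov-chain dynamics of simulated annealing.
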

Time: 20:00 (Taipei time), 08:00 (Toronto time)
Venue: Zoom ([Registration] required)