QuICS Seminar
In recent years, deep learning has achieved great success in many areas of artificial intelligence, such as computer vision, speech recognition, and natural language processing. Its central idea is to build a hierarchy of successively more abstract representations of data (e.g. images, audio, text) using a neural network with many layers. Training such a deep neural network, however, can be very time-consuming. In this talk, we will investigate whether quantum computing can make this process more efficient. We will focus on the deep Boltzmann machine (DBM), a deep model with many theoretical merits that has nonetheless been impractical for large-scale problems due to the difficulty of its training and inference. We will present a quantum-walk-based algorithm for preparing a coherent version of the Gibbs state of any given DBM. This algorithm allows us to quickly estimate the gradient of the log-likelihood function, and hence to find a locally optimal solution to the DBM learning problem. We will also present a quantum algorithm for evaluating the generative performance of any given DBM. Numerical results indicate that our algorithms learn better models than existing classical algorithms based on mean-field variational inference and Markov chain Monte Carlo methods.
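For context, the log-likelihood gradient referred to above has the standard form for energy-based models; a sketch in conventional notation (symbols and layer count are illustrative, not taken from the abstract):

```latex
% Energy of a two-layer DBM with visible units v and hidden layers h^{(1)}, h^{(2)}
% (bias terms omitted for brevity):
E(v, h^{(1)}, h^{(2)}) = -\, v^{\top} W^{(1)} h^{(1)} \;-\; h^{(1)\top} W^{(2)} h^{(2)}

% Gradient of the log-likelihood with respect to the first-layer weights:
\frac{\partial \log p(v)}{\partial W^{(1)}}
  = \mathbb{E}_{\text{data}}\!\left[ v\, h^{(1)\top} \right]
  - \mathbb{E}_{\text{model}}\!\left[ v\, h^{(1)\top} \right]
```

The second (model) expectation is taken over the Gibbs distribution $p \propto e^{-E}$, which is why preparing the Gibbs state efficiently — classically approximated by mean-field inference or MCMC — is the computational bottleneck the quantum algorithm targets.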