
Concept: Quantum neural network

Summary

Quantum neural networks are computational neural-network models based on the principles of quantum mechanics. The first ideas on quantum neural computation were published independently in 1995 by Subhash Kak and Ron Chrisley, engaging with the theory of quantum mind, which posits that quantum effects play a role in cognitive function. Typical research in quantum neural networks, however, combines classical artificial neural network models (widely used in machine learning for pattern recognition) with the advantages of quantum information in order to develop more efficient algorithms. One important motivation for these investigations is the difficulty of training classical neural networks, especially in big-data applications. The hope is that features of quantum computing such as quantum parallelism or the effects of interference and entanglement can be used as resources. Since the technological implementation of a quantum computer is still at an early stage, such quantum neural network models are mostly theoretical proposals that await full implementation in physical experiments.
Most quantum neural networks are developed as feed-forward networks. As in their classical counterparts, one layer of qubits takes the input, evaluates it, and passes the output on to the next layer, until the final layer is reached. The layers need not all have the same width: a layer may contain a different number of qubits than the layers before or after it. Which transformations the network applies is trained in a manner similar to classical artificial neural networks. Quantum neural networks fall into three categories: quantum computer with classical data, classical computer with quantum data, and quantum computer with quantum data.
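The layered, feed-forward structure described above can be illustrated with a tiny state-vector simulation: each "layer" applies a trainable single-qubit rotation to every qubit and then entangles neighbouring qubits. The gate choice (RY rotations, a CNOT chain) and the layer layout are illustrative assumptions for a sketch, not a specific model from the literature.

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate with trainable angle theta."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_single(state, gate, qubit, n):
    """Apply a 1-qubit gate to `qubit` of an n-qubit state vector."""
    op = np.array([[1.0]])
    for q in range(n):
        op = np.kron(op, gate if q == qubit else np.eye(2))
    return op @ state

def apply_cnot(state, control, target, n):
    """Apply a CNOT by permuting basis amplitudes (qubit 0 = most significant)."""
    new = state.copy()
    for i in range(2 ** n):
        if (i >> (n - 1 - control)) & 1:
            new[i] = state[i ^ (1 << (n - 1 - target))]
    return new

def qnn_layer(state, thetas, n):
    """One layer: trainable RY on each qubit, then an entangling CNOT chain."""
    for q, th in enumerate(thetas):
        state = apply_single(state, ry(th), q, n)
    for q in range(n - 1):
        state = apply_cnot(state, q, q + 1, n)
    return state

n = 3
state = np.zeros(2 ** n)
state[0] = 1.0                                   # start in |000>
rng = np.random.default_rng(0)
for layer_params in rng.uniform(0, np.pi, (2, n)):  # two layers of parameters
    state = qnn_layer(state, layer_params, n)
# layers are unitary, so the state stays normalised
```

In a real training loop the rotation angles would play the role of the weights of a classical network and be optimised against a cost function measured on the output qubits.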

Official source

This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.

Related people (1)

Related courses (19)

Related publications (2)

Related lectures (142)

ME-390: Foundations of artificial intelligence

This course provides the students with 1) a set of theoretical concepts to understand the machine learning approach; and 2) a subset of the tools to use this approach for problems arising in mechanica…

PHYS-467: Machine learning for physicists

Machine learning and data analysis are becoming increasingly central in sciences including physics. In this course, fundamental principles and methods of machine learning will be introduced and practi…

EE-556: Mathematics of data: from theory to computation

This course provides an overview of key advances in continuous optimization and statistical analysis for machine learning. We review recent learning formulations and models as well as their guarantees

Kernel Methods: Neural Networks (PHYS-467: Machine learning for physicists)

Covers the fundamentals of neural networks, focusing on RBF kernels and SVM.

Deep Learning Fundamentals (ME-390: Foundations of artificial intelligence)

Introduces deep learning, from logistic regression to neural networks, emphasizing the need for handling non-linearly separable data.

Optimization for Machine Learning: Non-convex (CS-439: Optimization for machine learning)

Explores non-convex optimization in machine learning, covering gradient descent, trajectory analysis, and linear models.

In this work, we first revisit some extensions of the standard Hopfield model in the low-storage limit, namely the correlated-attractor case and the multitasking case recently introduced by the authors. The former is based on a modification of the Hebbian prescription that induces a coupling between consecutive patterns, an effect tuned by a parameter a. In the latter, dilution is introduced in the pattern entries, in such a way that a fraction d of them is blank. We then merge these two extensions to obtain a system able to retrieve several patterns in parallel; the quality of retrieval, encoded by the set of Mattis magnetizations {m(mu)}, reflects the correlation among patterns. By tuning the parameters d and a, qualitatively different outputs emerge, ranging from highly hierarchical to symmetric. The investigations are carried out by means of both numerical simulations and statistical-mechanics analysis, properly adapting a technique originally developed for spin glasses, the Hamilton-Jacobi interpolation, with excellent agreement. Finally, we show the thermodynamic equivalence of this associative network with a (restricted) Boltzmann machine and study its stochastic dynamics to obtain a dynamical picture as well, perfectly consistent with the static scenario discussed earlier.
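The quantities the abstract refers to are easy to illustrate: Hebbian couplings built from patterns whose entries are blanked with probability d, and Mattis magnetizations m(mu) that measure overlap with each stored pattern. The sizes, dilution level, and zero-temperature parallel dynamics below are illustrative assumptions for a toy sketch, not the authors' exact model.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P, d = 200, 3, 0.2                      # neurons, patterns, dilution fraction

# Patterns with entries in {-1, +1}; a fraction d of entries is blanked to 0.
xi = rng.choice([-1, 1], size=(P, N)).astype(float)
xi[rng.random((P, N)) < d] = 0.0

# Hebbian prescription for the couplings (no self-interaction).
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

# Initialise near pattern 0 (blank entries resolved to +1) and relax
# with zero-temperature parallel dynamics.
s = np.sign(xi[0] + 1e-9)
for _ in range(20):
    s = np.sign(J @ s)
    s[s == 0] = 1

# Mattis magnetizations {m_mu}: overlap of the state with each pattern.
m = xi @ s / N
```

With dilution, m[0] saturates near 1 - d rather than 1, and the blanked entries of pattern 0 are free to align with other patterns, which is the mechanism behind the parallel (multitasking) retrieval discussed above.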


We introduce a family of neural quantum states for the simulation of strongly interacting systems in the presence of spatial periodicity. Our variational state is parametrized in terms of a permutationally invariant part described by the Deep Sets neural-network architecture. The input coordinates to the Deep Sets are periodically transformed so that they can directly describe periodic bosonic systems. We show example applications to both one- and two-dimensional interacting quantum gases with Gaussian interactions, as well as to He-4 confined in a one-dimensional geometry. For the one-dimensional systems we find very precise estimates of the ground-state energies and of the radial distribution functions of the particles. In two dimensions we obtain good estimates of the ground-state energies, comparable to results obtained with more conventional methods.
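The two ingredients named in the abstract, a periodic transformation of the coordinates and a permutation-invariant Deep-Sets-style pooling, can be sketched as follows. The (sin, cos) feature map, the network sizes, and the random weights are illustrative assumptions, not the paper's actual ansatz.

```python
import numpy as np

rng = np.random.default_rng(2)
L = 1.0                                    # box length (assumed)
W1 = rng.normal(size=(2, 8))               # per-particle "phi" network weights
W2 = rng.normal(size=8)                    # post-pooling "rho" network weights

def periodic_features(x):
    """Map each coordinate to (sin, cos) so the features are L-periodic."""
    k = 2 * np.pi / L
    return np.stack([np.sin(k * x), np.cos(k * x)], axis=-1)

def log_psi(x):
    """Deep-Sets-style ansatz: rho(sum_i phi(x_i)), invariant under
    permutations of the identical bosons."""
    h = np.tanh(periodic_features(x) @ W1)  # per-particle embeddings
    pooled = h.sum(axis=0)                  # symmetric pooling over particles
    return float(pooled @ W2)

x = rng.uniform(0, L, size=5)              # 5 bosons in one dimension
perm_ok = np.isclose(log_psi(x), log_psi(x[::-1]))   # permutation invariance
shift_ok = np.isclose(log_psi(x), log_psi(x + L))    # periodicity in the box
```

The sum over per-particle embeddings enforces bosonic exchange symmetry exactly, while the sin/cos input map guarantees the box periodicity, matching the two symmetries the variational state is designed to respect.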