Batched quantum state exponentiation and quantum Hebbian learning
RESEARCH ARTICLE
Thomas R. Bromley¹ · Patrick Rebentrost¹

Received: 28 November 2018 / Accepted: 28 February 2019 / Published online: 15 May 2019
© Springer Nature Switzerland AG 2019
Abstract
Machine learning is a crucial aspect of artificial intelligence. This paper details an approach to quantum Hebbian learning through a batched version of quantum state exponentiation. Batches of quantum data interact with learning and processing quantum bits (qubits) through a series of elementary controlled partial swap operations, resulting in a Hamiltonian simulation of the statistical ensemble of the data. We decompose this elementary operation into one- and two-qubit quantum gates from the Clifford+T set and use the decomposition to perform an efficiency analysis. Our construction of quantum Hebbian learning is motivated as an extension of the established classical approach, and it can be used to extract properties of the data, such as eigenvalues, through phase estimation. This work contributes to the near-term development and implementation of quantum machine learning techniques.

Keywords Quantum learning · Quantum state exponentiation
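The elementary operation described in the abstract is closely related to density matrix exponentiation (Lloyd, Mohseni, and Rebentrost 2014): a short-time partial swap e^{-iSΔt} is applied between a data register in state ρ and a target register in state σ, and tracing out the data register leaves the target approximately evolved as e^{-iρΔt} σ e^{iρΔt}, up to O(Δt²) error. The following is a minimal NumPy sketch of a single such step on one-qubit registers, for orientation only; the function and variable names are illustrative, and the paper's batched construction additionally uses controlled partial swaps decomposed into Clifford+T gates.

```python
import numpy as np

def expm_herm(H, t):
    """Return e^{-i t H} for a Hermitian matrix H via eigendecomposition."""
    w, V = np.linalg.eigh(H)
    return (V * np.exp(-1j * t * w)) @ V.conj().T

# Swap operator on two qubits: S |j>|k> = |k>|j>. S is Hermitian and unitary.
S = np.eye(4)[:, [0, 2, 1, 3]]

def partial_swap_step(rho, sigma, dt):
    """One elementary step of quantum state exponentiation:
    evolve rho (x) sigma under the partial swap e^{-i S dt}, then trace
    out the data register. The result approximates
    e^{-i rho dt} sigma e^{i rho dt} up to O(dt^2).
    """
    U = expm_herm(S, dt)
    joint = (U @ np.kron(rho, sigma) @ U.conj().T).reshape(2, 2, 2, 2)
    # Partial trace over the first (data) qubit. With kron index ordering,
    # joint[i, j, k, l] = M[2i+j, 2k+l], so summing over i = k traces it out.
    return np.einsum('ijil->jl', joint)

# Example: data state rho = |+><+|, target sigma = |0><0|, small time step.
rho = 0.5 * np.ones((2, 2), dtype=complex)
sigma = np.diag([1.0, 0.0]).astype(complex)
dt = 0.01
approx = partial_swap_step(rho, sigma, dt)

# Exact short-time evolution of sigma under Hamiltonian rho, for comparison.
U_rho = expm_herm(rho, dt)
exact = U_rho @ sigma @ U_rho.conj().T
```

Here `approx` and `exact` agree to second order in `dt`; repeating the step n times with total time t = n·dt simulates e^{-iρt} with error O(t²/n), which is the trade-off the batched scheme inherits.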
1 Introduction

Machine learning encompasses a series of techniques that allow computers to solve problems without being explicitly told how to do so (MacKay 2003). In supervised learning, the machine is first taught to solve the problem on a series of training data. This learning stage is a crucial element in determining the performance of a machine learning algorithm. One particularly fruitful machine learning technique is to construct an artificial neural network, represented by an interacting collection of binary-valued neurons. The applications of machine learning are numerous and include, for example, finance, biotechnology, e-commerce, chemistry, insurance, and security. In particular, neural networks have been successfully used in finance for portfolio analysis (Barr and Mani 1998) and credit approval (Norris 1999), as well as in e-commerce for user ratings of online stores (Lasa and Berndt 2007).
Thomas R. Bromley
[email protected]

Patrick Rebentrost
[email protected]

¹ Xanadu, 777 Bay Street, Toronto, Ontario, M5G 2C8, Canada
The Hebbian approach is the most natural learning method for neural networks that are fully visible and have undirected connections (Hebb 1949). Hebbian learning sets the connection strength between a pair of neurons according to the number of times that they fire together in the training data. The output of Hebbian learning is a real symmetric weight matrix with zero diagonal, whose elements correspond to the connection weights between neuron pairs. This matrix is then used as a component within machine learning algorithms. Undirected and fully visible neural networks, such as the canonical Hopfield network (Hopfield 1982), are often used as an associative memory for pattern recognition, as well as for optimization problems such as the traveling salesman problem (MacKay 2003). Quantum ma
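As a point of reference for the quantum construction, the classical Hebbian rule described above fits in a few lines. This toy NumPy sketch (names are illustrative, not from the paper) builds the real symmetric, zero-diagonal weight matrix from ±1-valued training patterns by averaging the outer products of the patterns:

```python
import numpy as np

def hebbian_weights(patterns):
    """Classical Hebbian learning: the (i, j) entry of the weight matrix
    measures how often neurons i and j fire together, averaged over the
    M training patterns. Self-connections (the diagonal) are set to zero.

    `patterns` is an (M, N) array of M patterns over N +/-1-valued neurons.
    """
    patterns = np.asarray(patterns, dtype=float)
    M, _ = patterns.shape
    # Average of the outer products x x^T over all training patterns.
    W = patterns.T @ patterns / M
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

# Example: two training patterns over four neurons.
X = np.array([[1, -1, 1, -1],
              [1, 1, -1, -1]])
W = hebbian_weights(X)
```

In this example neurons 0 and 3 always fire oppositely, so `W[0, 3]` is -1; a matrix of this form is what a Hopfield network would use as its coupling matrix.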