Classical benchmarking of Gaussian Boson Sampling on the Titan supercomputer
Brajesh Gupt1 · Juan Miguel Arrazola1 · Nicolás Quesada1 · Thomas R. Bromley1
Received: 19 November 2019 / Accepted: 31 May 2020 © Springer Science+Business Media, LLC, part of Springer Nature 2020
Abstract
Gaussian Boson Sampling (GBS) is a model of photonic quantum computing in which single-mode squeezed states are sent through linear-optical interferometers and measured using single-photon detectors. In this work, we employ a recent exact sampling algorithm for GBS with threshold detectors to perform classical simulations on the Titan supercomputer. We determine the time and memory resources, as well as the number of computational nodes, required to produce samples for different numbers of modes and detector clicks. It is possible to simulate a system with 800 optical modes postselected on outputs with 20 detector clicks, producing a single sample in roughly 2 h using 40% of the available nodes of Titan. Additionally, we benchmark the performance of GBS when applied to dense subgraph identification, even in the presence of photon loss. We perform sampling for several graphs containing as many as 200 vertices. Our findings indicate that large losses can be tolerated and that threshold detectors are preferable to photon-number-resolving detectors postselected on collision-free outputs.

Keywords Boson sampling · Quantum benchmarking · Quantum computation · Quantum optics
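For GBS with threshold detectors, the probability of observing a given pattern of detector clicks is given, up to normalization, by a matrix function known as the Torontonian, evaluated on a submatrix selected by the clicked modes. As a rough illustration of the object being computed (the indexing convention below, with mode i occupying rows i and i + n of a 2n × 2n matrix O, is an assumption of this sketch; production GBS code such as The Walrus handles complex covariance matrices and normalization), a brute-force inclusion-exclusion evaluation looks like:

```python
import itertools

import numpy as np


def torontonian(O: np.ndarray) -> float:
    """Brute-force Torontonian of a 2n x 2n real symmetric matrix O.

    Tor(O) = sum over all subsets S of the n modes of
             (-1)^(n - |S|) / sqrt(det(I - O_S)),
    where O_S keeps the rows and columns belonging to the modes in S
    (indices S and S + n under the convention assumed here).
    Requires O(2^n) determinant evaluations, so only small n are feasible.
    """
    n = O.shape[0] // 2
    total = 0.0
    for r in range(n + 1):
        for modes in itertools.combinations(range(n), r):
            if r == 0:
                # Empty submatrix: det(I - O_S) = 1 by convention.
                total += (-1) ** n
                continue
            idx = list(modes) + [m + n for m in modes]
            sub = O[np.ix_(idx, idx)]
            total += (-1) ** (n - r) / np.sqrt(
                np.linalg.det(np.eye(2 * r) - sub)
            )
    return total
```

For a single mode (n = 1) with O = 0.75 · I, this evaluates to 1/√det(0.25 · I₂) − 1 = 3, and the Torontonian of the zero matrix vanishes. The exponential number of subsets in this sum is what drives the supercomputer-scale resource requirements benchmarked in this work.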
1 Introduction

The first generation of programmable quantum devices is emerging. This has led to increased interest in understanding their practical applications and their potential to surpass the capabilities of traditional computers [1,2]. Classical simulation algorithms
Correspondence: Nicolás Quesada ([email protected]); Brajesh Gupt ([email protected])
1 Xanadu, Toronto, ON M5G 2C8, Canada
play an important role in this development: they can be used to benchmark the correctness of quantum algorithms and to set the bar of performance for quantum computers [3–7].

In photonic quantum computing, boson sampling is a sub-universal model where indistinguishable single photons are sent through linear-optical interferometers and their output ports are recorded using single-photon detectors [8–11]. Despite its conceptual simplicity, it is believed that simulating the behavior of a boson sampling device requires exponential time on a classical computer [8,12]. This standard boson sampling model requires single-photon sources, which are challenging to realize experimentally at a large scale. Consequently, other variants of boson sampling have been proposed in which the inputs are squeezed states, which are easier to implement in practice. Examples of these models include scattershot boson sampling [13–15] and Gaussian Boson Sampling (GBS) [16,17]. Notably, it has been shown that GBS has applications in quantum chemistry [18–20], optimization [21,22], and graph theory [23]. Alongside these theoretical and experim
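The hardness alluded to above can be made concrete: in standard boson sampling, the probability of each output pattern is proportional to the squared permanent of a submatrix of the interferometer unitary [8], and the permanent admits no known polynomial-time algorithm. A minimal sketch of Ryser's inclusion-exclusion formula (O(2^n · n^2) as written here) illustrates the scaling:

```python
from itertools import combinations


def permanent_ryser(A):
    """Permanent of an n x n matrix via Ryser's formula:
    perm(A) = (-1)^n * sum over nonempty column subsets S of
              (-1)^|S| * prod_i sum_{j in S} A[i][j].
    Every algorithm known for the permanent takes exponential time,
    which underlies the conjectured classical hardness of boson sampling.
    """
    n = len(A)
    total = 0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1
            for row in A:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total


permanent_ryser([[1, 2], [3, 4]])  # → 10, i.e. 1*4 + 2*3
```

Unlike the determinant, the alternating signs that enable Gaussian elimination are absent, so the sum over all 2^n − 1 column subsets cannot be shortcut; this is the same flavor of exponential blow-up that the GBS simulations benchmarked in this paper must confront.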