arXiv Analytics

arXiv:2005.10811 [quant-ph]

A deep learning model for noise prediction on near-term quantum devices

Alexander Zlokapa, Alexandru Gheorghiu

Published 2020-05-21, Version 1

We present an approach for a deep-learning compiler of quantum circuits, designed to reduce the output noise of circuits run on a specific device. We train a convolutional neural network on experimental data from a quantum device to learn a hardware-specific noise model. A compiler then uses the trained network as a noise predictor and inserts sequences of gates into circuits so as to minimize expected noise. We tested this approach on IBM's 5-qubit devices and observed a reduction in output noise of 12.3% (95% CI [11.5%, 13.0%]) compared to the circuits obtained by the Qiskit compiler. Moreover, the trained noise model is hardware-specific: applying a noise model trained on one device to another device yields a noise reduction of only 5.2% (95% CI [4.9%, 5.6%]). These results suggest that device-specific compilers using machine learning may yield higher-fidelity operations and provide insights for the design of noise models.
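The compiler loop the abstract describes (a trained network scores circuits for expected noise, and the compiler inserts identity-equivalent gate sequences wherever that lowers the score) can be sketched in plain Python. Everything here is illustrative: `predicted_noise` is a toy heuristic standing in for the trained CNN, and the greedy insertion strategy and gate paddings are assumptions, not the paper's actual implementation.

```python
import itertools

# Toy stand-in for the trained CNN noise predictor: maps a circuit
# (a flat list of gate labels) to a predicted noise score. The real
# model is trained on device data; this heuristic merely pretends
# that two-qubit gates, and especially back-to-back CX gates, are noisy.
def predicted_noise(circuit):
    depth_penalty = 0.01 * len(circuit)
    cx_penalty = 0.05 * sum(1 for g in circuit if g == "CX")
    adjacency_penalty = 0.08 * sum(
        1 for a, b in zip(circuit, circuit[1:]) if a == b == "CX"
    )
    return depth_penalty + cx_penalty + adjacency_penalty

# Identity-equivalent paddings the compiler may insert: each pair
# composes to the identity, so circuit semantics are unchanged.
PADDINGS = [("X", "X"), ("Z", "Z"), ("H", "H")]

def compile_with_predictor(circuit, max_inserts=3):
    """Greedily insert identity sequences wherever the predictor says
    they reduce expected noise; stop when no insertion helps."""
    best = list(circuit)
    for _ in range(max_inserts):
        score, candidate = predicted_noise(best), None
        for pos, pad in itertools.product(range(len(best) + 1), PADDINGS):
            trial = best[:pos] + list(pad) + best[pos:]
            if predicted_noise(trial) < score:
                score, candidate = predicted_noise(trial), trial
        if candidate is None:
            break
        best = candidate
    return best

# Example: the toy predictor dislikes adjacent CX gates, so the
# compiler pads between them.
print(compile_with_predictor(["CX", "CX"]))  # → ['CX', 'X', 'X', 'CX']
```

The key property, preserved by construction, is that only identity-equivalent sequences are inserted, so the compiled circuit computes the same unitary while (according to the predictor) accumulating less noise.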

Comments: 5 pages, 4 figures, 1 table. Comments welcome
Categories: quant-ph
Related articles:
arXiv:2003.05244 [quant-ph] (Published 2020-03-11)
Optimizing High-Efficiency Quantum Memory with Quantum Machine Learning for Near-Term Quantum Devices
arXiv:1909.04786 [quant-ph] (Published 2019-09-10)
Noise-robust exploration of quantum matter on near-term quantum devices
arXiv:1907.08518 [quant-ph] (Published 2019-07-19)
Mitigation of readout noise in near-term quantum devices by classical post-processing based on detector tomography