arXiv:2009.07268 [cs.DS]

An improved quantum-inspired algorithm for linear regression

András Gilyén, Zhao Song, Ewin Tang

Published 2020-09-15 (Version 1)

We give a classical algorithm for linear regression analogous to the quantum matrix inversion algorithm [Harrow, Hassidim, and Lloyd, Physical Review Letters '09] for low-rank matrices [Chakraborty et al., ICALP'19], when the input matrix $A$ is stored in a data structure applicable for QRAM-based state preparation. Namely, if the input model supports efficient $\ell_2$-norm importance sampling, and given $A \in \mathbb{C}^{m\times n}$ with minimum singular value $\sigma$ and $b \in \mathbb{C}^m$ as input, we can output a description of an $x$ such that $\|x - A^+b\| \leq \varepsilon\|A^+b\|$ in $\tilde{\mathcal{O}}\left(\frac{\|A\|_{\mathrm{F}}^6\|A\|^2}{\sigma^8\varepsilon^4}\right)$ time, improving on previous "quantum-inspired" algorithms in this line of research by a factor of $\frac{\|A\|^{14}}{\sigma^{14}\varepsilon^2}$ [Chia et al., STOC'20]. The algorithm is stochastic gradient descent, and the analysis bears similarities to results of [Gupta and Sidford, NeurIPS'18]. Unlike earlier works, this is a promising avenue that could lead to feasible implementations of classical regression in a quantum-inspired setting, for comparison against future quantum computers.
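The abstract states that the algorithm is stochastic gradient descent driven by efficient $\ell_2$-norm importance sampling of the input. As a rough illustration of that sampling idea (not the paper's actual algorithm or its complexity guarantees), the sketch below runs a randomized-Kaczmarz-style SGD for $\min_x \|Ax - b\|^2$, picking each row of $A$ with probability proportional to its squared $\ell_2$ norm; the function name and parameters are invented for this example.

```python
import numpy as np

def row_norm_sgd(A, b, n_iters=10000, seed=0):
    """Illustrative sketch only: SGD for min_x ||Ax - b||^2 with
    l2-norm importance sampling of rows (a randomized Kaczmarz step).
    This is NOT the paper's algorithm; it only demonstrates the
    row-sampling distribution the abstract refers to."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Squared l2 norm of each row; sampling probabilities ||A_i||^2 / ||A||_F^2
    row_norms_sq = np.einsum('ij,ij->i', A, A)
    probs = row_norms_sq / row_norms_sq.sum()
    x = np.zeros(n)
    for _ in range(n_iters):
        i = rng.choice(m, p=probs)
        # Kaczmarz step: project x onto the hyperplane A[i] @ x = b[i]
        x += (b[i] - A[i] @ x) / row_norms_sq[i] * A[i]
    return x
```

For a consistent system ($b = Ax^\*$), iterates of this scheme converge to the least-squares solution $A^+ b$ at a rate governed by $\sigma^2 / \|A\|_{\mathrm{F}}^2$, which is one intuition for why $\|A\|_{\mathrm{F}}$ and $\sigma$ appear in the running time above.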
