arXiv:1705.07048 [cs.LG]

Linear regression without correspondence

Daniel Hsu, Kevin Shi, Xiaorui Sun

Published 2017-05-19 (Version 1)

This article considers algorithmic and statistical aspects of linear regression when the correspondence between the covariates and the responses is unknown. First, a fully polynomial-time approximation scheme is given for the natural least squares optimization problem in any constant dimension. Next, in an average-case and noise-free setting where the responses exactly correspond to a linear function of i.i.d. draws from a standard multivariate normal distribution, an efficient algorithm based on lattice basis reduction is shown to exactly recover the unknown linear function in arbitrary dimension. Finally, lower bounds on the signal-to-noise ratio are established for approximate recovery of the unknown linear function by any estimator.

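To make the optimization problem in the abstract concrete: with covariates x_1, ..., x_n and responses y_1, ..., y_n observed in an unknown order, the natural least squares objective is to minimize, over permutations pi and weights w, the sum of (y_{pi(i)} - x_i^T w)^2. The sketch below (assuming Python with NumPy; function and variable names are illustrative, not from the paper) solves this by brute-force enumeration of permutations, which is only feasible for very small n; it is not the paper's FPTAS or its lattice-based recovery algorithm.

```python
# Minimal brute-force sketch of the least squares objective for linear
# regression without correspondence: minimize over permutations pi and
# weights w of sum_i (y[pi(i)] - x_i . w)^2. Exhaustive over permutations,
# so only usable for tiny n; NOT the paper's FPTAS or lattice algorithm.
import itertools
import numpy as np

def shuffled_least_squares(X, y):
    """Return the best (permutation, weights, objective) by exhaustive search."""
    n, d = X.shape
    best = (None, None, np.inf)
    for perm in itertools.permutations(range(n)):
        y_perm = y[list(perm)]
        # Ordinary least squares under this candidate correspondence.
        w, *_ = np.linalg.lstsq(X, y_perm, rcond=None)
        obj = np.sum((y_perm - X @ w) ** 2)
        if obj < best[2]:
            best = (perm, w, obj)
    return best

# Toy example in the noise-free, average-case spirit of the abstract:
# standard normal covariates, responses observed in shuffled order.
rng = np.random.default_rng(0)
n, d = 6, 2
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = rng.permutation(X @ w_true)  # unknown correspondence

perm, w_hat, obj = shuffled_least_squares(X, y)
print("recovered weights:", w_hat, "objective:", obj)
```

In the noise-free case this exhaustive search recovers w exactly (objective zero); the paper's contribution is to achieve comparable guarantees efficiently, via an FPTAS in constant dimension and a lattice-basis-reduction algorithm in arbitrary dimension.
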
Related articles:
arXiv:1904.08544 [cs.LG] (Published 2019-04-18)
Memory-Sample Tradeoffs for Linear Regression with Small Error
arXiv:1206.4678 [cs.LG] (Published 2012-06-18)
Linear Regression with Limited Observation
arXiv:1912.03036 [cs.LG] (Published 2019-12-06)
Improved PAC-Bayesian Bounds for Linear Regression