arXiv:1008.5325 [cs.LG]

Inference with Multivariate Heavy-Tails in Linear Models

Danny Bickson, Carlos Guestrin

Published 2010-08-31, updated 2011-03-21 (version 4)

Heavy-tailed distributions naturally occur in many real-life problems. Unfortunately, closed-form inference is typically not possible in graphical models that involve such heavy-tailed distributions. In this work, we propose a novel, simple linear graphical model for independent latent random variables, called the linear characteristic model (LCM), defined in the characteristic function domain. Using stable distributions, a heavy-tailed family that generalizes the Cauchy, Lévy, and Gaussian distributions, we show for the first time how to compute both exact and approximate inference in such a linear multivariate graphical model. LCMs are not limited to stable distributions; in fact, they are defined for any random variables (discrete, continuous, or a mixture of both). We demonstrate the applicability of our construction on a realistic problem from the field of computer networks. Another potential application is iterative decoding of linear channels with non-Gaussian noise.
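The property that makes exact inference tractable in the characteristic function (CF) domain is that CFs of independent variables multiply under addition, and stable laws are closed under linear combinations. Below is a minimal sketch, not the authors' code, illustrating this for a symmetric alpha-stable linear model using SciPy; the stability index, coefficients, and scales are illustrative choices.

```python
# A minimal sketch (not the authors' code) of CF-domain inference:
# CFs of independent variables multiply under addition, and stable
# laws are closed under linear combinations. alpha, a1, a2, c1, c2
# are illustrative choices.
import numpy as np
from scipy.stats import levy_stable

alpha = 1.5           # stability index (alpha=2 is Gaussian, alpha=1 is Cauchy)
a1, a2 = 2.0, -0.5    # coefficients of the linear model y = a1*x1 + a2*x2
c1, c2 = 1.0, 0.7     # scales of the independent symmetric stable inputs

x1 = levy_stable.rvs(alpha, 0.0, scale=c1, size=100_000)
x2 = levy_stable.rvs(alpha, 0.0, scale=c2, size=100_000)
y = a1 * x1 + a2 * x2

# Exact inference in the CF domain: a symmetric stable CF is
# exp(-|c*t|**alpha), so the output scale follows in closed form.
c_y = (abs(a1) ** alpha * c1 ** alpha
       + abs(a2) ** alpha * c2 ** alpha) ** (1.0 / alpha)

# Compare the empirical CF of the Monte Carlo samples against the
# closed-form CF predicted by the stable closure property.
t = np.linspace(0.2, 2.0, 5)
cf_empirical = np.abs(np.exp(1j * np.outer(t, y)).mean(axis=1))
cf_exact = np.exp(-(c_y * np.abs(t)) ** alpha)
print("empirical:", np.round(cf_empirical, 3))
print("exact:    ", np.round(cf_exact, 3))
```

The two printed rows should agree up to Monte Carlo error; for alpha = 2 the closed-form scale update reduces to the familiar Gaussian rule of adding variances.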

Comments: In Neural Information Processing Systems (NIPS) 2010, Dec. 2010, Vancouver, Canada
Categories: cs.LG, cs.IT, math.IT
Related articles:
arXiv:2211.15661 [cs.LG] (Published 2022-11-28, updated 2022-11-29)
What learning algorithm is in-context learning? Investigations with linear models
arXiv:1811.01564 [cs.LG] (Published 2018-11-05)
Parallel training of linear models without compromising convergence
arXiv:2106.15093 [cs.LG] (Published 2021-06-29)
Certifiable Machine Unlearning for Linear Models