arXiv Analytics

arXiv:1507.04783 [cond-mat.stat-mech]

Maximum entropy method: sampling bias

Jorge Fernandez-de-Cossio, Jorge Fernandez-de-Cossio Diaz

Published 2015-07-16, Version 1

The maximum entropy method is a constructive criterion for setting up a probability distribution that is maximally non-committal with respect to missing information, on the basis of partial knowledge usually stated as constraints on the expectation values of some functions. In connection with experiments, sample averages of those functions are used as surrogates for the expectation values. We address sampling bias in maximum entropy approaches with finite data sets, without forcibly equating expectation values to the corresponding experimental averages. Although we pose the approach in a general formulation, the resulting equations are unfortunately complicated. We present simple case examples, hoping to provide a clear but sufficient illustration of the concepts.
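
For orientation, the standard maximum entropy construction referred to in the abstract can be sketched as follows; this is the generic textbook formulation, and the symbols f_i, lambda_i, F_i and the sample x_1, ..., x_N are introduced here for illustration rather than taken from the paper. Maximizing the entropy subject to the constraints \langle f_i \rangle = F_i yields the exponential-family form

\[
p(x) = \frac{1}{Z(\boldsymbol{\lambda})}
       \exp\!\Big(-\sum_i \lambda_i f_i(x)\Big),
\qquad
Z(\boldsymbol{\lambda}) = \sum_x \exp\!\Big(-\sum_i \lambda_i f_i(x)\Big),
\]

with the Lagrange multipliers \lambda_i fixed by the constraint equations

\[
\langle f_i \rangle_p \;=\; -\frac{\partial \ln Z}{\partial \lambda_i} \;=\; F_i .
\]

When only a finite data set x_1, \dots, x_N is available, the expectation value F_i is commonly replaced by the sample average

\[
\bar{f}_i \;=\; \frac{1}{N} \sum_{n=1}^{N} f_i(x_n),
\]

and it is the bias introduced by this surrogate that the paper addresses, without forcing \langle f_i \rangle_p to equal \bar{f}_i exactly.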

Related articles:
Determination of the full statistics of quantum observables using the maximum entropy method
arXiv:0902.1787 [cond-mat.stat-mech] (Published 2009-02-11, updated 2009-10-31)
Generalized Fisher information matrix in nonextensive systems with spatial correlation
arXiv:cond-mat/0410098 (Published 2004-10-04)
Analytic continuation of QMC data with a sign problem