arXiv:0902.1787 [cond-mat.stat-mech]

Generalized Fisher information matrix in nonextensive systems with spatial correlation

Hideo Hasegawa

Published 2009-02-11, updated 2009-10-31 (version 2)

By using the $q$-Gaussian distribution derived by the maximum entropy method for spatially-correlated $N$-unit nonextensive systems, we have calculated the generalized Fisher information matrix $g_{\theta_n \theta_m}$ for $(\theta_1, \theta_2, \theta_3) = (\mu_q, \sigma_q^2, s)$, where $\mu_q$, $\sigma_q^2$ and $s$ denote the mean, variance and degree of spatial correlation, respectively, for a given entropic index $q$. It has been shown from the Cram\'{e}r-Rao theorem that (1) the accuracy of an unbiased estimate of $\mu_q$ is improved (degraded) by a negative (positive) correlation $s$, (2) that of $\sigma_q^2$ is worsened with increasing $s$, and (3) that of $s$ is much improved for $s \simeq -1/(N-1)$ or $s \simeq 1.0$, although it is worst at $s = (N-2)/[2(N-1)]$. Our calculation provides clear insight into the long-standing controversy over whether spatial correlation is beneficial or detrimental to decoding in neuronal ensembles. We also discuss a calculation of the $q$-Gaussian distribution obtained by applying superstatistics to the Langevin model subjected to spatially-correlated inputs.
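
A minimal numerical sketch (not part of the paper) of the classical $q \to 1$ limit may help fix ideas: for an ordinary $N$-unit Gaussian with uniform correlation $s$, the Fisher information for the mean is $g_{\mu\mu} = \mathbf{1}^T \Sigma^{-1} \mathbf{1} = N/[\sigma^2(1+(N-1)s)]$, so the Cram\'{e}r-Rao bound on the variance of $\hat{\mu}$ shrinks for negative $s$ and grows for positive $s$, consistent with point (1) above. The names `N`, `sigma2` and `fisher_mu` below are illustrative choices, not notation from the paper.

```python
import numpy as np

# Sketch of the q -> 1 (ordinary Gaussian) special case, assuming an N-unit
# system with mean mu, variance sigma2 and uniform spatial correlation s:
#   Sigma = sigma2 * [(1 - s) I + s J],  J = all-ones matrix.
# The Fisher information for mu is g_mumu = 1^T Sigma^{-1} 1
#                                        = N / [sigma2 (1 + (N - 1) s)].

N, sigma2 = 10, 1.0

def fisher_mu(s):
    """Fisher information g_mumu for uniformly correlated Gaussian inputs."""
    Sigma = sigma2 * ((1.0 - s) * np.eye(N) + s * np.ones((N, N)))
    ones = np.ones(N)
    return ones @ np.linalg.solve(Sigma, ones)

for s in (-0.09, 0.0, 0.5):   # s must exceed -1/(N-1) for Sigma to be valid
    g = fisher_mu(s)
    closed_form = N / (sigma2 * (1.0 + (N - 1) * s))
    print(f"s = {s:+.2f}: g_mumu = {g:.3f} (closed form {closed_form:.3f}), "
          f"Cramer-Rao bound = {1.0 / g:.4f}")
```

Running the loop shows the Cram\'{e}r-Rao bound on the mean estimate dropping below $\sigma^2/N$ for $s < 0$ and rising above it for $s > 0$; the paper's result generalizes this behaviour to $q$-Gaussian statistics and to the parameters $\sigma_q^2$ and $s$ themselves.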

Comments: 18 pages, 3 figures; revised version accepted in Phys. Rev. E
Journal: Phys. Rev. E 80 (2009) 051125.