arXiv:1711.03962 [stat.ME]

Estimating the Entropy Rate of Finite Markov Chains with Application to Behavior Studies

Brian Vegetabile, Jenny Molet, Tallie Z. Baram, Hal Stern

Published 2017-11-10 (Version 1)

Predictability of behavior has emerged as an important characteristic in many fields including biology, medicine, and marketing. Behavior can be recorded as a sequence of actions performed by an individual over a given time period. This sequence of actions can often be modeled as a stationary time-homogeneous Markov chain, and the predictability of the individual's behavior can be quantified by the entropy rate of the process. This paper provides a comprehensive investigation of three estimators of the entropy rate of finite Markov processes and a bootstrap procedure for providing standard errors. The first two methods directly estimate the entropy rate through estimates of the transition matrix and stationary distribution of the process; the methods differ in the technique used to estimate the stationary distribution. The third method is related to the sliding-window Lempel-Ziv (SWLZ) compression algorithm. The first two methods achieve consistent estimates of the true entropy rate for reasonably short observed sequences, but are limited by requiring a priori specification of the order of the process. The method based on the SWLZ algorithm does not require specifying the order of the process and is optimal in the limit of an infinite sequence, but is biased for short sequences. When used together, the methods can provide a clear picture of the entropy rate of an individual's behavior.
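
For a concrete reference point, the sketch below shows a minimal plug-in estimator of this general kind, built around the entropy-rate formula H = -sum_i pi_i sum_j P_ij log2 P_ij. It is only an illustration under stated assumptions: the function name plug_in_entropy_rate is hypothetical, the states are assumed to be coded 0, ..., n_states - 1, and the eigenvector-based estimate of the stationary distribution is one possible choice rather than the authors' implementation; it also omits their bootstrap standard errors and the SWLZ-based estimator.

```python
import numpy as np

def plug_in_entropy_rate(sequence, n_states):
    """Plug-in entropy-rate estimate (bits per symbol) for a first-order
    Markov chain on states 0, ..., n_states - 1. Illustrative sketch only."""
    # Count observed transitions i -> j.
    counts = np.zeros((n_states, n_states))
    for a, b in zip(sequence[:-1], sequence[1:]):
        counts[a, b] += 1

    # Row-normalize the counts to get the estimated transition matrix.
    row_sums = counts.sum(axis=1, keepdims=True)
    P = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

    # One way (an assumption here, not the paper's method) to estimate the
    # stationary distribution: the leading left eigenvector of P.
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    pi = np.abs(pi) / np.abs(pi).sum()

    # Entropy rate: H = -sum_i pi_i sum_j P_ij log2 P_ij.
    logP = np.zeros_like(P)
    np.log2(P, out=logP, where=P > 0)
    return float(-np.sum(pi[:, None] * P * logP))

# Toy usage: simulate a 2-state chain and estimate its entropy rate.
rng = np.random.default_rng(0)
P_true = np.array([[0.9, 0.1],
                   [0.2, 0.8]])
seq = [0]
for _ in range(5000):
    seq.append(rng.choice(2, p=P_true[seq[-1]]))
print(plug_in_entropy_rate(seq, 2))
```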
