Mathematical Psychology

Sample Entropy in EEG

Sample entropy quantifies the regularity and complexity of physiological time series such as EEG, providing a bias-reduced measure of signal predictability used to characterize brain states, cognitive load, and neurological disorders.

SampEn(m, r, N) = −ln(A/B)

Sample entropy (SampEn), introduced by Richman and Moorman (2000) as an improvement on Pincus's (1991) approximate entropy (ApEn), measures the complexity and regularity of time series data by quantifying the conditional probability that sequences that are similar for m consecutive points remain similar when one additional point is included. Lower values indicate more regular, predictable signals; higher values indicate greater complexity and unpredictability. In neuroscience, SampEn has become a standard tool for analyzing EEG, characterizing brain states, and detecting neurological pathology.

Definition and Computation

Given a time series {x(1), x(2), ..., x(N)}:

B = number of template-vector pairs of length m that match within tolerance r
A = number of template-vector pairs of length m + 1 that match within tolerance r

SampEn(m, r, N) = −ln(A / B)

Typical parameters for EEG: m = 2, r = 0.2 · SD(x)

The algorithm constructs template vectors of length m from the time series and counts the fraction B of template pairs that match within tolerance r. It then extends to templates of length m + 1 and counts the fraction A of matches. Sample entropy is the negative natural logarithm of the ratio A/B. Unlike approximate entropy, SampEn does not count self-matches, eliminating a systematic bias that caused ApEn to underestimate complexity, particularly for short time series.
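The procedure above can be sketched in Python. This is a minimal, readable implementation of the Richman–Moorman definition (NumPy is assumed; the function name and defaults are illustrative, not a standard API), using the Chebyshev (maximum-coordinate) distance and the default EEG parameters m = 2, r = 0.2 · SD(x):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Estimate SampEn(m, r) for a 1-D series, excluding self-matches.

    Follows Richman & Moorman (2000): B counts length-m template pairs
    within tolerance r, A counts length-(m+1) pairs; SampEn = -ln(A/B).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)  # conventional tolerance for EEG

    def count_matches(mm):
        # N - m templates for both lengths, so A and B are comparable.
        templates = np.array([x[i:i + mm] for i in range(n - m)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance to all LATER templates only: this both
            # avoids double counting and skips self-matches (i == j).
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d <= r))
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    if A == 0 or B == 0:
        return float("inf")  # SampEn undefined: no matches found
    return -np.log(A / B)
```

The double loop over templates is O(N²), which is acceptable for typical EEG epochs of a few thousand samples; production libraries vectorize or use KD-trees for longer recordings. A quick sanity check on synthetic data: a sine wave (regular) should score well below white noise (irregular).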

EEG Applications

In EEG analysis, sample entropy captures aspects of neural dynamics that are invisible to spectral methods. During anesthesia, SampEn decreases systematically with depth of sedation, reflecting the transition from complex, irregular waking activity to regular, low-complexity anesthetized states. In sleep research, SampEn differentiates sleep stages, with REM sleep showing higher complexity than deep slow-wave sleep. In epilepsy, preictal decreases in SampEn have been identified as potential seizure predictors.

Multiscale Entropy

Costa, Goldberger, and Peng (2005) extended sample entropy to multiple time scales by applying it to coarse-grained versions of the original time series. This multiscale entropy (MSE) analysis reveals that healthy physiological systems show high complexity across a wide range of scales, while pathological states (e.g., atrial fibrillation, Alzheimer's disease) show reduced complexity at longer time scales. MSE provides a richer characterization than single-scale SampEn by capturing the hierarchical structure of neural dynamics.
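The coarse-graining step is simple: at scale s, the series is divided into non-overlapping windows of length s and each window is replaced by its mean, and SampEn is then computed on each coarse-grained series. A self-contained sketch (NumPy assumed; function names are illustrative), including a compact SampEn so the block runs on its own:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    # Compact SampEn (Richman & Moorman, 2000), self-matches excluded.
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = 0.2 * np.std(x) if r is None else r
    def count(mm):
        t = np.array([x[i:i + mm] for i in range(n - m)])
        return sum(int(np.sum(np.max(np.abs(t[i + 1:] - t[i]), axis=1) <= r))
                   for i in range(len(t) - 1))
    A, B = count(m + 1), count(m)
    return -np.log(A / B) if A and B else float("inf")

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale` (Costa et al., 2005)."""
    n = len(x) // scale
    return np.asarray(x[:n * scale], dtype=float).reshape(n, scale).mean(axis=1)

def multiscale_entropy(x, scales=range(1, 11), m=2, r=None):
    # Costa et al. fix r from the SD of the ORIGINAL series, not of each
    # coarse-grained series, so r is computed once here and passed down.
    r = 0.2 * np.std(x) if r is None else r
    return [sample_entropy(coarse_grain(x, s), m=m, r=r) for s in scales]
```

A known property illustrates the method: for white noise, MSE decreases with scale (averaging destroys the only structure present), whereas 1/f-like physiological signals stay complex across scales. Note also that coarse-graining at scale s shortens the series by a factor of s, so the short-series caveat below applies with increasing force at longer scales.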

Cognitive Applications

Sample entropy of EEG signals has been linked to cognitive variables. Working memory load typically increases frontal EEG entropy, consistent with the idea that higher cognitive demands recruit more complex neural dynamics. Age-related changes in EEG entropy have been documented across the lifespan: entropy increases during development (reflecting increasing neural complexity) and decreases in aging and dementia (reflecting loss of complexity). These findings connect the information-theoretic measure to the "complexity loss" hypothesis of aging and disease.

Despite its utility, SampEn has limitations. The choice of parameters m and r can influence results, and there is no universally accepted method for selecting them. Short time series (N < 200) can produce unreliable estimates. Multiscale entropy, permutation entropy, and other modern alternatives address some of these limitations while preserving the core information-theoretic framework of quantifying signal regularity and complexity.


References

  1. Richman, J. S., & Moorman, J. R. (2000). Physiological time-series analysis using approximate entropy and sample entropy. American Journal of Physiology–Heart and Circulatory Physiology, 278(6), H2039–H2049. doi:10.1152/ajpheart.2000.278.6.H2039
  2. Pincus, S. M. (1991). Approximate entropy as a measure of system complexity. Proceedings of the National Academy of Sciences, 88(6), 2297–2301. doi:10.1073/pnas.88.6.2297
  3. Costa, M., Goldberger, A. L., & Peng, C.-K. (2005). Multiscale entropy analysis of biological signals. Physical Review E, 71(2), 021906. doi:10.1103/PhysRevE.71.021906
  4. Liang, Z., Wang, Y., Sun, X., Li, D., Voss, L. J., Sleigh, J. W., ... & Li, X. (2015). EEG entropy measures in anesthesia. Frontiers in Computational Neuroscience, 9, 16. doi:10.3389/fncom.2015.00016
