Mathematical Psychology

Mutual Information

Mutual information measures the statistical dependence between two random variables, quantifying how much knowing one variable reduces uncertainty about the other.

I(X;Y) = H(X) + H(Y) − H(X,Y)

Mutual information (MI) is a fundamental quantity in information theory that measures the amount of information one random variable contains about another. Unlike linear correlation, MI captures all types of statistical dependencies, including nonlinear relationships, making it particularly valuable for analyzing neural coding and stimulus-response relationships.

Mutual Information
I(X;Y) = H(X) + H(Y) − H(X,Y)
       = H(X) − H(X|Y)
       = Σ_x Σ_y p(x,y) · log₂[p(x,y) / (p(x)·p(y))]
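
These identities are straightforward to verify numerically. A minimal sketch in Python (NumPy assumed available), computing I(X;Y) from a small illustrative joint probability table via the entropy identity:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits; zero-probability cells contribute nothing."""
    p = np.asarray(p, dtype=float).flatten()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint probability matrix pxy."""
    px = pxy.sum(axis=1)  # marginal distribution of X (rows)
    py = pxy.sum(axis=0)  # marginal distribution of Y (columns)
    return entropy(px) + entropy(py) - entropy(pxy)

# Illustrative joint distribution over two binary variables
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
print(mutual_information(pxy))  # ≈ 0.278 bits
```

The same value falls out of the double-sum form, since the two expressions are algebraically identical.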

Properties and Interpretation

MI is always non-negative and equals zero if and only if X and Y are statistically independent. It is symmetric: I(X;Y) = I(Y;X). Normalizing MI by dividing by the geometric mean of the marginal entropies gives a value between 0 and 1, facilitating comparison across different variable pairs.
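
Both properties can be checked directly. The sketch below (self-contained, with the same entropy helper redefined) confirms that MI vanishes for an independent joint distribution and that the geometric-mean normalization reaches 1 for perfectly dependent variables:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability cells."""
    p = np.asarray(p, dtype=float).flatten()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(pxy):
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(pxy)

def normalized_mi(pxy):
    """I(X;Y) / sqrt(H(X) * H(Y)), bounded between 0 and 1."""
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    return mutual_information(pxy) / np.sqrt(entropy(px) * entropy(py))

# Independence: p(x,y) = p(x) * p(y), so MI = 0
independent = np.outer([0.3, 0.7], [0.6, 0.4])
print(mutual_information(independent))  # ~0 (up to floating-point error)

# Perfect dependence: Y determines X, so normalized MI = 1
identical = np.array([[0.5, 0.0],
                      [0.0, 0.5]])
print(normalized_mi(identical))  # 1.0
```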

Applications in Psychology and Neuroscience

In neuroscience, MI quantifies how much information neural responses carry about stimuli, providing a measure of neural coding efficiency. In psychophysics, MI between stimulus categories and responses provides a criterion-free measure of information transmission that complements SDT measures. In language research, MI between adjacent words (pointwise MI) is used to identify collocations and measure word associations.
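
For the language-research case, pointwise MI for a single word pair is the log₂ ratio of the pair's joint probability to the product of its marginals. A minimal sketch; the bigram counts are purely illustrative, not drawn from any real corpus:

```python
import math

def pmi(p_xy, p_x, p_y):
    """Pointwise mutual information in bits for one outcome pair."""
    return math.log2(p_xy / (p_x * p_y))

# Hypothetical corpus statistics for a candidate collocation
n_tokens = 1_000_000
count_w1, count_w2, count_pair = 1_500, 800, 40
score = pmi(count_pair / n_tokens,
            count_w1 / n_tokens,
            count_w2 / n_tokens)
print(score)  # large positive PMI suggests a collocation
```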

Interactive Calculator

Each row provides a joint observation: x (stimulus category) and y (response category). The calculator computes mutual information I(X;Y) = H(X) + H(Y) − H(X,Y) from the observed frequencies.

Click Calculate to see results, or Animate to watch the statistics update one record at a time.
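
The computation behind the calculator can be sketched as follows: tally joint frequencies from the (x, y) observation rows, normalize to probabilities, and apply the double-sum formula. This is a plug-in estimate from observed frequencies; the records below are illustrative:

```python
from collections import Counter
from math import log2

def mi_from_observations(pairs):
    """Plug-in estimate of I(X;Y) in bits from joint (x, y) observations."""
    n = len(pairs)
    joint = Counter(pairs)              # joint frequency table
    px = Counter(x for x, _ in pairs)   # marginal counts of X
    py = Counter(y for _, y in pairs)   # marginal counts of Y
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in joint.items())

# Illustrative stimulus/response records (x = stimulus, y = response)
records = [("A", 1), ("A", 1), ("A", 2), ("B", 2), ("B", 2), ("B", 1)]
print(mi_from_observations(records))  # ≈ 0.082 bits
```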


