Richard M. Golden, working at the University of Texas at Dallas, has made contributions at the intersection of mathematical psychology, neural network theory, and statistical methodology. His work spans the mathematical analysis of neural network learning, the development of statistical methods for evaluating cognitive models, and the application of information-theoretic principles to understanding cognitive processes.
Statistical Methods for Model Selection
The generalized information criterion (GIC) at the center of this work takes the form

    GIC = -2 log L + 2 tr(J^-1 K)

where L is the likelihood function, J is the expected Hessian of the negative log-likelihood (the sensitivity matrix), and K is the expected outer product of the score vector (the variability matrix). When the model is correctly specified, J = K, so the penalty term tr(J^-1 K) equals the number of free parameters and the GIC reduces to the AIC.
Golden has developed generalized information-theoretic criteria for model selection that extend the Akaike Information Criterion (AIC) to situations where the candidate models may be misspecified -- a common situation in cognitive modeling where all models are known to be approximations. These methods provide robust model comparison tools that account for both model fit and complexity without assuming that any candidate model is the true data-generating process.
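To make the idea concrete, here is a toy sketch (not Golden's published code) of a GIC/TIC-style criterion for a two-parameter Gaussian model, using closed-form per-observation scores and Hessians. The estimate J_hat of the sensitivity matrix and K_hat of the variability matrix follow the definitions above; when the Gaussian model is correctly specified the penalty tr(J^-1 K) is close to 2 (the parameter count) and the GIC is close to the AIC, while under misspecification the penalty inflates.

```python
import numpy as np

def gaussian_gic(x):
    """GIC for a Gaussian model fit by maximum likelihood (toy sketch)."""
    n = len(x)
    mu = x.mean()
    s2 = x.var()          # MLE of sigma^2
    resid = x - mu

    loglik = -0.5 * n * np.log(2 * np.pi * s2) - np.sum(resid**2) / (2 * s2)

    # Per-observation score with respect to (mu, sigma^2)
    scores = np.column_stack([resid / s2,
                              -0.5 / s2 + resid**2 / (2 * s2**2)])
    K_hat = scores.T @ scores / n          # variability matrix estimate

    # Average negative per-observation Hessian (sensitivity matrix estimate)
    J_hat = np.array([[1.0 / s2, np.mean(resid) / s2**2],
                      [np.mean(resid) / s2**2,
                       -0.5 / s2**2 + np.mean(resid**2) / s2**3]])

    penalty = np.trace(np.linalg.solve(J_hat, K_hat))
    gic = -2.0 * loglik + 2.0 * penalty
    aic = -2.0 * loglik + 2.0 * 2          # 2 free parameters
    return gic, aic, penalty

rng = np.random.default_rng(0)
gic_n, aic_n, pen_n = gaussian_gic(rng.normal(size=50_000))       # well-specified
gic_e, aic_e, pen_e = gaussian_gic(rng.exponential(size=50_000))  # misspecified
```

For the well-specified (normal) data the penalty estimate hovers near 2; for the exponential data it is noticeably larger, so the GIC penalizes the misspecified fit more heavily than the AIC does.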
Golden's work on the mathematical analysis of neural network models has clarified the conditions under which networks converge to optimal solutions, the relationship between network architecture and representational capacity, and the statistical properties of network learning algorithms. This work provides rigorous mathematical foundations for connectionist models that are often analyzed primarily through simulation.
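The flavor of such convergence analyses can be illustrated with the simplest possible case, a single-layer linear network trained by gradient descent on a least-squares loss. This is a minimal sketch of a standard smoothness argument, not Golden's own analysis: the loss is L-smooth with L equal to the largest eigenvalue of the Gram matrix, and any step size below 2/L guarantees convergence for this convex problem.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

# Loss (1/2n)||Xw - y||^2 is L-smooth with L = lambda_max(X^T X / n);
# gradient descent converges for any step size eta < 2/L.
H = X.T @ X / n
L_smooth = np.linalg.eigvalsh(H).max()
eta = 1.0 / L_smooth

w = np.zeros(d)
for _ in range(2000):
    grad = X.T @ (X @ w - y) / n
    w -= eta * grad

# Closed-form least-squares solution for comparison
w_opt = np.linalg.lstsq(X, y, rcond=None)[0]
```

After 2000 iterations the iterate w agrees with the closed-form solution to high precision; with eta above 2/L the same loop would diverge, which is the kind of architecture- and step-size-dependent condition a rigorous analysis pins down.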
Information Theory in Cognition
Golden has applied information-theoretic methods to problems in cognitive modeling, including the development of entropy-based measures for assessing model adequacy and the use of Kullback-Leibler divergence for quantifying the information loss when a model approximates the true data-generating process. These methods connect the practical task of model evaluation to deep theoretical concepts about the nature of information and statistical inference.
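A brief illustration of the second idea, under the simplifying assumption of discrete distributions: the Kullback-Leibler divergence D_KL(p || q) measures the expected log-likelihood loss from using model q when data actually come from p. It is zero exactly when the model matches the generating process, and it is asymmetric, which is why the direction of the comparison matters in model evaluation.

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i) for discrete distributions.

    Quantifies the information lost when q is used to approximate p.
    Terms with p_i = 0 contribute nothing (0 * log 0 = 0 by convention).
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical "true" generating distribution and two candidate models
p  = [0.5, 0.3, 0.2]
q1 = [0.5, 0.3, 0.2]   # exact match: zero information loss
q2 = [1/3, 1/3, 1/3]   # uniform approximation: positive information loss

loss_exact = kl_divergence(p, q1)
loss_uniform = kl_divergence(p, q2)
```

Here loss_exact is 0 and loss_uniform is positive, ranking the candidate models by how much information each loses relative to the generating distribution.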
Legacy and Impact
Golden's contributions span the theoretical foundations of cognitive modeling. His textbook Mathematical Methods for Neural Network Analysis and Design has provided researchers with rigorous treatments of the mathematical and statistical tools needed to analyze neural network models and evaluate formal cognitive models. His work exemplifies the importance of sound statistical methodology in mathematical psychology, ensuring that conclusions drawn from model fitting are justified by the data.