Posted: Apr 27, 2023
The rapid advancement of machine learning systems has brought increasing attention to the critical issue of predictive uncertainty quantification. As these systems are deployed in high-stakes domains such as healthcare, autonomous vehicles, and financial systems, understanding and properly characterizing model uncertainty becomes paramount for safety, reliability, and trustworthiness. Traditional approaches to uncertainty quantification in machine learning have largely developed independently of information-theoretic foundations, despite the conceptual similarities between statistical entropy and predictive uncertainty measures.

Statistical entropy, as formalized by Claude Shannon in 1948, provides a fundamental measure of uncertainty or randomness in probability distributions: in information theory, entropy quantifies the average amount of information inherent in a random variable's possible outcomes. In machine learning, by contrast, predictive uncertainty is typically decomposed into epistemic uncertainty (arising from limited knowledge or data) and aleatoric uncertainty (inherent randomness in the data-generating process). Although both concepts address forms of uncertainty, their mathematical and conceptual relationships remain inadequately explored in contemporary research.

This paper addresses this gap by developing a unified framework that explicitly connects statistical entropy with predictive uncertainty in machine learning. We propose that entropy measures can serve as a foundational principle for understanding, quantifying, and decomposing the different types of uncertainty in predictive models. Our approach moves beyond conventional uncertainty quantification methods by leveraging the rich theoretical foundation of information theory to provide new insights into model behavior, calibration, and generalization.

We formulate three primary research questions. First, how can statistical entropy be mathematically related to established measures of epistemic and aleatoric uncertainty? Second, what practical benefits does an entropy-based uncertainty framework offer for model evaluation and improvement? Third, [text continues but is cut off]
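As a point of reference for the first research question, the Bayesian uncertainty literature offers a standard entropy-based decomposition that relates these quantities; the following is a sketch of that common formulation, not necessarily the exact framework developed in this paper. Here \(\mathcal{H}\) denotes Shannon entropy, \(\mathcal{I}\) mutual information, \(\theta\) the model parameters with posterior \(p(\theta \mid \mathcal{D})\), and \(y\) the prediction at input \(x\):

\[
\underbrace{\mathcal{H}\!\left[\,\mathbb{E}_{p(\theta \mid \mathcal{D})}\, p(y \mid x, \theta)\,\right]}_{\text{total predictive uncertainty}}
\;=\;
\underbrace{\mathcal{I}\!\left[\,y ; \theta \mid x, \mathcal{D}\,\right]}_{\text{epistemic}}
\;+\;
\underbrace{\mathbb{E}_{p(\theta \mid \mathcal{D})}\, \mathcal{H}\!\left[\,p(y \mid x, \theta)\,\right]}_{\text{aleatoric}},
\qquad
\mathcal{H}[p] = -\sum_{y} p(y)\,\log p(y).
\]

Under this reading, the entropy of the posterior-averaged predictive distribution splits into a mutual-information term, which shrinks as the parameter posterior concentrates with more data (epistemic), and an expected conditional entropy, which persists even with unlimited data (aleatoric).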