The Bayesian information criterion: background, derivation, and applications
Key takeaways
Bibliography: Neath, A.A., Cavanaugh, J.E., 2012. The Bayesian information criterion: background, derivation, and applications. WIREs Computational Stats 4, 199–203. https://doi.org/10.1002/wics.199
Authors:: Andrew A. Neath, Joseph E. Cavanaugh
Collections:: Methods
First-page:: 199
Abstract: The Bayesian information criterion (BIC) is one of the most widely known and pervasively used tools in statistical model selection. Its popularity is derived from its computational simplicity and effective performance in many modeling frameworks, including Bayesian applications where prior distributions may be elusive. The criterion was derived by Schwarz (Ann Stat 1978, 6:461–464) to serve as an asymptotic approximation to a transformation of the Bayesian posterior probability of a candidate model. This article reviews the conceptual and theoretical foundations for BIC, and also discusses its properties and applications. WIREs Comput Stat 2012, 4:199–203. doi: 10.1002/wics.199

This article is categorized under:
- Statistical and Graphical Methods of Data Analysis > Bayesian Methods and Theory
- Statistical and Graphical Methods of Data Analysis > Information Theoretic Methods
- Statistical Learning and Exploratory Methods of the Data Sciences > Modeling Methods
Reading notes
Imported on 2024-05-06 13:44
💡 Main ideas and conclusions
- "To introduce BIC, consider the following model selection framework. Suppose we endeavor to find a suitable model to describe" (p. 199)
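As a concrete illustration of the model selection framework the quote introduces, the following is a minimal sketch (not from the paper) of BIC-based selection among candidate models, using the standard form BIC = k·ln(n) − 2·ln(L̂). The polynomial-regression setup, the `gaussian_bic` helper, and all data values are illustrative assumptions, not Neath and Cavanaugh's example.

```python
import numpy as np

def gaussian_bic(y, y_hat, k):
    """BIC = k*ln(n) - 2*ln(L_hat) for a Gaussian model whose error
    variance is profiled out at its MLE, sigma^2 = RSS/n."""
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    # Maximized Gaussian log-likelihood with sigma^2 = rss/n plugged in
    log_lik = -0.5 * n * (np.log(2 * np.pi * rss / n) + 1)
    return k * np.log(n) - 2 * log_lik

# Simulated data from a truly linear model (assumed, for illustration)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
y = 2.0 + 3.0 * x + rng.normal(scale=0.1, size=x.size)

# Candidate models: polynomials of increasing degree
bics = {}
for degree in (1, 2, 5):
    coefs = np.polyfit(x, y, degree)
    y_hat = np.polyval(coefs, x)
    # k counts the degree+1 coefficients plus the variance parameter
    bics[degree] = gaussian_bic(y, y_hat, k=degree + 2)

best = min(bics, key=bics.get)
print(best)  # the lowest-BIC degree; the ln(n) penalty favors parsimony
```

Because each extra parameter must raise the log-likelihood by about ln(n)/2 to pay for itself, BIC here should prefer the simple linear fit over the overparameterized degree-5 model.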