
Watanabe–Akaike information criterion


In statistics, the widely applicable information criterion (WAIC), also known as the Watanabe–Akaike information criterion, is a generalization of the Akaike information criterion (AIC) to singular statistical models.[1]
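Concretely, WAIC is the sum of the Bayes training loss and the functional variance divided by the sample size. The following is a sketch of the definition in the spirit of Watanabe's notation (the posterior expectation over the parameter w is written E_w, and the observations are X_1, ..., X_n); see [1] for the exact statement:

    \mathrm{WAIC} = T_n + \frac{V_n}{n}, \qquad
    T_n = -\frac{1}{n}\sum_{i=1}^{n} \log \operatorname{E}_w\!\bigl[\, p(X_i \mid w) \,\bigr], \qquad
    V_n = \sum_{i=1}^{n}\Bigl( \operatorname{E}_w\!\bigl[(\log p(X_i \mid w))^{2}\bigr] - \operatorname{E}_w\!\bigl[\log p(X_i \mid w)\bigr]^{2} \Bigr).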

The widely applicable Bayesian information criterion (WBIC) is the corresponding generalization of the Bayesian information criterion (BIC) to singular statistical models.[2]

WBIC is the average of the log likelihood function over the posterior distribution at inverse temperature 1/log n, where n is the sample size.[2]
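Written out, this is roughly the following (a sketch in the spirit of the notation of [2]; the prior is φ(w) and the tempered posterior expectation at inverse temperature β is written E_w^β):

    \mathrm{WBIC} = \operatorname{E}_w^{\beta}\!\left[ -\sum_{i=1}^{n} \log p(X_i \mid w) \right], \qquad \beta = \frac{1}{\log n},

where E_w^β averages over the distribution proportional to φ(w) \prod_{i=1}^{n} p(X_i \mid w)^{\beta}.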

Both WAIC and WBIC can be computed numerically without any information about the true distribution.
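As an illustration of such a numerical computation, the following is a minimal sketch (not taken from the cited papers) that estimates WAIC from a matrix of pointwise log-likelihood values evaluated at ordinary posterior samples; the input format and variable names are assumptions made for the example. WBIC, by contrast, requires samples drawn from the tempered posterior at inverse temperature 1/log n rather than from the ordinary posterior.

    import numpy as np
    from scipy.special import logsumexp

    def waic(log_lik):
        """Estimate WAIC from pointwise log-likelihoods.

        log_lik: array of shape (S, n) holding log p(X_i | w_s) for S posterior
        draws w_1, ..., w_S and n observations (an assumed input format).
        Returns the criterion on the scale WAIC = T_n + V_n / n.
        """
        S, n = log_lik.shape
        # Bayes training loss T_n: minus the mean log posterior-predictive density,
        # with the posterior expectation approximated by averaging over the S draws.
        T_n = -np.mean(logsumexp(log_lik, axis=0) - np.log(S))
        # Functional variance V_n: sum over observations of the posterior variance
        # of the log-likelihood, again approximated from the draws.
        V_n = np.sum(np.var(log_lik, axis=0))
        return T_n + V_n / n

    # Purely illustrative call with synthetic numbers.
    rng = np.random.default_rng(0)
    fake_log_lik = rng.normal(loc=-1.0, scale=0.1, size=(1000, 50))
    print(waic(fake_log_lik))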


References

  1. Watanabe, Sumio (2010). "Asymptotic Equivalence of Bayes Cross Validation and Widely Applicable Information Criterion in Singular Learning Theory". Journal of Machine Learning Research. 11: 3571–3594.
  2. Watanabe, Sumio (2013). "A Widely Applicable Bayesian Information Criterion". Journal of Machine Learning Research. 14: 867–897.