Hannan–Quinn information criterion

In statistics, the Hannan–Quinn information criterion (HQC) is a criterion for model selection. It is an alternative to the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). It is given as

HQC = −2 L_max + 2k ln(ln(n)),

where L_max is the maximized value of the log-likelihood function, k is the number of parameters, and n is the number of observations.
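The formula can be computed directly from these three quantities. A minimal sketch in Python (the function name `hqc` and its argument names are illustrative, not from any particular library):

```python
import math

def hqc(log_likelihood_max, k, n):
    """Hannan-Quinn criterion: -2*L_max + 2*k*ln(ln(n)).

    log_likelihood_max: maximized log-likelihood of the fitted model
    k: number of estimated parameters
    n: number of observations (must exceed e, so that ln(ln(n)) is defined)
    """
    return -2.0 * log_likelihood_max + 2.0 * k * math.log(math.log(n))

# Lower HQC is better: a higher likelihood lowers the score, while each
# extra parameter adds a penalty of 2*ln(ln(n)).
print(hqc(-120.5, 3, 200))
```

As with AIC and BIC, the criterion is only meaningful for comparing models fitted to the same data; the model with the smallest HQC is preferred.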

Burnham & Anderson (2002, p. 287) say that HQC, "while often cited, seems to have seen little use in practice". They also note that HQC, like BIC, but unlike AIC, is not an estimator of Kullback–Leibler divergence. Claeskens & Hjort (2008, ch. 4) note that HQC, like BIC, but unlike AIC, is not asymptotically efficient; however, it misses the optimal estimation rate only by a very small ln(ln(n)) factor. They further point out that whatever method is used for fine-tuning the criterion will matter more in practice than the ln(ln(n)) term, since this number is small even for very large n; however, the ln(ln(n)) term ensures that, unlike AIC, HQC is strongly consistent. It follows from the law of the iterated logarithm that any strongly consistent method must miss efficiency by at least a ln(ln(n)) factor, so in this sense HQC is asymptotically very well-behaved. Van der Pas and Grünwald prove that model selection based on a modified Bayesian estimator, the so-called switch distribution, in many cases behaves asymptotically like HQC, while retaining the advantages of Bayesian methods such as the use of priors.
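To see numerically why the ln(ln(n)) penalty places HQC between AIC and BIC, compare the per-parameter penalty weights of the three criteria. This sketch assumes the standard forms AIC = −2L + 2k and BIC = −2L + k ln(n):

```python
import math

def penalty_weights(n):
    """Per-parameter penalty added by each criterion for n observations."""
    return {
        "AIC": 2.0,                            # constant, does not grow with n
        "HQC": 2.0 * math.log(math.log(n)),    # grows, but extremely slowly
        "BIC": math.log(n),                    # grows logarithmically
    }

for n in (100, 10_000, 1_000_000):
    w = penalty_weights(n)
    print(n, {name: round(v, 2) for name, v in w.items()})
```

For n = 1,000,000 the HQC weight is about 5.3, compared with 2 for AIC and about 13.8 for BIC: large enough to grow without bound (giving strong consistency, unlike AIC), yet as small as the law of the iterated logarithm permits for a strongly consistent rule.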
