Suppose $f$ is the true density of a random sample while $g$ is the assumed model. The Kullback-Leibler distance is defined as

$$K(f, g) = \int f \log \frac{f}{g}\, dx.$$
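As a concrete illustration (my own sketch, not part of the original derivation), the integral defining $K(f,g)$ can be approximated numerically. The helper names and the choice of two normal densities below are illustrative assumptions:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a N(mu, sigma^2) random variable."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def kl(f, g, lo=-20.0, hi=20.0, n=200_000):
    """Approximate K(f, g) = integral of f log(f/g) by a midpoint Riemann sum."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        total += f(x) * math.log(f(x) / g(x)) * dx
    return total

f = lambda x: normal_pdf(x, 0.0, 1.0)  # true density
g = lambda x: normal_pdf(x, 1.0, 1.0)  # assumed model

# For two normals with equal variance the closed form is (mu_f - mu_g)^2 / (2 sigma^2) = 0.5.
print(kl(f, g))  # ≈ 0.5
```

Note that $K(f,g)$ is not symmetric in its arguments, which is why it is an "information" (or divergence) rather than a true metric.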
As we will show below, the Kullback-Leibler information has a very useful property.

We know that $\log x \le 2(\sqrt{x} - 1)$ for all $x > 0$. Hence

$$\log \frac{g}{f} \le 2\left( \sqrt{\frac{g}{f}} - 1 \right),$$

so

$$K(f, g) = -\int f \log \frac{g}{f}\, dx \ \ge\ 2 \int f \left( 1 - \sqrt{\frac{g}{f}} \right) dx \ =\ 2 \left( 1 - \int \sqrt{f g}\, dx \right).$$
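The elementary inequality $\log x \le 2(\sqrt{x} - 1)$, with equality at $x = 1$, is easy to spot-check numerically; the grid below is an arbitrary choice of mine:

```python
import math

# Check log x <= 2(sqrt(x) - 1) for x > 0; equality holds only at x = 1.
for k in range(1, 2001):
    x = 0.01 * k  # grid over (0, 20]
    assert math.log(x) <= 2 * (math.sqrt(x) - 1) + 1e-12
print("inequality holds on the grid")
```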
Notice that the right-hand side of the inequality can be rewritten as $2 - 2\int \sqrt{fg}\, dx$, which (since a density integrates to one) is equal to

$$\int \left( f + g - 2\sqrt{fg} \right) dx \ =\ \int \left( \sqrt{f} - \sqrt{g} \right)^2 dx \ =\ H^2(f, g),$$

where

$$H(f, g) = \left( \int \left( \sqrt{f} - \sqrt{g} \right)^2 dx \right)^{1/2}$$

is the Hellinger metric.

We have thus just proved that

$$K(f, g) \ \ge\ H^2(f, g).$$

Hence **convergence of the Kullback-Leibler information always yields consistency in the Hellinger metric**.
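As a sanity check (my own numerical sketch, not taken from the references), the bound $K(f,g) \ge H^2(f,g)$ can be verified for a pair of normal densities; the specific means and variances are illustrative:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a N(mu, sigma^2) random variable."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def integrate(h, lo=-20.0, hi=20.0, n=200_000):
    """Midpoint Riemann sum of h over [lo, hi]."""
    dx = (hi - lo) / n
    return sum(h(lo + (i + 0.5) * dx) for i in range(n)) * dx

f = lambda x: normal_pdf(x, 0.0, 1.0)  # true density
g = lambda x: normal_pdf(x, 2.0, 1.5)  # assumed model

kl = integrate(lambda x: f(x) * math.log(f(x) / g(x)))              # K(f, g)
h2 = integrate(lambda x: (math.sqrt(f(x)) - math.sqrt(g(x))) ** 2)  # H^2(f, g)

print(kl, h2)
assert kl >= h2  # the inequality proved above
```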

——

**References**:

van de Geer, S. (2000). *Empirical Processes in M-Estimation*. Cambridge University Press.

van der Vaart, A. W. (2000). *Asymptotic Statistics*. Cambridge University Press.

Kullback, S. and Leibler, R. A. (1951). On Information and Sufficiency. *Annals of Mathematical Statistics*, 22(1), 79–86.

