## Statistics Seminar ## \\ Department of Mathematical Sciences
^ **DATE:**|Thursday, August 29, 2019 |
^ **TIME:**|1:15pm -- 2:15pm |
^ **LOCATION:**|WH 100E |
^ **SPEAKER:**|Qiqing Yu, Binghamton University |
^ **TITLE:**| A Note On Application Of The Kullback-Leibler Information Inequality |
\\
**Abstract**
The Shannon-Kolmogorov inequality is often used to prove
the consistency of the maximum likelihood estimator
(MLE). This approach does not work when
E(\ln f(X)) does not exist, where f is the density function of the
random variable X.
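As an illustrative sketch (not taken from the talk), the Shannon-Kolmogorov inequality states that E_f[\ln f(X)] - E_f[\ln g(X)] >= 0 for densities f and g, with equality only when g = f. When both expectations exist, this can be checked numerically; the densities N(0,1) and N(1,2) below are an arbitrary choice for illustration.

```python
import numpy as np

# Illustrative sketch only: Monte Carlo check of the Shannon-Kolmogorov
# inequality E_f[ln f(X)] - E_f[ln g(X)] >= 0, using f = N(0,1), g = N(1,2).
# These densities are chosen for illustration, not taken from the talk.

def normal_logpdf(x, mu, sigma):
    """Log density of N(mu, sigma^2)."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

rng = np.random.default_rng(1)
x = rng.normal(loc=0.0, scale=1.0, size=200_000)  # draws from f = N(0, 1)

# Monte Carlo estimate of the K-L divergence of g from f
kl_hat = np.mean(normal_logpdf(x, 0.0, 1.0) - normal_logpdf(x, 1.0, 2.0))

# Closed form for two normals:
#   ln(s2/s1) + (s1^2 + (mu1 - mu2)^2) / (2 * s2^2) - 1/2
kl_exact = np.log(2.0 / 1.0) + (1.0 + 1.0) / (2 * 4.0) - 0.5

print(kl_hat, kl_exact)  # both positive and close to each other
```

The Monte Carlo estimate agrees with the closed-form divergence and is strictly positive, as the inequality guarantees whenever g differs from f.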
We consider several parametric families of distributions for which
E(\ln f(X)) does not exist, and instead use the
Kullback-Leibler (K-L) information inequality
to prove that the MLE is consistent.
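As a toy illustration of the consistency being proved (using a generic exponential family rather than any of the families treated in the talk), the MLE of an exponential rate has the closed form 1 / (sample mean), and its error typically shrinks as the sample size grows:

```python
import numpy as np

# Toy illustration of MLE consistency, not one of the families from the
# talk: for X ~ Exponential(rate), the MLE of the rate is 1 / sample mean.
rng = np.random.default_rng(0)
true_rate = 2.0

errors = []
for n in (100, 10_000, 1_000_000):
    x = rng.exponential(scale=1.0 / true_rate, size=n)
    rate_hat = 1.0 / x.mean()  # closed-form MLE of the exponential rate
    errors.append(abs(rate_hat - true_rate))

# The absolute error typically shrinks as n grows, reflecting consistency.
print(errors)
```

For this family E(\ln f(X)) exists, so the classical Shannon-Kolmogorov argument applies; the talk concerns families where that expectation fails to exist and the K-L information inequality is needed instead.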