##Statistics Seminar##\\ Department of Mathematical Sciences

~~META:title =April 16, 2015~~

^ **DATE:**|Thursday, April 16, 2015 |
^ **TIME:**|1:15pm to 2:15pm |
^ **LOCATION:**|WH 100E |
^ **SPEAKER:**|Ruiqi Liu (Binghamton University) |
^ **TITLE:**|Density estimation for power transformations—Paper Discussion |

\\
**Abstract**

I will discuss a paper by Olga Y. Savchuk and Anton Schick. Consider a random sample $X_1,\ldots,X_n$ from a density $f$. For a positive $\alpha$, the density $g$ of $t(X_1) = |X_1|^\alpha \,\mathrm{sign}(X_1)$ can be estimated in two ways: by a kernel estimator based on the transformed data $t(X_1),\ldots,t(X_n)$, or by a plug-in estimator obtained by transforming a kernel estimator based on the original data. The paper compares the performance of these two estimators in terms of MSE and MISE. For MSE, the plug-in estimator is better when $\alpha > 1$ if $f$ is symmetric and unimodal, and when $\alpha \ge 2.5$ if $f$ is right-skewed and/or bimodal. For $\alpha < 1$, the plug-in estimator performs better around the modes of $g$, while the transformed data estimator is better in the tails of $g$. For the global comparison in MISE, the plug-in estimator has a faster rate of convergence for $0.4 \le \alpha < 1$ and $1 < \alpha < 2$. For $\alpha < 0.4$, the plug-in estimator is preferable for a symmetric density $f$ with exponentially decaying tails, while the transformed data estimator performs better when $f$ is right-skewed or heavy-tailed. Applications to real and simulated data illustrate these theoretical findings.
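To make the two competing estimators concrete, the following is a minimal sketch, not taken from the paper: the function names, the Gaussian kernel, and the fixed bandwidth ''h'' are illustrative assumptions. The transformed-data estimator applies a kernel density estimate directly to $t(X_1),\ldots,t(X_n)$, while the plug-in estimator applies the change-of-variables formula $\hat g(y) = \hat f(t^{-1}(y))\,|(t^{-1})'(y)|$ to a kernel estimate $\hat f$ of the original density, with $t^{-1}(y) = |y|^{1/\alpha}\,\mathrm{sign}(y)$.

```python
import numpy as np

def gauss_kde(data, h):
    """Gaussian kernel density estimate with fixed bandwidth h,
    returned as a callable (illustrative; no bandwidth selection)."""
    def fhat(x):
        x = np.atleast_1d(np.asarray(x, dtype=float))
        u = (x[:, None] - data[None, :]) / h
        return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))
    return fhat

def transformed_data_estimator(X, alpha, h):
    """KDE built directly on the transformed sample t(X_i) = |X_i|^alpha sign(X_i)."""
    tX = np.abs(X)**alpha * np.sign(X)
    return gauss_kde(tX, h)

def plugin_estimator(X, alpha, h):
    """Plug-in estimator: transform a KDE of f via
    ghat(y) = fhat(t^{-1}(y)) * |(t^{-1})'(y)|, where
    t^{-1}(y) = |y|^(1/alpha) sign(y) and
    (t^{-1})'(y) = (1/alpha) |y|^(1/alpha - 1)."""
    fhat = gauss_kde(X, h)
    def ghat(y):
        y = np.atleast_1d(np.asarray(y, dtype=float))
        inv = np.abs(y)**(1.0 / alpha) * np.sign(y)
        jac = np.abs(y)**(1.0 / alpha - 1.0) / alpha
        return fhat(inv) * jac
    return ghat
```

Note that for $\alpha > 1$ the Jacobian factor $|y|^{1/\alpha - 1}$ blows up at $y = 0$, which is one reason the local (MSE) comparison between the two estimators depends so strongly on $\alpha$ and on the shape of $f$.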