
Statistics Seminar

Department of Mathematical Sciences

| DATE: | Thursday, October 6, 2016 |
| --- | --- |
| TIME: | 1:15p-2:40p |
| LOCATION: | WH 100E |
| SPEAKER: | Pang Du, Virginia Tech |
| TITLE: | Optimal penalized function-on-function regression under a reproducing kernel Hilbert space framework |

**Abstract**

Many studies collect data in which both the response and predictor variables are functions of some covariate. Their common goal is to understand the relationship between these functional variables. Motivated by two real-life examples, we propose a new function-on-function regression model that can be used to analyze this kind of functional data. Our estimator of the 2D coefficient function is the optimizer of a form of penalized least squares, where the penalty enforces a certain level of smoothness on the estimator. Our first result is a Representer Theorem, which states that the exact optimizer of the penalized least squares resides in a data-adaptive finite-dimensional subspace, even though the optimization problem is defined on an infinite-dimensional function space. This theorem then allows easy incorporation of Gaussian quadrature into the optimization of the penalized least squares, which can be carried out through standard numerical procedures. We also show that our estimator achieves the minimax convergence rate in mean prediction under the function-on-function regression framework. Extensive simulation studies demonstrate the numerical advantages of our method over existing ones. The method is then applied to the benchmark Canadian weather data and a histone regulation study.
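The penalized least-squares idea in the abstract can be illustrated with a minimal numerical sketch. This is not the speaker's RKHS method: the integral in the model \(Y_i(t) = \int \beta(s,t)\,X_i(s)\,ds + \varepsilon_i(t)\) is discretized on regular grids, and a plain ridge penalty stands in for the smoothness penalty (the grids, sample sizes, and regularization parameter below are all illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 50, 30, 25            # number of curves, s-grid size, t-grid size
s = np.linspace(0.0, 1.0, p)    # grid for the predictor argument s
t = np.linspace(0.0, 1.0, q)    # grid for the response argument t
ds = s[1] - s[0]                # Riemann-sum weight approximating the integral

# A smooth "true" 2D coefficient surface beta(s, t), chosen for illustration
beta_true = np.outer(np.sin(np.pi * s), np.cos(np.pi * t))

# Simulated predictor curves (rows) evaluated on the s-grid,
# and noisy functional responses on the t-grid
X = rng.standard_normal((n, p))
Y = ds * X @ beta_true + 0.05 * rng.standard_normal((n, q))

# Penalized least squares: minimize ||Y - ds * X B||^2 + lam * ||B||^2.
# Ridge shrinkage here is a crude surrogate for the smoothness penalty;
# the closed-form solution handles all t-grid columns at once.
lam = 1e-3
B_hat = np.linalg.solve(ds**2 * X.T @ X + lam * np.eye(p), ds * X.T @ Y)

err = np.abs(B_hat - beta_true).mean()
print(f"mean absolute error of the coefficient surface: {err:.3f}")
```

In the talk's framework, the finite-dimensional representation is delivered exactly by the Representer Theorem rather than by grid discretization, and Gaussian quadrature replaces the simple Riemann-sum weight `ds` used above.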

seminars/stat/161006.txt · Last modified: 2016/09/28 16:18 by shang

Except where otherwise noted, content on this wiki is licensed under the following license: CC Attribution-Noncommercial-Share Alike 3.0 Unported