Data Science Seminar
Hosted by the Department of Mathematics and Statistics
Abstract
In this talk, I will delve into our analysis of stochastic
gradient methods (SGMs), focusing on the interplay between
generalization and optimization within the framework of
statistical learning theory (SLT), and discuss their applications.
The core concept in our study is algorithmic stability, a notion
in SLT that characterizes how the output of a machine learning
algorithm changes upon a small perturbation of the training data.
Our theoretical studies significantly improved existing results in
the convex case and led to new insights into the generalization of
deep neural networks trained by stochastic gradient descent (SGD)
in the non-convex case. I will also discuss how to derive lower
bounds for the convergence of existing AUC optimization algorithms,
which in turn inspire a new direction for designing efficient
algorithms. Additionally, I will touch on extensions to differential
privacy and minimax problems.
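
As a concrete illustration only (not part of the talk abstract itself), one standard way to formalize algorithmic stability is the uniform stability notion of Bousquet and Elisseeff; the symbols below (A for the learning algorithm, S and S' for training sets differing in one example, ℓ for the loss, and ε for the stability parameter) are illustrative notation, not taken from the announcement:

% A minimal LaTeX sketch of uniform stability for a randomized algorithm A.
% A is \epsilon-uniformly stable if, for any two training sets S and S'
% that differ in a single example, its expected losses on every test
% point z differ by at most \epsilon:
\[
  \sup_{z}\; \mathbb{E}_{A}\bigl[\ell(A(S), z) - \ell(A(S'), z)\bigr] \;\le\; \epsilon .
\]
% Generalization bounds for stochastic gradient methods are then obtained
% by bounding \epsilon in terms of the step sizes and the number of iterations.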
Biography of the Speaker
Dr. Ying is a Professor in the Department of Mathematics and Statistics at UAlbany and the founding director of the Machine Learning Lab (ML@UA). He received his Ph.D. in mathematics from Zhejiang University, China, in 2002 and completed postdoctoral training in applied mathematics and machine learning in Hong Kong and the UK. Dr. Ying's research spans Statistical Learning Theory, Trustworthy Machine Learning, and Optimization. He is the recipient of the SUNY Chancellor's Award for Excellence in Scholarship and Creative Activities (2023) and the University of Exeter Merit Award (2012). He currently holds editorial roles at Transactions on Machine Learning Research and Neurocomputing, and is the managing editor of Mathematical Foundations of Computing. Additionally, he serves as an Area Chair for leading machine learning conferences, including NeurIPS, ICML, and AISTATS.