Data Science Seminar
Hosted by Department of Mathematical Sciences

  • Date: Tuesday, November 13, 2018
  • Time: 12:00pm – 1:00pm
  • Room: WH-100E
  • Speaker: Yudong Chen (Cornell University)
  • Title: Byzantine-Robust Distributed Learning with Non-convexity


In large-scale distributed learning, security issues have become increasingly important. Particularly in a decentralized environment, some computing units may behave abnormally, or even exhibit Byzantine failures, i.e., arbitrary and potentially adversarial behavior. In this talk, we develop ByzantinePGD, a distributed learning algorithm that is provably robust against such failures. At the core of ByzantinePGD is an efficient procedure for optimizing a non-convex loss function given only corrupted gradient information. We show that our algorithm converges to a local optimum of the population loss, escaping all saddle points. Under statistical settings of the data, we further show that our algorithm achieves near-optimal statistical error rates in both the low- and high-dimensional regimes.
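The abstract does not spell out the internals of ByzantinePGD, but the general recipe it describes — aggregate worker gradients with a robust estimator rather than a plain average, and add small perturbations to the descent direction to escape saddle points — can be illustrated with a minimal sketch. Here the coordinate-wise median stands in for the robust gradient estimator; it is an illustrative choice, not necessarily the one used in the talk, and the worker setup (`grad_fns`, the Byzantine gradient) is hypothetical:

```python
import numpy as np

def robust_aggregate(gradients):
    # Coordinate-wise median: a standard Byzantine-robust aggregator.
    # With a majority of honest workers, the median per coordinate
    # ignores arbitrarily corrupted reports.
    return np.median(np.stack(gradients), axis=0)

def byzantine_robust_gd(grad_fns, x0, lr=0.1, steps=200, noise=1e-3, rng=None):
    # Gradient descent on robustly aggregated worker gradients, with a
    # small random perturbation in the spirit of perturbed gradient
    # descent (which helps escape saddle points of non-convex losses).
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        grads = [g(x) for g in grad_fns]  # one gradient report per worker
        g = robust_aggregate(grads)
        x = x - lr * (g + noise * rng.standard_normal(x.shape))
    return x

# Toy run: minimize f(x) = ||x||^2 with 5 honest workers and 2
# Byzantine workers that always report an adversarial gradient.
honest = [lambda x: 2 * x] * 5
byzantine = [lambda x: -100 * np.ones_like(x)] * 2
x_star = byzantine_robust_gd(honest + byzantine, x0=np.ones(3))
```

Since 5 of the 7 workers are honest, the per-coordinate median equals the true gradient at every step, so the iterate still converges to near the minimizer despite the corrupted reports; with naive averaging the two Byzantine workers would drag the iterate far away.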

Bio: Yudong Chen is an assistant professor in the School of Operations Research and Information Engineering (ORIE) at Cornell University. Before joining Cornell, he was a postdoctoral scholar in the Department of Electrical Engineering and Computer Sciences at the University of California, Berkeley. He obtained his Ph.D. in Electrical and Computer Engineering from the University of Texas at Austin, and his M.S. and B.S. from Tsinghua University. His work has received the NSF CRII award and second place in the INFORMS Nicholson Student Paper Competition. His research interests include machine learning, high-dimensional and robust statistics, convex and non-convex optimization, and applications in communication and computer networks.

seminars/datasci/181113.txt · Last modified: 2018/11/06 22:09 by qiao