Title: Robust Deep Learning under Distribution Shift
Speaker: Dr Yingyu Liang, Assistant Professor, Department of Computer Science, University of Wisconsin-Madison
Date: June 29, 2023
Time: 10:30am – 11:30am
Venue: Tam Wing Fan Innovation Wing II / Zoom

Mode: Hybrid. Seats for on-site participants are limited. A confirmation email will be sent to participants who have successfully registered.


Deep learning has achieved remarkable success in various application domains such as computer vision, natural language processing, and game playing. However, this success rests on the assumption that the test data distribution is identical to the training data distribution. In practice, this assumption usually does not hold, leading to distribution shift, and deep neural networks often suffer a significant drop in performance as a result. There are two kinds of distribution shift: one occurs naturally during the data collection process, while the other is constructed by an adversary. I will discuss our recent research on addressing both kinds. Specifically, I will talk about how to estimate the generalization of deep neural networks at test time under distribution shift, and how to use selective prediction to enhance adversarial robustness.
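To give a flavor of the selective-prediction idea mentioned above, here is a minimal, generic sketch (not the speaker's method): the model abstains whenever its top-class confidence falls below a threshold, trading coverage for accuracy on the inputs it does answer.

```python
import numpy as np

def selective_predict(probs, threshold=0.7):
    """Return the predicted class per row, or -1 to abstain.

    probs: array of shape (n_samples, n_classes) of class probabilities.
    threshold: hypothetical confidence cutoff below which the model abstains.
    """
    conf = probs.max(axis=1)        # top-class confidence per input
    preds = probs.argmax(axis=1)    # most likely class per input
    preds[conf < threshold] = -1    # abstain on low-confidence inputs
    return preds

# Toy example: a confident prediction and an uncertain one.
probs = np.array([
    [0.90, 0.10],   # confident -> predict class 0
    [0.55, 0.45],   # uncertain -> abstain
])
print(selective_predict(probs))  # [ 0 -1]
```

Under adversarial shift, attacked inputs often land in low-confidence regions, so abstaining on them can raise accuracy on the answered subset.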


Dr Yingyu Liang
Assistant Professor @ Department of Computer Science, University of Wisconsin-Madison; Recipient of the NSF CAREER Award

Yingyu Liang is an Assistant Professor at the University of Wisconsin-Madison. He received his Ph.D. from the Georgia Institute of Technology and was a postdoctoral researcher at Princeton University. His research aims to provide theoretical foundations for modern machine learning models and to design effective algorithms for real-world applications. His recent focuses include optimization and generalization in deep learning, robust machine learning, and their applications. He is a recipient of the NSF CAREER Award.

Dr Liang’s full profile can be accessed here:

For information, please contact: