Date:
Wednesday, April 6, 2022 – 12:15 to 13:15
Location:
https://gatech.zoom.us/j/95585693675
Summary Sentence:
In this talk, we will first briefly review recent works showing that zero regularization, or fitting of noise, need not be harmful in regression tasks.
Contact:
Lia Namirr
Machine Learning Center at Georgia Tech
Abstract: Seemingly counter-intuitive phenomena in deep neural networks have prompted a recent re-investigation of classical machine learning methods, such as linear models and kernel methods. Of particular focus are sufficiently high-dimensional setups in which interpolation of the training data is possible. In this talk, we will first briefly review recent works showing that zero regularization, or fitting of noise, need not be harmful in regression tasks. Then, we will use this insight to uncover two new surprises for high-dimensional linear classification:
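As a minimal illustration of the regression phenomenon the abstract alludes to (not material from the talk itself): in an overparameterized linear model, the minimum-norm least-squares solution fits the training labels, noise included, exactly, yet can still track the underlying signal on fresh data. The data dimensions and the sparse ground-truth coefficient vector below are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 500            # far more features than samples (overparameterized)
beta = np.zeros(d)
beta[:5] = 1.0            # sparse ground-truth signal (assumed for the demo)

X = rng.normal(size=(n, d))
y = X @ beta + 0.1 * rng.normal(size=n)   # noisy training labels

# Minimum-norm interpolator: beta_hat = pinv(X) @ y (zero regularization)
beta_hat = np.linalg.pinv(X) @ y

# Training error is essentially zero: the noise has been fit exactly.
train_mse = np.mean((X @ beta_hat - y) ** 2)

# Yet predictions on fresh inputs remain close to the true signal.
X_test = rng.normal(size=(1000, d))
test_mse = np.mean((X_test @ beta_hat - X_test @ beta) ** 2)

print(f"train MSE: {train_mse:.2e}, test MSE: {test_mse:.3f}")
```

Because `X` has full row rank with probability one here, `pinv(X) @ y` reproduces `y` exactly; whether the resulting test error is small depends on the geometry of the setup, which is the kind of question the cited line of work makes precise.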