Topic: Optimal Sparse Regression Learning and Model Compression
Date: 30/05/2023
Time: 2:30 pm - 3:30 pm
Venue: LT3, Lady Shaw Building, The Chinese University of Hong Kong
Category: Distinguished Lecture
Speaker: Professor Yuhong YANG
PDF: R20230530-DL-Yang-A3V2.pdf
Details: Abstract: Minimax-rate optimality plays a foundational role in the theory of statistical/machine learning. In the context of regression, some key questions are: i) What determines the minimax rate of convergence for regression estimation? ii) Is it possible to construct estimators that are simultaneously minimax optimal for a countable list of function classes? iii) In high-dimensional linear regression, how do different kinds of sparsity affect the rate of convergence? iv) How do we know whether a pre-trained deep neural network model is compressible, and if so, by how much? In this talk, we will address the above questions. After reviewing the determination of the minimax rate of convergence, we will present minimax-optimal adaptive estimation for high-dimensional regression learning under both hard and soft sparsity setups, taking advantage of sharp sparse linear approximation bounds. An application to model compression in neural network learning will be given.
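For readers unfamiliar with the terminology, the following is a minimal sketch of the two sparsity regimes mentioned in the abstract, using the standard textbook definitions; the exact parameter classes considered in the talk may differ.

```latex
% Standard constraint sets on the coefficient vector \beta \in \mathbb{R}^p
% in high-dimensional linear regression (textbook definitions; the talk's
% exact setup may differ).
%
% Hard sparsity: at most k of the p coefficients are nonzero.
\[
  B_0(k) = \bigl\{ \beta \in \mathbb{R}^p : \|\beta\|_0 \le k \bigr\},
  \qquad \|\beta\|_0 = \#\{ j : \beta_j \neq 0 \}.
\]
% Soft sparsity: an \ell_q ball with 0 < q \le 1, which permits many small
% but nonzero coefficients as long as they decay fast enough.
\[
  B_q(R) = \Bigl\{ \beta \in \mathbb{R}^p : \textstyle\sum_{j=1}^{p} |\beta_j|^q \le R \Bigr\}.
\]
```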