Title: Boosting with Structural Sparsity: A Differential Inclusion Approach
Abstract: Boosting, viewed as a gradient descent method, is arguably the "best off-the-shelf" method in machine learning. Here a novel Boosting-type algorithm is proposed based on restricted gradient descent. Specifically, we present an iterative regularization path with structural sparsity, where the parameter is sparse under some linear transform, based on the Linearized Bregman Iteration (also known as sparse mirror descent). Despite its simplicity, it outperforms the popular (generalized) Lasso in both theory and experiments. A theory of path consistency is presented, showing that, equipped with proper early stopping, the algorithm may achieve model selection consistency. A data-adaptive early stopping rule is further developed based on the Knockoff method, which aims to control the false discovery rate of causal variables. The utility and benefit of the algorithm are illustrated by various applications, including sparse variable selection, learning graphical models, etc.
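As a rough illustration of the iterative regularization path mentioned in the abstract, the standard Linearized Bregman Iteration for sparse least squares can be sketched as below. This is a minimal sketch, not the talk's exact algorithm: the step size, scale parameter `kappa`, iteration count, and the toy data are all illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Component-wise soft-thresholding (shrinkage) operator."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def linearized_bregman(A, b, kappa=10.0, n_iter=500):
    """Linearized Bregman Iteration for sparse least squares.

    Generates a regularization path (one iterate per step); an early
    stopping rule then selects a point on the path. kappa and n_iter
    are illustrative choices, not values prescribed by the talk.
    """
    p = A.shape[1]
    # Step size chosen so that alpha * kappa * ||A||_2^2 <= 1 (stability)
    alpha = 1.0 / (kappa * np.linalg.norm(A, 2) ** 2)
    z = np.zeros(p)
    x = np.zeros(p)
    path = []
    for _ in range(n_iter):
        z = z - alpha * A.T @ (A @ x - b)   # gradient step on the auxiliary variable
        x = kappa * soft_threshold(z, 1.0)  # shrinkage keeps the iterate sparse
        path.append(x.copy())
    return np.array(path)

# Toy example (hypothetical data): recover a 2-sparse signal from
# noiseless measurements; the path starts at zero and strong true
# variables tend to enter the model first.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[[3, 7]] = [2.0, -1.5]
b = A @ x_true
path = linearized_bregman(A, b)
```

Early iterates along `path` are sparser, so stopping the iteration early plays the role of the regularization parameter in Lasso-type estimators; the Knockoff-based rule in the talk is one data-adaptive way to choose that stopping time.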
Yuan YAO is currently an Associate Professor of Mathematics and Chemical & Biological Engineering at HKUST. Dr. Yao received his PhD in Mathematics from UC Berkeley under Prof. Steve Smale and worked at Stanford University and Peking University before joining HKUST in 2016. His main research interests lie in the mathematics of data science and machine learning, with applications in computational biology and information technology.