Topic: Gradient Boosting: Overview, Theory and Applications to Big Data Analytics
Time: 2:30 pm - 3:30 pm
Venue: T. Y. Wong Hall, Ho Sin-Hang Engineering Building, The Chinese University of Hong Kong
Speaker: Professor Tze Leung LAI
We begin with a review of the history of gradient boosting, dating back to the LMS algorithm of Widrow and Hoff in 1960 and culminating in Freund and Schapire’s AdaBoost in the mid-1990s and Friedman’s gradient boosting and stochastic gradient boosting algorithms in 1999–2002, which heralded the big data era. We then discuss the role played by gradient boosting in big data analytics, particularly in relation to deep learning. We also present some recent work on the mathematical theory of gradient boosting, which has led to refinements that greatly improve the convergence properties and prediction performance of the methodology.
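For readers unfamiliar with the algorithm named in the abstract, the core idea of Friedman's least-squares gradient boosting is to build an additive model by repeatedly fitting a weak learner to the negative gradient of the loss (for squared error, simply the current residuals); subsampling the data at each round gives the stochastic variant. The following is a minimal illustrative sketch in NumPy using regression stumps on a single feature — the function names and parameters are this sketch's own, not the speaker's notation:

```python
import numpy as np

def fit_stump(x, r):
    """Fit a one-split regression stump on feature x to residuals r."""
    best_sse, best = np.inf, None
    for t in np.unique(x)[:-1]:          # largest value would leave the right side empty
        left = x <= t
        lm, rm = r[left].mean(), r[~left].mean()
        sse = ((r[left] - lm) ** 2).sum() + ((r[~left] - rm) ** 2).sum()
        if sse < best_sse:
            best_sse, best = sse, (t, lm, rm)
    return best

def stump_predict(stump, x):
    t, lm, rm = stump
    return np.where(x <= t, lm, rm)

def gradient_boost(x, y, n_rounds=50, learning_rate=0.1, subsample=1.0, seed=0):
    """Least-squares gradient boosting with stump base learners.

    Each round fits a stump to the negative gradient of the squared loss,
    i.e. the current residuals.  With subsample < 1.0 each stump is fit on
    a random fraction of the data (the stochastic variant).
    """
    rng = np.random.default_rng(seed)
    f0 = y.mean()                        # F_0: the best constant predictor
    pred = np.full(y.shape, f0)
    stumps = []
    for _ in range(n_rounds):
        residual = y - pred              # negative gradient of 0.5 * (y - F)^2
        idx = rng.choice(len(y), size=max(2, int(subsample * len(y))), replace=False)
        stump = fit_stump(x[idx], residual[idx])
        pred = pred + learning_rate * stump_predict(stump, x)
        stumps.append(stump)
    return f0, stumps

def boost_predict(f0, stumps, x, learning_rate=0.1):
    return f0 + learning_rate * sum(stump_predict(s, x) for s in stumps)
```

For example, fitting `gradient_boost(x, np.sin(x))` on a grid of points drives the training error well below that of the constant baseline after a few dozen rounds; the small `learning_rate` shrinkage is what gives boosting its characteristic slow, stable descent.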