Friday, Oct 22 2021 - 10:00 am (GMT + 7)

Functional Gradient Descent methods for optimization and sampling

About the speaker

Qiang Liu is an Assistant Professor of Computer Science at the University of Texas at Austin (UT), where he leads the Statistical Learning & AI Group. His group had four papers accepted at this year's International Conference on Machine Learning and two at the International Conference on Learning Representations. His research focuses on artificial intelligence and machine learning, especially statistical learning methods for high-dimensional and complex data.

Abstract

Gradient descent is a fundamental optimization tool in machine learning. However, its power is not limited to the usual Euclidean space. In this talk, I will discuss several ideas for using generalized notions of gradient in infinite-dimensional spaces to solve challenging optimization and sampling problems, including: 1) Stein variational gradient descent (SVGD), which provides a deterministic mechanism for drawing samples from intractable distributions via functional steepest descent w.r.t. an RKHS-Wasserstein metric; 2) splitting steepest descent, which leverages a functional steepest-descent procedure for joint estimation of neural network weights and architectures; and 3) extensions of these ideas to handle constrained, bilevel, and multi-objective optimization.
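To make the first idea concrete, here is a minimal sketch of an SVGD-style update in one dimension, assuming an RBF kernel with a fixed bandwidth `h` and a toy Gaussian target (the bandwidth choice, step size, and target are illustrative assumptions, not details from the talk). Each particle is moved by a kernel-weighted average of the other particles' score values (the attractive term) plus a kernel-gradient term that pushes particles apart (the repulsive term):

```python
import numpy as np

def svgd_update(particles, score, h=0.5):
    """One SVGD step: the functional steepest-descent direction in an RKHS.

    particles: (n,) array of current sample locations.
    score:     function returning grad log p(x) of the target.
    h:         RBF kernel bandwidth (fixed here for simplicity).
    """
    diff = particles[:, None] - particles[None, :]   # diff[j, i] = x_j - x_i
    K = np.exp(-diff**2 / (2 * h**2))                # RBF kernel matrix k(x_j, x_i)
    grad_K = -diff / h**2 * K                        # d/dx_j k(x_j, x_i): repulsion
    n = particles.shape[0]
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) * score(x_j) + grad_j k(x_j, x_i) ]
    return (K @ score(particles) + grad_K.sum(axis=0)) / n

# Toy example: transport badly initialized particles toward N(2, 0.5^2).
rng = np.random.default_rng(0)
x = rng.normal(-5.0, 1.0, size=100)                  # start far from the target
score = lambda x: -(x - 2.0) / 0.5**2                # grad log p for N(2, 0.5^2)
for _ in range(500):
    x = x + 0.05 * svgd_update(x, score)
```

After the loop, the particle mean approaches 2 and the repulsive term keeps the particles spread out near the target's standard deviation, rather than collapsing onto the mode as plain gradient ascent on log p would.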

Upcoming Speakers

Guy Van den Broeck

University of California, Los Angeles

Tractable Probabilistic Circuits

Friday, May 27 2022 - 10:00 am (GMT + 7)

Nitesh Chawla

University of Notre Dame

Learning on Graphs: From Representation to Minimally Supervised

Thursday, Jun 02 2022 - 10:00 am (GMT + 7)