SEMINAR

Functional Gradient Descent methods for optimization and sampling

Speaker

Qiang Liu

Working
The University of Texas
Timeline
Fri, Oct 22 2021 - 10:00 am (GMT + 7)
About Speaker

Qiang Liu is an Assistant Professor of Computer Science at the University of Texas at Austin (UT). Dr. Liu leads the Statistical Learning & AI Group at UT and has published several recent papers in machine learning: his research group had four papers accepted at this year's International Conference on Machine Learning (ICML) and two at the International Conference on Learning Representations (ICLR). His research focuses on artificial intelligence and machine learning, especially statistical learning methods for high-dimensional and complex data.

Abstract

Gradient descent is a fundamental optimization tool in machine learning. However, its power is not limited to the typical Euclidean space. In this talk, I will discuss several ideas for using generalized notions of gradient in infinite-dimensional spaces to solve challenging optimization and sampling problems, including 1) Stein variational gradient descent, which provides a deterministic mechanism for drawing samples from intractable distributions using a functional steepest descent w.r.t. an RKHS-Wasserstein metric; 2) splitting steepest descent for neural architectures, which leverages a functional steepest descent procedure for joint estimation of neural network weights and structures; and 3) extensions of these ideas for handling constrained, bilevel and multi-objective optimization.
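
To make the first idea concrete, here is a minimal NumPy sketch of one Stein variational gradient descent (SVGD) update on a toy Gaussian target. The fixed RBF bandwidth, the function names, and the example target are illustrative assumptions for this sketch, not material from the talk or the speaker's reference implementation.

import numpy as np

def rbf_kernel(X, h=1.0):
    # Pairwise RBF kernel K[i, j] = exp(-||x_i - x_j||^2 / (2 h^2)) and its
    # gradient w.r.t. the first argument. A fixed bandwidth h is assumed here
    # for simplicity (in practice a median heuristic is often used).
    diffs = X[:, None, :] - X[None, :, :]          # (n, n, d)
    sq_dists = np.sum(diffs ** 2, axis=-1)         # (n, n)
    K = np.exp(-sq_dists / (2 * h ** 2))           # (n, n)
    grad_K = -diffs / h ** 2 * K[:, :, None]       # grad_{x_i} k(x_i, x_j)
    return K, grad_K

def svgd_step(X, grad_log_p, step_size=0.1, h=1.0):
    # One SVGD update on the particle set X:
    # x_i <- x_i + eps * (1/n) sum_j [ k(x_j, x_i) grad log p(x_j)
    #                                  + grad_{x_j} k(x_j, x_i) ]
    # i.e. an attraction toward high-density regions plus a repulsion
    # between particles that keeps them spread out.
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    phi = (K @ grad_log_p(X) + grad_K.sum(axis=0)) / n
    return X + step_size * phi

if __name__ == "__main__":
    # Toy example: sample from a standard 2D Gaussian, so grad log p(x) = -x.
    rng = np.random.default_rng(0)
    X = rng.normal(loc=5.0, size=(100, 2))         # particles start far from the target
    for _ in range(500):
        X = svgd_step(X, lambda x: -x, step_size=0.1)
    print("particle mean (should be near 0):", X.mean(axis=0))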

Related seminars

Anh Nguyen

Microsoft GenAI

The Revolution of Small Language Models
Fri, Mar 8 2024 - 02:30 pm (GMT + 7)

Thang D. Bui

Australian National University (ANU)

Recent Progress on Grokking and Probabilistic Federated Learning
Fri, Jan 26 2024 - 10:00 am (GMT + 7)

Tim Baldwin

MBZUAI, The University of Melbourne

LLMs FTW
Tue, Jan 9 2024 - 10:30 am (GMT + 7)

Quan Vuong

Google DeepMind

Scaling Robot Learning
Wed, Dec 27 2023 - 10:00 am (GMT + 7)