Location: Event in Tokyo

[ABI team seminar] Talk by Prof. Molei Tao on "Sampling and optimization in non-Euclidean spaces"

Event description

Zoom link: Emtiyaz Khan is inviting you to a scheduled Zoom meeting.
https://riken-jp.zoom.us/j/99285949815?pwd=dm5Gc2JPZzNtUHpwT2txdTF6dGJKUT09
Meeting ID: 992 8594 9815
Passcode: LekHDapE9N
Title:
{ Sampling and optimization in non-Euclidean spaces }
Abstract:
{
Machine learning in non-Euclidean spaces has attracted significant attention in recent years, and this talk will give some examples of progress on its algorithmic foundations. I will first discuss how to sample from constrained distributions by combining tools from optimal transport, geometry, optimization, and numerical analysis. Such a task is useful, e.g., for Bayesian inference. Then I will present how to efficiently optimize functions defined on manifolds, along with some machine learning applications. More specifics follow:
Part I: Mirror descent is a popular method for optimization on a constrained set. The Mirror Langevin Algorithm is an extension of mirror descent, switching from an optimization context to the task of sampling from constrained (unnormalized) probability distributions. Its continuous-time version, Mirror Langevin Dynamics, can be viewed as a special case of Riemannian Langevin Dynamics, which is a specific Riemannian Wasserstein gradient flow that modifies the Euclidean geometry of Langevin dynamics via a Hessian metric to enable rich applications. Its discretizations, on the other hand, correspond to interesting sampling algorithms. By developing a general tool for analyzing samplers based on SDE discretizations, termed mean-square analysis for sampling, we established quantitative error bounds and analyzed their dimension dependence. Joint work with Andre Wibisono, Ruilin Li, and Santosh Vempala.
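For orientation, here is a minimal sketch of the standard Mirror Langevin formulation referred to above; the notation (mirror potential phi, step size h, target density proportional to exp(-f)) is assumed for illustration, and the precise variant analyzed in the talk may differ.

```latex
% Mirror Langevin Dynamics (continuous time, written in the dual variable):
\[
  dY_t = -\nabla f\big(\nabla\phi^*(Y_t)\big)\,dt
         + \sqrt{2}\,\big[\nabla^2\phi\big(\nabla\phi^*(Y_t)\big)\big]^{1/2}\,dB_t,
  \qquad X_t = \nabla\phi^*(Y_t).
\]
% Mirror Langevin Algorithm (one Euler--Maruyama-type step in the dual variable):
\[
  y_{k+1} = y_k - h\,\nabla f(x_k) + \sqrt{2h}\,\big[\nabla^2\phi(x_k)\big]^{1/2}\xi_k,
  \qquad x_{k+1} = \nabla\phi^*(y_{k+1}), \quad \xi_k \sim \mathcal{N}(0, I).
\]
```

Here the map from the dual variable back through the conjugate mirror map keeps the iterates inside the constrained domain without explicit projections.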
Part II will report the construction of momentum-accelerated algorithms that optimize functions defined on Riemannian manifolds, focusing on a particular case known as the Stiefel manifold. The treatment will be based on a tool known as variational optimization, as well as, again, delicate interplays between continuous- and discrete-time dynamics. Two practical applications will also be described: (1) we markedly improved the performance of a trained-from-scratch Vision Transformer by appropriately wiring orthogonality into its self-attention mechanism, and (2) our optimizer also makes the useful notion of Projection Robust Wasserstein Distance for high-dimensional optimal transport even more effective. Joint work with Lingkai Kong and Yuqing Wang.
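As a rough, generic illustration of optimization on the Stiefel manifold St(n, p) = {X : X^T X = I} (not the variational-optimization construction described in the talk), the following NumPy sketch runs heavy-ball-style momentum descent with tangent-space projection and a QR retraction; all function names and the toy objective are invented for this example.

```python
import numpy as np

def project_tangent(X, G):
    """Project an ambient matrix G onto the tangent space of the
    Stiefel manifold St(n, p) = {X : X^T X = I} at the point X."""
    sym = 0.5 * (X.T @ G + G.T @ X)
    return G - X @ sym

def retract_qr(Y):
    """QR-based retraction onto St(n, p); the sign fix makes the
    factorization unique (R with a positive diagonal)."""
    Q, R = np.linalg.qr(Y)
    signs = np.sign(np.sign(np.diag(R)) + 0.5)
    return Q * signs

def stiefel_momentum_descent(grad_f, X0, lr=1e-3, beta=0.9, iters=1000):
    """Illustrative sketch only, not the algorithm from the talk.
    Heavy-ball momentum descent on the Stiefel manifold: the momentum
    buffer M is kept (approximately) tangent by re-projecting it at the
    current iterate instead of using exact parallel transport."""
    X, M = X0.copy(), np.zeros_like(X0)
    for _ in range(iters):
        M = beta * project_tangent(X, M) + project_tangent(X, grad_f(X))
        X = retract_qr(X - lr * M)
    return X

# Toy usage: find the dominant p-dimensional invariant subspace of a
# symmetric matrix A by minimizing -trace(X^T A X) over St(n, p).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50)); A = (A + A.T) / 2
X0 = retract_qr(rng.standard_normal((50, 5)))
X_star = stiefel_momentum_descent(lambda X: -2.0 * A @ X, X0)
print(np.trace(X_star.T @ A @ X_star))  # approaches the sum of the 5 largest eigenvalues of A
```

Re-projecting the momentum buffer is a common shortcut; the accelerated methods in the talk are instead derived from the interplay between continuous- and discrete-time dynamics mentioned above.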
If time permits, I will also briefly introduce how to generalize accelerated non-Euclidean optimization to sampling.
}
Bio:
{
Molei Tao received a B.S. in Math & Physics from Tsinghua University and a Ph.D. in Control & Dynamical Systems with a minor in Physics from the California Institute of Technology (Caltech). Afterwards, he worked as a postdoctoral researcher in Computing & Mathematical Sciences at Caltech from 2011 to 2012, and then as a Courant Instructor at New York University from 2012 to 2014. Since 2014, he has been an assistant and then associate professor in the School of Mathematics at Georgia Institute of Technology (GT). He is also a core faculty member of the GT Machine Learning Center, the Program in Algorithms, Combinatorics and Optimization, and the Decision and Control Laboratory. Recognitions he has received include the W.P. Carey Ph.D. Prize in Applied Mathematics (2011), American Control Conference Best Student Paper Finalist (2013), the NSF CAREER Award (2019), the AISTATS Best Paper Award (2020), IEEE EFTF-IFCS Best Student Paper Finalist (2021), the Cullen-Peck Scholar Award (2022), and the GT-Emory AI.Humanity Award (2023).
}

Date

August 18, 2023, 11:00–12:30

Organizer / Contact

RIKEN AIP Public

Venue

Venue name: Not set
Address: Open area at AIP Nihombashi; attendance via Zoom is also possible (see the description above for the link)
