Foundations and Frontiers: Interdisciplinary Perspectives on Mathematical Optimization — Joint Event by the Continuous Optimization Team and Functional Analytic Learning Team
Event Description
Part I: Continuous Optimization Team Presentation
10:00–10:30 Akiko Takeda
Title: Introduction to the Continuous Optimization Team
Abstract: Our team, which was established in September 2016, is about to enter its 10th year this September. We will review some of our past research results and discuss future prospects.
10:30–11:00 Pierre-Louis Poirion
Title: Random Subspace Newton and Quasi-Newton Algorithms
Abstract: We present a randomized subspace regularized Newton method for non-convex optimization. We study the global and local convergence properties of the method and prove that it works particularly well in a low-rank setting. We will also present a randomized quasi-Newton method.
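As a rough illustration of the idea (a generic sketch only; the toy quadratic, subspace dimension, and regularization below are assumptions for this example, not the speaker's setup), each iteration restricts the Newton system to a random low-dimensional subspace, so only a small linear system is solved:

```python
import numpy as np

# One randomized subspace regularized Newton step: draw a random
# s-dimensional subspace, solve the reduced s-by-s regularized Newton
# system, and lift the step back to the full space.

rng = np.random.default_rng(0)

def subspace_newton_step(x, grad, hess, s=5, mu=1e-3):
    n = x.size
    P = rng.standard_normal((n, s)) / np.sqrt(s)         # random subspace basis
    g = grad(x)
    H_s = P.T @ hess(x) @ P                              # reduced s-by-s Hessian
    d = np.linalg.solve(H_s + mu * np.eye(s), -P.T @ g)  # small regularized system
    return x + P @ d                                     # lift the step back

# Toy strongly convex quadratic f(x) = 0.5 x^T A x, minimized at 0.
A = np.diag(np.linspace(1.0, 10.0, 20))
x = np.ones(20)
for _ in range(500):
    x = subspace_newton_step(x, grad=lambda v: A @ v, hess=lambda v: A)
print(np.linalg.norm(x))  # shrinks toward 0
```

Each step only factors a 5-by-5 matrix instead of the full 20-by-20 Hessian, which is the computational point of the subspace restriction.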
11:00–11:30 Jan Harold Alcantara
Title: Recent Developments in Splitting Algorithms for Nonconvex Optimization and Nonmonotone Inclusion
Abstract: Modern optimization methods increasingly leverage splitting algorithms to exploit problem structure and enable more efficient computation. In this talk, we review recent developments in the analysis of such methods within nonsmooth and nonconvex optimization, with a focus on structured nonconvex problems. We present global subsequential convergence guarantees under specific assumptions. We then extend this perspective to a broader class of problems—namely, multi-operator nonmonotone inclusion problems. In particular, we show how the Douglas–Rachford algorithm can be generalized to this multi-operator setting, and we establish conditions under which convergence can still be rigorously ensured.
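For background, here is a minimal sketch of the classical two-operator Douglas–Rachford iteration on a convex toy problem (the multi-operator, nonmonotone setting of the talk is substantially more general; the problem and parameters below are illustrative assumptions):

```python
import numpy as np

# Solve min_x ||x||_1 + 0.5 * ||x - b||^2 with Douglas-Rachford splitting.
# The closed-form solution is the soft-thresholding of b at level 1.

b = np.array([3.0, 0.5, -2.0])
prox_f = lambda v: np.sign(v) * np.maximum(np.abs(v) - 1.0, 0.0)  # prox of ||.||_1
prox_g = lambda v: (v + b) / 2.0                                  # prox of 0.5*||.-b||^2

y = np.zeros_like(b)
for _ in range(100):
    x = prox_f(y)               # first prox step
    z = prox_g(2.0 * x - y)     # second prox at the reflected point
    y = y + z - x               # Douglas-Rachford fixed-point update
print(x)  # approaches the soft-threshold of b: [2, 0, -1]
```

The method touches each term of the objective only through its own proximal map, which is the "splitting" that makes structured problems tractable.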
11:30–11:45 Coffee Break
11:45–12:30 Christophe Roux
Title: Implicit Riemannian Optimism with Applications to Min-Max Problems
Abstract: Many optimization problems, such as eigenvalue problems, principal component analysis, and low-rank matrix completion, can be interpreted as optimization problems over Riemannian manifolds, which allows the geometric structure of the problem to be exploited. While Riemannian optimization has been studied extensively in the offline setting, the online setting is not well understood. A major challenge in prior work was handling the in-manifold constraints that arise in the online setting. We leverage implicit methods to address this problem and improve over existing results, removing strong assumptions and matching the best known regret bounds from the Euclidean setting. Building on this, we develop algorithms for g-convex, g-concave smooth min-max problems on Hadamard manifolds. Notably, one method nearly matches, for the first time, the gradient oracle complexity of the lower bound known for Euclidean problems.
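As background for the manifold viewpoint (a toy sketch with made-up data, not the talk's implicit online method), the eigenvalue problem mentioned above can be phrased as maximizing a Rayleigh quotient over the unit sphere and solved by Riemannian gradient ascent:

```python
import numpy as np

# Maximize x^T A x over the unit sphere: project the Euclidean gradient
# onto the tangent space at x, take a step, and retract by renormalizing.
# The matrix A and the step size are illustrative choices.

A = np.diag([5.0, 4.0, 3.0, 2.0, 1.0])   # known top eigenvalue: 5

rng = np.random.default_rng(0)
x = rng.standard_normal(5)
x /= np.linalg.norm(x)

for _ in range(500):
    g = 2.0 * A @ x                      # Euclidean gradient of x^T A x
    rg = g - (x @ g) * x                 # projection onto the tangent space
    x = x + 0.01 * rg                    # gradient ascent step
    x /= np.linalg.norm(x)               # retraction back to the sphere

print(x @ A @ x)  # Rayleigh quotient, close to the top eigenvalue 5
```

The tangent-space projection and retraction are exactly the geometric ingredients that replace unconstrained gradient steps once the constraint set is a manifold.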
13:30–14:30 Andreas Themelis
Title: It's All in the Envelope! A Smoother Approach to Splitting Algorithms
Abstract: Splitting algorithms, such as the proximal gradient method, ADMM, and Douglas-Rachford splitting, are fundamental tools for solving structured optimization problems by decomposing them into simpler, more manageable subproblems. Because of their simplicity and modularity, a lot of research has been devoted to understanding and possibly improving their convergence behavior, especially in nonconvex settings.
This talk offers a walkthrough of the use of "proximal envelopes" as a unifying framework for analyzing splitting methods. Much like the Moreau envelope gives a smooth interpretation of the proximal point method for convex problems, these envelope functions allow us to view various splitting algorithms, even in the absence of convexity, through the lens of a "nonsmooth" gradient descent applied to a more regular surrogate. This perspective not only aids theoretical analysis and convergence guarantees, but also paves the way to "acceleration" techniques that preserve the structure and simplicity of the original methods.
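A concrete instance of the envelope idea (a standard textbook example, not material from the talk): for f(x) = |x| the Moreau envelope is the Huber function, and a gradient step on the envelope with step size gamma coincides with the proximal point step on f itself:

```python
import numpy as np

# Moreau envelope of f(x) = |x| with parameter gamma: its gradient is
# (x - prox_{gamma f}(x)) / gamma, so gradient descent on the smooth
# envelope with step gamma reproduces the proximal point method on f.

gamma = 1.0
prox = lambda x: np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)  # soft-threshold
env_grad = lambda x: (x - prox(x)) / gamma

x = 3.0
for _ in range(5):
    x_grad = x - gamma * env_grad(x)   # descent step on the envelope
    x_prox = prox(x)                   # proximal point step on f
    assert np.isclose(x_grad, x_prox)  # the two steps coincide
    x = x_grad
print(x)  # 3 -> 2 -> 1 -> 0, then stays at the minimizer 0
```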
14:50–15:50 Benjamin Poignard
Title: Sparse Factor Models of High Dimension
Abstract: We consider the estimation of a high-dimensional factor model in which the factor loading matrix is assumed sparse. The estimation problem is reformulated as a penalized M-estimation criterion, while the restrictions for identifying the factor loading matrix accommodate a wide range of sparsity patterns. We prove the sparsistency property of the penalized estimator when the number of parameters diverges, that is, the consistency of the estimator together with the recovery of the true zero entries. These theoretical results are illustrated by finite-sample simulation experiments, and the relevance of the proposed method is assessed on real data.
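To illustrate the role of the penalty in zero-pattern recovery (a toy sketch only; the eigenvector-plus-thresholding shortcut below is not the paper's penalized M-estimator), consider a one-factor model with a sparse loading vector:

```python
import numpy as np

# Simulate a one-factor model X = f * lam^T + noise with a sparse loading,
# estimate the loading from the leading eigenvector of the sample
# covariance, and soft-threshold it (the prox of an l1 penalty) so that
# the true zero entries are recovered.

rng = np.random.default_rng(0)
lam_true = np.array([2.0, 1.5, 0.0, 0.0, 0.0])    # sparse loading, 2 nonzeros
n = 2000
f = rng.standard_normal(n)                        # common factor
X = np.outer(f, lam_true) + 0.1 * rng.standard_normal((n, 5))
S = X.T @ X / n                                   # sample covariance

w, V = np.linalg.eigh(S)                          # eigenvalues ascending
lam_hat = V[:, -1] * np.sqrt(w[-1])               # crude loading estimate
lam_hat *= np.sign(lam_hat[np.argmax(np.abs(lam_hat))])   # fix sign ambiguity
lam_soft = np.sign(lam_hat) * np.maximum(np.abs(lam_hat) - 0.3, 0.0)
print(lam_soft)  # the last three coordinates are set exactly to zero
```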
15:50–16:10 Coffee Break
Part II: Functional Analytic Learning Team Presentation
16:10–17:10 Minh Ha Quang
Title: An Optimal Transport and Information Geometric Framework for Positive Operators, Infinite-Dimensional Gaussian Measures, and Gaussian Processes
Abstract: The Wasserstein and Fisher-Rao distances are two central quantities arising from the fields of Optimal Transport and Information Geometry, respectively, together with their applications in machine learning and statistics. On the set of zero-mean Gaussian densities on Euclidean space, both admit closed-form formulas. In this talk, we present their generalization to the infinite-dimensional setting of Gaussian measures on Hilbert space and Gaussian processes. In general, the exact Fisher-Rao metric formulation does not generalize to the set of all Gaussian measures on an infinite-dimensional Hilbert space. Instead, we show that on the set of all Gaussian measures equivalent to a fixed one, all finite-dimensional formulas admit direct generalizations. By employing regularization, we then obtain a formulation that is valid for all Gaussian measures on Hilbert space. The Wasserstein distance, on the other hand, is valid for all Gaussian measures on Hilbert space. Nevertheless, we show that by employing entropic regularization, many favorable theoretical properties, including convergence and differentiability, can be obtained. In the setting of Gaussian processes, using reproducing kernel Hilbert space (RKHS) methodology, we obtain consistent finite-dimensional approximations of the infinite-dimensional quantities that can be practically employed.
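For reference, the finite-dimensional closed form alluded to above: for zero-mean Gaussians, the Wasserstein-2 distance is the Bures distance between the covariance matrices (a small sketch with made-up covariances):

```python
import numpy as np

# Closed-form squared Wasserstein-2 distance between N(0, S1) and N(0, S2):
#   W2^2 = tr(S1) + tr(S2) - 2 tr( (S2^{1/2} S1 S2^{1/2})^{1/2} )

def psd_sqrt(M):
    w, V = np.linalg.eigh(M)   # symmetric eigendecomposition
    return V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.T

def bures_wasserstein_sq(S1, S2):
    r = psd_sqrt(S2)
    cross = psd_sqrt(r @ S1 @ r)
    return np.trace(S1) + np.trace(S2) - 2.0 * np.trace(cross)

S1 = np.diag([1.0, 4.0])
S2 = np.diag([9.0, 1.0])
# Commuting covariances reduce to (1-3)^2 + (2-1)^2 = 5.
print(bures_wasserstein_sq(S1, S2))  # 5.0 up to rounding
```

It is this formula whose infinite-dimensional and entropically regularized counterparts the talk develops.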
17:10–17:40 Le Thanh Tam
Title: Optimal Transport on Tree Systems and Applications
Abstract: Optimal transport (OT) provides a powerful toolkit for comparing measures. However, OT has a high computational complexity, i.e., super-cubic in the number of input supports. Several variants of the Sliced Wasserstein (SW) distance have been developed in the literature to overcome this challenge. These approaches exploit the closed-form expression of univariate OT by projecting input measures onto one-dimensional lines. However, projecting measures onto low-dimensional spaces can lead to a loss of topological information. To mitigate this issue, we propose to replace one-dimensional lines with a more advanced structure, called tree systems. This structure is metrizable by a tree metric, which yields a closed-form expression for OT on tree systems. We provide an extensive theoretical analysis to formally define tree systems, introduce the concept of splitting maps, propose novel variants of the Radon transform for tree systems, and verify their injectivity. Empirically, we illustrate that the proposed approaches perform favorably compared to SW and its variants on applications with dynamic-support measures, such as generative models and diffusion models.
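For context, a minimal sketch of the baseline Sliced Wasserstein construction the abstract builds on (equal-weight, equal-size empirical measures assumed; the talk's tree systems replace the one-dimensional lines used here):

```python
import numpy as np

# Sliced Wasserstein: average closed-form one-dimensional OT costs over
# random projection directions. For equal-weight samples of the same size,
# 1D OT reduces to matching sorted projections.

def sliced_wasserstein(X, Y, n_proj=200, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.standard_normal(d)
        theta /= np.linalg.norm(theta)                     # random direction
        total += np.mean(np.abs(np.sort(X @ theta) - np.sort(Y @ theta)))
    return total / n_proj

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 3))
Y = rng.standard_normal((500, 3)) + 2.0   # shifted point cloud
print(sliced_wasserstein(X, X))           # 0.0 for identical measures
print(sliced_wasserstein(X, Y))           # strictly positive for distinct ones
```

Each slice costs only a sort, which is what makes SW-type distances cheap relative to full OT; the loss of information under projection is the issue the tree-system construction addresses.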
Date
May 21, 2025, 10:00–17:40
Organizer / Contact
RIKEN AIP Public
Venue

| Item | Details |
|---|---|
| Place | (name not set) |
| Address | Nihonbashi 1-chome Mitsui Building, 15th floor, 1-4-1 Nihonbashi, Chuo-ku, Tokyo 103-0027, Japan |