Foundations and Frontiers: Interdisciplinary Perspectives on Mathematical Optimization — Joint Event by the Continuous Optimization Team and Functional Analytic Learning Team
Event Description
Part I: Continuous Optimization Team Presentation
10:00–10:30 Akiko Takeda
Title: Introduction to the Continuous Optimization Team
Abstract: Our team, which was established in September 2016, is about to enter its 10th year this September. We will review some of our past research results and discuss future prospects.
10:30–11:00 Pierre-Louis Poirion
Title: Random Subspace Newton and Quasi-Newton Algorithms
Abstract: We present a randomized subspace regularized Newton method for non-convex optimization. We study the global and local convergence properties of the method and prove that it works particularly well in a low-rank setting. We will also present a randomized quasi-Newton method.
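The core idea of a randomized subspace Newton step can be sketched as follows (an illustrative toy, not the speakers' algorithm; the subspace dimension, regularization, and test problem are arbitrary choices): draw a random low-dimensional subspace, restrict the regularized Newton system to it, and step within that subspace.

```python
import numpy as np

def subspace_newton_step(grad, hess, x, s, reg=1e-6, seed=None):
    """One randomized subspace regularized Newton step (minimal sketch):
    solve the Newton system restricted to a random s-dimensional subspace."""
    rng = np.random.default_rng(seed)
    n = x.size
    P = rng.standard_normal((n, s)) / np.sqrt(s)    # random subspace basis
    g = P.T @ grad(x)                               # projected gradient
    H = P.T @ hess(x) @ P                           # projected s-by-s Hessian
    d = np.linalg.solve(H + reg * np.eye(s), -g)    # small regularized system
    return x + P @ d

# Toy usage on a strongly convex quadratic f(x) = 0.5 * x^T A x.
A = np.diag(np.arange(1.0, 11.0))
grad = lambda x: A @ x
hess = lambda x: A
x = np.ones(10)
for k in range(500):
    x = subspace_newton_step(grad, hess, x, s=4, seed=k)
print(np.linalg.norm(x))   # the iterates approach the minimizer x* = 0
```

Each iteration only factors an s-by-s matrix, which is the source of the computational savings when s is much smaller than the ambient dimension.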
11:00–11:30 Jan Harold Alcantara
Title: Recent Developments in Splitting Algorithms for Nonconvex Optimization and Nonmonotone Inclusion
Abstract: Modern optimization methods increasingly leverage splitting algorithms to exploit problem structure and enable more efficient computation. In this talk, we review recent developments in the analysis of such methods within nonsmooth and nonconvex optimization, with a focus on structured nonconvex problems. We present global subsequential convergence guarantees under specific assumptions. We then extend this perspective to a broader class of problems—namely, multi-operator nonmonotone inclusion problems. In particular, we show how the Douglas–Rachford algorithm can be generalized to this multi-operator setting, and we establish conditions under which convergence can still be rigorously ensured.
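For reference, the classical two-operator Douglas–Rachford iteration mentioned in the abstract can be written in a few lines (a textbook convex sketch; the multi-operator nonmonotone extensions discussed in the talk go well beyond this):

```python
import numpy as np

def douglas_rachford(prox_f, prox_g, z0, iters=500):
    """Textbook two-operator Douglas-Rachford iteration:
    z+ = z + prox_g(2*prox_f(z) - z) - prox_f(z)."""
    z = z0
    for _ in range(iters):
        x = prox_f(z)
        y = prox_g(2 * x - z)
        z = z + y - x
    return prox_f(z)

# Toy feasibility problem: intersect two lines in R^2.
# The prox of an indicator function is the projection onto the set.
def proj_line(p, d):                    # projection onto {p + t*d : t real}
    d = d / np.linalg.norm(d)
    return lambda z: p + d * np.dot(z - p, d)

prox_f = proj_line(np.array([0.0, 1.0]), np.array([1.0, 0.0]))  # line y = 1
prox_g = proj_line(np.array([2.0, 0.0]), np.array([0.0, 1.0]))  # line x = 2
x_star = douglas_rachford(prox_f, prox_g, np.zeros(2))
print(x_star)  # converges to the intersection point (2, 1)
```

The splitting structure is visible here: the algorithm only ever touches each set through its own prox/projection, never the intersection directly.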
11:30–11:45 Coffee Break
11:45–12:30 Christophe Roux
Title: Implicit Riemannian Optimism with Applications to Min-Max Problems
Abstract: Many optimization problems such as eigenvalue problems, principal component analysis and low-rank matrix completion can be interpreted as optimization problems over Riemannian manifolds, which allows for exploiting the geometric structure of the problems. While Riemannian optimization has been studied extensively in the offline setting, the online setting is not well understood. A major challenge in prior works was handling in-manifold constraints that arise in the online setting. We leverage implicit methods to address this problem and improve over existing results, removing strong assumptions and matching the best known regret bounds in the Euclidean setting. Building on this, we develop algorithms for g-convex, g-concave smooth min-max problems on Hadamard manifolds. Notably, one method nearly matches, for the first time, the gradient oracle complexity of the lower bound known for Euclidean problems.
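To illustrate the first sentence of the abstract: an eigenvalue problem becomes optimization over a manifold by maximizing the Rayleigh quotient on the unit sphere. A minimal offline sketch (not the talk's online or min-max methods; step size and iteration count are arbitrary) of Riemannian gradient ascent with projection and retraction:

```python
import numpy as np

# Maximize x^T A x on the unit sphere: project the Euclidean gradient
# onto the tangent space, take a step, then retract back to the sphere.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = A + A.T                                  # symmetric test matrix
x = rng.standard_normal(5)
x /= np.linalg.norm(x)
for _ in range(5000):
    egrad = 2 * A @ x                        # Euclidean gradient
    rgrad = egrad - (x @ egrad) * x          # tangent-space projection
    x = x + 0.01 * rgrad                     # ascent step
    x /= np.linalg.norm(x)                   # retraction to the sphere
rayleigh = x @ A @ x
print(rayleigh)  # approaches the largest eigenvalue of A
```

The tangent projection and retraction are exactly the geometric ingredients that replace Euclidean gradient steps in Riemannian optimization.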
13:30–14:30 Andreas Themelis
Title: It's All in the Envelope! A Smoother Approach to Splitting Algorithms
Abstract: Splitting algorithms, such as the proximal gradient method, ADMM, and Douglas-Rachford splitting, are fundamental tools for solving structured optimization problems by decomposing them into simpler, more manageable subproblems. Because of their simplicity and modularity, a lot of research has been devoted to understanding and possibly improving their convergence behavior, especially in nonconvex settings.
This talk offers a walkthrough on the use of "proximal envelopes" as a unifying framework for analyzing splitting methods. Much like the Moreau envelope gives a smooth interpretation of the proximal point method for convex problems, these envelope functions allow us to view various splitting algorithms, even in absence of convexity, through the lens of a "nonsmooth" gradient descent applied to a more regular surrogate. This perspective not only aids in theoretical analysis and convergence guarantees, but also paves the way to "acceleration" techniques that preserve the structure and simplicity of the original methods.
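Concretely, the smooth interpretation via the Moreau envelope works as follows: for convex f and parameter γ > 0,

```latex
f^{\gamma}(x) \;=\; \min_{y}\Big\{\, f(y) + \tfrac{1}{2\gamma}\,\|x-y\|^{2} \Big\},
\qquad
\nabla f^{\gamma}(x) \;=\; \tfrac{1}{\gamma}\,\big(x - \operatorname{prox}_{\gamma f}(x)\big),
```

so a gradient step on f^γ with stepsize γ is exactly the proximal point update x⁺ = prox_{γf}(x): the nonsmooth method reappears as gradient descent on a smooth surrogate with the same minimizers. The envelopes discussed in the talk extend this picture to other splitting schemes and beyond convexity.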
14:50–15:50 Benjamin Poignard
Title: Sparse Factor Models of High Dimension
Abstract: We consider the estimation of a sparse factor model where the factor loading matrix is assumed sparse. The estimation problem is reformulated as a penalized M-estimation criterion, while the restrictions for identifying the factor loading matrix accommodate a wide range of sparsity patterns. We prove the sparsistency property of the penalized estimator when the number of parameters is diverging, that is, the consistency of the estimator and the recovery of the true zero entries. These theoretical results are illustrated by finite-sample simulation experiments, and the relevance of the proposed method is assessed on real data.
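The basic device behind many penalized (sparse) M-estimators is the soft-thresholding operator, the proximal map of the l1 penalty; the sketch below is illustrative only, not the talk's estimator for loading matrices (the example matrix and penalty level are made up):

```python
import numpy as np

def soft_threshold(M, lam):
    """Proximal operator of lam * ||M||_1: shrink every entry toward zero
    and set entries below lam exactly to zero."""
    return np.sign(M) * np.maximum(np.abs(M) - lam, 0.0)

# A small dense "loading matrix" with a few near-zero entries.
L = np.array([[0.9, 0.05],
              [0.02, 1.1],
              [0.7, 0.01]])
print(soft_threshold(L, 0.1))
# small entries become exactly zero, producing a sparse pattern
```

Sparsistency, as used in the abstract, is the guarantee that such a penalized estimator recovers exactly this zero pattern (and consistent values on the nonzeros) as the sample size grows.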
15:50–16:10 Coffee Break
Part II: Functional Analytic Learning Team Presentation
16:10–17:10 Minh Ha Quang
Title: An Optimal Transport and Information Geometric Framework for Positive Operators, Infinite-Dimensional Gaussian Measures, and Gaussian Processes
Abstract: The Wasserstein and Fisher-Rao distances are two central quantities arising from the fields of Optimal Transport and Information Geometry, respectively, together with their applications in machine learning and statistics. On the set of zero-mean Gaussian densities on Euclidean space, they both admit closed-form formulas. In this talk, we present their generalization to the infinite-dimensional setting of Gaussian measures on Hilbert space and Gaussian processes. In general, the exact Fisher-Rao metric does not generalize to the set of all Gaussian measures on an infinite-dimensional Hilbert space. Instead, we show that on the set of all Gaussian measures equivalent to a fixed one, all finite-dimensional formulas admit a direct generalization. By employing regularization, we then obtain a formulation that is valid for all Gaussian measures on Hilbert space. The Wasserstein distance, on the other hand, is valid for all Gaussian measures on Hilbert space. Nevertheless, we show that by employing entropic regularization, many favorable theoretical properties, including convergence and differentiability, can be obtained. In the setting of Gaussian processes, via reproducing kernel Hilbert space (RKHS) methodology, we obtain consistent finite-dimensional approximations of the infinite-dimensional quantities that can be practically employed.
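The finite-dimensional closed form referred to in the abstract is, for the 2-Wasserstein (Bures) distance between zero-mean Gaussians, short enough to compute directly (the covariance matrices below are arbitrary examples):

```python
import numpy as np

def psd_sqrt(M):
    """Principal square root of a symmetric PSD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.T

def wasserstein2_gaussian(A, B):
    """Closed-form 2-Wasserstein distance between zero-mean Gaussians
    N(0, A) and N(0, B) in finite dimension:
    W_2^2 = tr(A) + tr(B) - 2 * tr((A^{1/2} B A^{1/2})^{1/2})."""
    rA = psd_sqrt(A)
    cross = psd_sqrt(rA @ B @ rA)
    return np.sqrt(max(np.trace(A) + np.trace(B) - 2.0 * np.trace(cross), 0.0))

A = np.diag([1.0, 4.0])
B = np.diag([4.0, 1.0])
print(wasserstein2_gaussian(A, B))  # sqrt(2) for these commuting covariances
```

The infinite-dimensional and entropically regularized versions discussed in the talk replace the matrix traces and square roots with their operator counterparts on Hilbert space.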
17:10–17:40 Le Thanh Tam
Title: Optimal Transport on Tree Systems and Applications
Abstract: Optimal transport (OT) provides a powerful toolkit for comparing measures. However, OT has a high computational complexity, namely super-cubic in the number of input supports. Several variants of Sliced Wasserstein (SW) have been developed in the literature to overcome this challenge. These approaches exploit the closed-form expression of univariate OT by projecting input measures onto one-dimensional lines. However, projecting measures onto low-dimensional spaces can lead to a loss of topological information. To mitigate this issue, we propose to replace one-dimensional lines with a more advanced structure, called tree systems. This structure is metrizable by a tree metric, which yields a closed-form expression for OT on tree systems. We derive an extensive theoretical analysis to formally define tree systems, introduce the concept of splitting maps, propose novel variants of the Radon transform for tree systems, and verify their injectivity. Empirically, we illustrate that the proposed approaches perform favorably compared to SW and its variants on applications with dynamic-support measures such as generative models and diffusion models.
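The one-dimensional closed form that SW (and, by extension, its tree-system generalization) exploits is simply optimal transport by sorting. A minimal Monte-Carlo sketch of the plain sliced 2-Wasserstein distance for equally weighted point clouds (illustrative parameters; the tree-system variants in the talk replace the lines with richer structures):

```python
import numpy as np

def sliced_wasserstein(X, Y, n_proj=200, seed=0):
    """Monte-Carlo sliced 2-Wasserstein distance between two point clouds
    of equal size: project onto random lines and use the sorted-sample
    closed form of one-dimensional optimal transport."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.standard_normal(d)
        theta /= np.linalg.norm(theta)             # random direction
        px, py = np.sort(X @ theta), np.sort(Y @ theta)
        total += np.mean((px - py) ** 2)           # 1-D W_2^2 via sorting
    return np.sqrt(total / n_proj)

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))
Y = rng.standard_normal((100, 3)) + np.array([2.0, 0.0, 0.0])
print(sliced_wasserstein(X, X))  # 0.0 for identical clouds
print(sliced_wasserstein(X, Y))  # grows with the shift between the clouds
```

Sorting costs O(n log n) per projection, which is the source of the dramatic speedup over the super-cubic cost of exact OT.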
Date
May 21, 2025, 10:00–17:40
Organizer / Contact
RIKEN AIP Public
Venue
| Item | Detail |
|---|---|
| Venue name | (not specified) |
| Address | Nihonbashi 1-chome Mitsui Building, 15th floor, 1-4-1 Nihonbashi, Chuo-ku, Tokyo 103-0027, Japan |