Foundations and Frontiers: Interdisciplinary Perspectives on Mathematical Optimization — Joint Event by the Continuous Optimization Team and Functional Analytic Learning Team
Event Description
Part I: Continuous Optimization Team Presentation
10:00–10:30 Akiko Takeda
Title: Introduction of Continuous Optimization Team
Abstract: Our team, which was established in September 2016, is about to enter its 10th year this September. We will review some of our past research results and discuss future prospects.
10:30–11:00 Pierre-Louis Poirion
Title: Random Subspace Newton and Quasi-Newton Algorithms
Abstract: We present a randomized subspace regularized Newton method for non-convex optimization. We will study the global and local convergence properties of the method and prove that it works particularly well in a low-rank setting. We will also present a randomized quasi-Newton method.
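For orientation, the sketch below illustrates the general idea of a randomized subspace regularized Newton step, assuming a Gaussian sketching matrix and a fixed regularization parameter; the function name, step rule, and parameter choices are illustrative and are not the exact algorithm presented in the talk.

```python
import numpy as np

def rs_newton(grad, hess, x0, s=5, lam=1e-3, lr=1.0, iters=100, seed=0):
    """Illustrative randomized subspace regularized Newton iteration.

    At each step a random Gaussian sketch P (n x s) restricts the Newton
    system to an s-dimensional subspace: solve
        (P^T H P + lam I) d = -P^T g,   then update x <- x + lr * P @ d.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(iters):
        g = grad(x)
        H = hess(x)
        P = rng.standard_normal((n, s)) / np.sqrt(s)   # random subspace basis
        Hs = P.T @ H @ P + lam * np.eye(s)             # reduced, regularized Hessian
        gs = P.T @ g                                   # reduced gradient
        d = np.linalg.solve(Hs, -gs)                   # small s x s Newton system
        x = x + lr * (P @ d)
    return x

# Toy usage: minimize the quadratic f(x) = 0.5 x^T A x - b^T x
n = 50
A = np.diag(np.linspace(1.0, 10.0, n))
b = np.ones(n)
x_hat = rs_newton(lambda x: A @ x - b, lambda x: A, np.zeros(n), s=10, iters=500)
```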
11:00–11:30 Jan Harold Alcantara
Title: Recent Developments in Splitting Algorithms for Nonconvex Optimization and Nonmonotone Inclusion
Abstract: Modern optimization methods increasingly leverage splitting algorithms to exploit problem structure and enable more efficient computation. In this talk, we review recent developments in the analysis of such methods within nonsmooth and nonconvex optimization, with a focus on structured nonconvex problems. We present global subsequential convergence guarantees under specific assumptions. We then extend this perspective to a broader class of problems—namely, multi-operator nonmonotone inclusion problems. In particular, we show how the Douglas–Rachford algorithm can be generalized to this multi-operator setting, and we establish conditions under which convergence can still be rigorously ensured.
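As a reference point for the multi-operator extension discussed in the talk, here is a minimal sketch of the classical two-function Douglas–Rachford iteration for minimizing f(x) + g(x) via proximal maps; the toy proximal operators below are illustrative choices, not taken from the talk.

```python
import numpy as np

def douglas_rachford(prox_f, prox_g, z0, gamma=1.0, iters=200):
    """Classical two-function Douglas-Rachford splitting for min f(x) + g(x).

    Iteration: x_k = prox_{gamma f}(z_k),
               y_k = prox_{gamma g}(2 x_k - z_k),
               z_{k+1} = z_k + y_k - x_k.
    The candidate solution is x_k.
    """
    z = np.asarray(z0, dtype=float)
    for _ in range(iters):
        x = prox_f(z, gamma)
        y = prox_g(2 * x - z, gamma)
        z = z + y - x
    return prox_f(z, gamma)

# Toy usage: f(x) = 0.5 ||x - a||^2, g(x) = lam * ||x||_1
a = np.array([3.0, -0.2, 0.05, -4.0])
lam = 1.0
prox_f = lambda v, t: (v + t * a) / (1 + t)                              # prox of t*f
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)  # soft-thresholding
x_hat = douglas_rachford(prox_f, prox_g, np.zeros_like(a))
```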
11:30–11:45 Coffee Break
11:45–12:30 Christophe Roux
Title: Implicit Riemannian Optimism with Applications to Min-Max Problems
Abstract: Many optimization problems, such as eigenvalue problems, principal component analysis, and low-rank matrix completion, can be interpreted as optimization problems over Riemannian manifolds, which makes it possible to exploit the geometric structure of the problems. While Riemannian optimization has been studied extensively in the offline setting, the online setting is not well understood. A major challenge in prior works was handling the in-manifold constraints that arise in the online setting. We leverage implicit methods to address this problem and improve over existing results, removing strong assumptions and matching the best known regret bounds in the Euclidean setting. Building on this, we develop algorithms for g-convex, g-concave smooth min-max problems on Hadamard manifolds. Notably, one of our methods nearly matches, for the first time, the lower bound on gradient oracle complexity known for Euclidean problems.
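As a small offline illustration of the manifold viewpoint mentioned in the abstract (the talk itself concerns online and min-max settings), the sketch below runs Riemannian gradient ascent on the unit sphere to compute a leading eigenvector; the step size and retraction are illustrative choices.

```python
import numpy as np

def leading_eigvec_sphere(A, x0, lr=0.1, iters=500):
    """Riemannian gradient ascent on the unit sphere for max_x x^T A x.

    The Euclidean gradient 2 A x is projected onto the tangent space at x,
    and renormalization is used as the retraction back onto the sphere.
    """
    x = np.asarray(x0, dtype=float)
    x = x / np.linalg.norm(x)
    for _ in range(iters):
        egrad = 2 * A @ x                    # Euclidean gradient of x^T A x
        rgrad = egrad - (x @ egrad) * x      # tangent-space projection
        x = x + lr * rgrad                   # ascent step
        x = x / np.linalg.norm(x)            # retraction onto the sphere
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
v = leading_eigvec_sphere(A, np.array([1.0, 0.0]))  # approximate leading eigenvector of A
```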
13:30–14:30 Andreas Themelis
Title: It's All in the Envelope! A Smoother Approach to Splitting Algorithms
Abstract: Splitting algorithms, such as the proximal gradient method, ADMM, and Douglas-Rachford splitting, are fundamental tools for solving structured optimization problems by decomposing them into simpler, more manageable subproblems. Because of their simplicity and modularity, a lot of research has been devoted to understanding and possibly improving their convergence behavior, especially in nonconvex settings.
This talk offers a walkthrough on the use of "proximal envelopes" as a unifying framework for analyzing splitting methods. Much like the Moreau envelope gives a smooth interpretation of the proximal point method for convex problems, these envelope functions allow us to view various splitting algorithms, even in the absence of convexity, through the lens of a "nonsmooth" gradient descent applied to a more regular surrogate. This perspective not only aids theoretical analysis and convergence guarantees, but also paves the way for "acceleration" techniques that preserve the structure and simplicity of the original methods.
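For concreteness, the standard Moreau-envelope identities behind this viewpoint, stated here for a proper, lower semicontinuous, convex f and parameter γ > 0, read:

```latex
f^{\gamma}(x) := \min_{z}\Bigl\{ f(z) + \tfrac{1}{2\gamma}\lVert z - x\rVert^{2} \Bigr\},
\qquad
\operatorname{prox}_{\gamma f}(x) := \operatorname*{arg\,min}_{z}\Bigl\{ f(z) + \tfrac{1}{2\gamma}\lVert z - x\rVert^{2} \Bigr\},
\qquad
\nabla f^{\gamma}(x) = \tfrac{1}{\gamma}\bigl(x - \operatorname{prox}_{\gamma f}(x)\bigr),
```

so the proximal point update x^{k+1} = prox_{γf}(x^k) is exactly a gradient step x^k − γ∇f^γ(x^k) on the smooth surrogate f^γ; the talk extends this idea to envelopes tailored to other splitting schemes.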
14:50–15:50 Benjamin Poignard
Title: Sparse Factor Models of High Dimension
Abstract: We consider the estimation of a sparse factor model, where the factor loading matrix is assumed to be sparse. The estimation problem is reformulated as a penalized M-estimation criterion, while the restrictions for identifying the factor loading matrix accommodate a wide range of sparsity patterns. We prove the sparsistency property of the penalized estimator when the number of parameters is diverging, that is, the consistency of the estimator and the recovery of the true zero entries. These theoretical results are illustrated by finite-sample simulation experiments, and the relevance of the proposed method is assessed on real data.
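Schematically, and with notation chosen here only for illustration (not necessarily that of the paper), this type of criterion can be written as:

```latex
X_t = \Lambda F_t + \varepsilon_t,
\qquad
\widehat{\theta} \in \operatorname*{arg\,min}_{\theta = (\Lambda,\Psi)}\;
\mathcal{L}_n(\Lambda,\Psi) + \lambda_n \sum_{i,j} p\bigl(\lvert \Lambda_{ij}\rvert\bigr),
```

where Λ is the sparse factor loading matrix, L_n is an M-estimation loss (for instance a Gaussian quasi-likelihood based on the implied covariance ΛΛ⊤ + Ψ), and p(·) is a sparsity-inducing penalty with tuning parameter λ_n.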
15:50–16:10 Coffee Break
Part II: Functional Analytic Learning Team Presentation
16:10–17:10 Minh Ha Quang
Title: An Optimal Transport and Information Geometric Framework for Positive Operators, Infinite-Dimensional Gaussian Measures, and Gaussian Processes
Abstract: The Wasserstein and Fisher-Rao distances are two central quantities arising from the fields of Optimal Transport and Information Geometry, respectively, with many applications in machine learning and statistics. On the set of zero-mean Gaussian densities on Euclidean space, both admit closed-form formulas. In this talk, we present their generalization to the infinite-dimensional setting of Gaussian measures on Hilbert space and Gaussian processes. In general, the exact Fisher-Rao metric does not generalize to the set of all Gaussian measures on an infinite-dimensional Hilbert space. Instead, we show that on the set of all Gaussian measures that are equivalent to a fixed one, all finite-dimensional formulas admit a direct generalization. By employing regularization, we then obtain a formulation that is valid for all Gaussian measures on Hilbert space. The Wasserstein distance, on the other hand, is already well defined for all Gaussian measures on Hilbert space. Nevertheless, we show that by employing entropic regularization, many favorable theoretical properties, including convergence and differentiability, can be obtained. In the setting of Gaussian processes, using reproducing kernel Hilbert space (RKHS) methodology, we obtain consistent finite-dimensional approximations of the infinite-dimensional quantities that can be practically employed.
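For reference, the finite-dimensional closed-form formulas alluded to above for zero-mean Gaussians are (the Fisher-Rao expression given up to the usual metric normalization constant):

```latex
W_2^2\bigl(\mathcal{N}(0,\Sigma_1),\mathcal{N}(0,\Sigma_2)\bigr)
  = \operatorname{tr}\Sigma_1 + \operatorname{tr}\Sigma_2
    - 2\,\operatorname{tr}\!\bigl(\Sigma_1^{1/2}\Sigma_2\,\Sigma_1^{1/2}\bigr)^{1/2},
\qquad
d_{\mathrm{FR}}\bigl(\mathcal{N}(0,\Sigma_1),\mathcal{N}(0,\Sigma_2)\bigr)
  = \tfrac{1}{\sqrt{2}}\,
    \bigl\lVert \log\bigl(\Sigma_1^{-1/2}\Sigma_2\,\Sigma_1^{-1/2}\bigr)\bigr\rVert_F ;
```

the talk concerns how, and in what sense, these expressions extend to Gaussian measures on an infinite-dimensional Hilbert space.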
17:10–17:40 Le Thanh Tam
Title: Optimal Transport on Tree Systems and Applications
Abstract: Optimal transport (OT) provides a powerful toolkit for comparing measures. However, OT has a high computational complexity, i.e., supercubic in the number of input support points. Several variants of the Sliced Wasserstein (SW) distance have been developed in the literature to overcome this challenge. These approaches exploit the closed-form expression of univariate OT by projecting input measures onto one-dimensional lines. However, projecting measures onto low-dimensional spaces can lead to a loss of topological information. To mitigate this issue, we propose to replace one-dimensional lines with a more advanced structure, called tree systems. This structure is metrizable by a tree metric, which yields a closed-form expression for OT on tree systems. We develop an extensive theoretical analysis to formally define tree systems, introduce the concept of splitting maps, propose novel variants of the Radon transform for tree systems, and verify their injectivity. Empirically, we illustrate that the proposed approaches perform favorably compared to SW and its variants on applications with dynamic-support measures, such as generative models and diffusion models.
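For orientation, the closed-form expression that makes OT with a tree-metric ground cost attractive is the following standard formula (notation here is generic, not taken verbatim from the talk): for a tree T with edge weights w_e,

```latex
W_{d_T}(\mu,\nu) \;=\; \sum_{e \in T} w_e \,
\bigl\lvert \mu\bigl(\Gamma(v_e)\bigr) - \nu\bigl(\Gamma(v_e)\bigr) \bigr\rvert ,
```

where Γ(v_e) denotes the set of nodes in the subtree rooted at the deeper endpoint v_e of edge e; tree systems generalize the one-dimensional lines of SW while retaining this kind of closed form.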
Date
May 21, 2025, 10:00–17:40
Organizer / Contact
RIKEN AIP Public
Venue
Item | Content
---|---
Location | (not specified)
Address | Nihonbashi 1-chome Mitsui Building, 15th floor, 1-4-1 Nihonbashi, Chuo-ku, Tokyo 103-0027, Japan