Foundations and Frontiers: Interdisciplinary Perspectives on Mathematical Optimization — Joint Event by the Continuous Optimization Team and Functional Analytic Learning Team
Event Description
Part I: Continuous Optimization Team Presentation
10:00–10:30 Akiko Takeda
Title: Introduction to the Continuous Optimization Team
Abstract: Our team, which was established in September 2016, is about to enter its 10th year this September. We will review some of our past research results and discuss future prospects.
10:30–11:00 Pierre-Louis Poirion
Title: Random Subspace Newton and Quasi-Newton Algorithms
Abstract: We present a randomized subspace regularized Newton method for non-convex optimization. We will study global and local convergence properties of the method and prove that it works particularly well in a low-rank setting. We will also present a randomized quasi-Newton method.
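To give a rough feel for the idea (this is an illustrative sketch, not the speakers' actual algorithm), a randomized subspace Newton step restricts the regularized Newton system to a random low-dimensional subspace, so each iteration only solves a small linear system. All names and parameters below are assumptions for the toy example:

```python
import numpy as np

def random_subspace_newton(grad, hess, x0, s=5, lam=1e-3, lr=1.0, iters=500, seed=0):
    """Toy randomized subspace regularized Newton iteration (illustrative).

    Each step draws a random s-dimensional subspace P and solves the
    regularized Newton system restricted to that subspace.
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    n = x.size
    for _ in range(iters):
        g = grad(x)
        P = rng.standard_normal((n, s)) / np.sqrt(s)    # random subspace basis
        Hs = P.T @ hess(x) @ P                          # projected Hessian, s x s
        d = P @ np.linalg.solve(Hs + lam * np.eye(s), P.T @ g)
        x -= lr * d
    return x

# Hypothetical example: minimize the convex quadratic 0.5 x^T A x - b^T x
n = 20
A = np.eye(n) + 0.1 * np.ones((n, n))
b = np.ones(n)
x_star = random_subspace_newton(lambda x: A @ x - b, lambda x: A, np.zeros(n))
```

Only an s-by-s system is factorized per step, which is the computational point of subspace methods; convergence guarantees for the actual method are the subject of the talk.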
11:00–11:30 Jan Harold Alcantara
Title: Recent Developments in Splitting Algorithms for Nonconvex Optimization and Nonmonotone Inclusion
Abstract: Modern optimization methods increasingly leverage splitting algorithms to exploit problem structure and enable more efficient computation. In this talk, we review recent developments in the analysis of such methods within nonsmooth and nonconvex optimization, with a focus on structured nonconvex problems. We present global subsequential convergence guarantees under specific assumptions. We then extend this perspective to a broader class of problems—namely, multi-operator nonmonotone inclusion problems. In particular, we show how the Douglas–Rachford algorithm can be generalized to this multi-operator setting, and we establish conditions under which convergence can still be rigorously ensured.
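For background, the classical two-operator Douglas–Rachford iteration referenced above alternates proximal steps on the two functions with a correction step. A minimal NumPy sketch on the toy problem min ||x||_1 + 0.5 ||x - b||^2 (illustrative only; the names are assumptions, and this is not the generalized multi-operator method of the talk):

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding: the prox of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def douglas_rachford(prox_f, prox_g, z0, iters=200):
    """Classical two-operator Douglas-Rachford iteration."""
    z = z0.copy()
    for _ in range(iters):
        x = prox_f(z)            # proximal step on f
        y = prox_g(2 * x - z)    # proximal step on g at the reflected point
        z = z + y - x            # correction of the governing sequence
    return prox_f(z)

gamma = 1.0
b = np.array([3.0, -0.5, 1.2])
prox_f = lambda v: soft(v, gamma)                  # prox of gamma * ||.||_1
prox_g = lambda v: (v + gamma * b) / (1 + gamma)   # prox of gamma * 0.5 ||. - b||^2
x = douglas_rachford(prox_f, prox_g, np.zeros_like(b))
# the minimizer of ||x||_1 + 0.5 ||x - b||^2 is soft(b, 1)
```

Each function is accessed only through its own prox, which is what makes splitting attractive for structured problems; the talk concerns extending this scheme beyond two (and beyond monotone) operators.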
11:30–11:45 Coffee Break
11:45–12:30 Christophe Roux
Title: Implicit Riemannian Optimism with Applications to Min-Max Problems
Abstract: Many optimization problems such as eigenvalue problems, principal component analysis and low-rank matrix completion can be interpreted as optimization problems over Riemannian manifolds, which allows for exploiting the geometric structure of the problems. While Riemannian optimization has been studied extensively in the offline setting, the online setting is not well understood. A major challenge in prior works was handling in-manifold constraints that arise in the online setting. We leverage implicit methods to address this problem and improve over existing results, removing strong assumptions and matching the best known regret bounds in the Euclidean setting. Building on this, we develop algorithms for g-convex, g-concave smooth min-max problems on Hadamard manifolds. Notably, one method nearly matches the gradient oracle complexity of the lower bound for Euclidean problems, for the first time.
13:30–14:30 Andreas Themelis
Title: It's All in the Envelope! A Smoother Approach to Splitting Algorithms
Abstract: Splitting algorithms, such as the proximal gradient method, ADMM, and Douglas-Rachford splitting, are fundamental tools for solving structured optimization problems by decomposing them into simpler, more manageable subproblems. Because of their simplicity and modularity, a lot of research has been devoted to understanding and possibly improving their convergence behavior, especially in nonconvex settings.
This talk offers a walkthrough on the use of "proximal envelopes" as a unifying framework for analyzing splitting methods. Much like the Moreau envelope gives a smooth interpretation of the proximal point method for convex problems, these envelope functions allow us to view various splitting algorithms, even in the absence of convexity, through the lens of a "nonsmooth" gradient descent applied to a more regular surrogate. This perspective not only aids theoretical analysis and convergence guarantees, but also paves the way to "acceleration" techniques that preserve the structure and simplicity of the original methods.
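The Moreau-envelope viewpoint can be made concrete in a few lines: the envelope f_gamma(x) = min_z f(z) + (z - x)^2 / (2 gamma) is smooth with gradient (x - prox_{gamma f}(x)) / gamma, and gradient descent on it with step gamma reproduces the proximal point iteration exactly. A minimal sketch for f(x) = |x| (illustrative; function names are assumptions):

```python
import numpy as np

def prox_abs(v, gamma):
    """Prox of gamma * |.|, i.e. scalar soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - gamma, 0.0)

def envelope(v, gamma):
    """Moreau envelope f_gamma(v) = min_z |z| + (z - v)^2 / (2 gamma)."""
    p = prox_abs(v, gamma)
    return np.abs(p) + (p - v) ** 2 / (2 * gamma)

def envelope_grad(v, gamma):
    """Gradient of the (smooth) envelope: (v - prox(v)) / gamma."""
    return (v - prox_abs(v, gamma)) / gamma

# Gradient descent on the envelope with step gamma is exactly the proximal
# point method on f: x - gamma * envelope_grad(x) == prox_abs(x, gamma).
gamma, x = 0.5, 3.0
for _ in range(10):
    x = x - gamma * envelope_grad(x, gamma)
```

Here the nonsmooth function |x| is minimized by running plain gradient descent on its smooth surrogate; the talk develops this correspondence for full splitting schemes rather than the simple proximal point method.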
14:50–15:50 Benjamin Poignard
Title: Sparse Factor Models of High Dimension
Abstract: We consider the estimation of a factor model in which the factor loading matrix is assumed sparse. The estimation problem is reformulated as a penalized M-estimation criterion, while the restrictions for identifying the factor loading matrix accommodate a wide range of sparsity patterns. We prove the sparsistency property of the penalized estimator when the number of parameters diverges, that is, the consistency of the estimator and the recovery of the true zero entries. These theoretical results are illustrated by finite-sample simulation experiments, and the relevance of the proposed method is assessed on real data.
15:50–16:10 Coffee Break
Part II: Functional Analytic Learning Team Presentation
16:10–17:10 Minh Ha Quang
Title: An Optimal Transport and Information Geometric Framework for Positive Operators, Infinite-Dimensional Gaussian Measures, and Gaussian Processes
Abstract: The Wasserstein and Fisher-Rao distances are two central quantities arising from the fields of Optimal Transport and Information Geometry, respectively, with many applications in machine learning and statistics. On the set of zero-mean Gaussian densities on Euclidean space, both admit closed-form formulas. In this talk, we present their generalization to the infinite-dimensional setting of Gaussian measures on Hilbert space and Gaussian processes. In general, the exact Fisher-Rao metric does not generalize to the set of all Gaussian measures on an infinite-dimensional Hilbert space. Instead, we show that on the set of all Gaussian measures equivalent to a fixed one, all finite-dimensional formulas admit direct generalization. By employing regularization, we then obtain a formulation that is valid for all Gaussian measures on Hilbert space. The Wasserstein distance, on the other hand, is valid for all Gaussian measures on Hilbert space. Nevertheless, we show that by employing entropic regularization, many favorable theoretical properties, including convergence and differentiability, can be obtained. In the setting of Gaussian processes, using reproducing kernel Hilbert space (RKHS) methodology, we obtain consistent finite-dimensional approximations of the infinite-dimensional quantities that can be practically employed.
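For intuition about the finite-dimensional closed form mentioned above: between zero-mean Gaussians N(0, A) and N(0, B), the squared Wasserstein-2 (Bures) distance is tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}). A small NumPy sketch (illustrative only; the function names are assumptions):

```python
import numpy as np

def psd_sqrt(M):
    """Matrix square root of a symmetric PSD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.sqrt(np.maximum(w, 0.0))) @ V.T

def wasserstein2_gaussian(A, B):
    """Bures-Wasserstein distance between N(0, A) and N(0, B)."""
    Ah = psd_sqrt(A)
    cross = psd_sqrt(Ah @ B @ Ah)
    val = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return float(np.sqrt(max(val, 0.0)))

# For commuting (here diagonal) covariances this reduces to the Frobenius
# norm of A^{1/2} - B^{1/2}: here sqrt((1-3)^2 + (2-1)^2) = sqrt(5).
A = np.diag([1.0, 4.0])
B = np.diag([9.0, 1.0])
d = wasserstein2_gaussian(A, B)
```

The talk concerns what survives of this formula for Gaussian measures on infinite-dimensional Hilbert spaces and for Gaussian processes, where regularization is needed.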
17:10–17:40 Le Thanh Tam
Title: Optimal Transport on Tree Systems and Applications
Abstract: Optimal transport (OT) provides a powerful toolkit for comparing measures. However, OT has a high computational complexity, namely super-cubic in the number of input supports. Several variants of Sliced Wasserstein (SW) have been developed in the literature to overcome this challenge. These approaches exploit the closed-form expression of univariate OT by projecting input measures onto one-dimensional lines. However, projecting measures onto low-dimensional spaces can lead to a loss of topological information. To mitigate this issue, we propose to replace one-dimensional lines with a more advanced structure, called tree systems. This structure is metrizable by a tree metric, which yields a closed-form expression for OT on tree systems. We develop an extensive theoretical analysis to formally define tree systems, introduce the concept of splitting maps, propose novel variants of the Radon transform for tree systems, and verify their injectivity. Empirically, we illustrate that the proposed approaches perform favorably compared to SW and its variants on applications with dynamic-support measures, such as generative models and diffusion models.
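For background, OT under an ordinary tree metric already admits a closed form: the 1-Wasserstein distance is a weighted sum, over edges, of the absolute difference of the subtree masses of the two measures. A minimal NumPy sketch of that classical formula (not the tree-system construction of the talk; the node ordering and all names are assumptions):

```python
import numpy as np

def tree_wasserstein(parent, weight, mu, nu):
    """W1 under a tree metric: sum_e w_e * |mu(subtree_e) - nu(subtree_e)|.

    parent[i] is the parent of node i (root has parent -1); weight[i] is the
    length of the edge (i, parent[i]).  Assumes parent[i] < i for all i > 0,
    so iterating from the highest index visits children before parents.
    """
    diff = np.asarray(mu, float) - np.asarray(nu, float)
    cost = 0.0
    for i in range(len(parent) - 1, 0, -1):
        cost += weight[i] * abs(diff[i])   # mass crossing edge (i, parent[i])
        diff[parent[i]] += diff[i]         # fold subtree mass into the parent
    return cost

# Hypothetical 4-node path 0-1-2-3 with unit edge weights: the tree metric is
# then the line metric, and moving all mass from node 0 to node 3 costs 3.
parent = [-1, 0, 1, 2]
weight = [0.0, 1.0, 1.0, 1.0]
mu = [1.0, 0.0, 0.0, 0.0]
nu = [0.0, 0.0, 0.0, 1.0]
w = tree_wasserstein(parent, weight, mu, nu)
```

The single bottom-up pass makes the cost linear in the number of nodes, which is the efficiency that tree-based OT variants, including the tree systems of the talk, aim to retain.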
Date
May 21, 2025, 10:00–17:40
Organizer / Contact
RIKEN AIP Public
Venue

Item | Content
---|---
Place | Nihonbashi AIP Center Open Space
Address | Nihonbashi 1-chome Mitsui Building, 15th floor, 1-4-1 Nihonbashi, Chuo-ku, Tokyo 103-0027, Japan