Abstracts of the lectures
Each lecture lasts 4h30 and is divided into 3 to 4 parts, given over the 4 days of the school.
Agnès Desolneux (Centre Borelli, ENS Paris-Saclay)
OT for Gaussian Mixture Models: theory, applications and extensions.
In these lectures (based on joint work with Julie Delon), I will start by introducing the distance \(MW_2\) between Gaussian Mixture Models (GMMs), obtained by constraining the set of couplings in the usual \(W_2\) distance to be themselves GMMs. In a second part, I will show some applications of this distance, mainly to image color transfer, through practical sessions with Python notebooks. In a third part, I will present extensions of \(MW_2\) to the case of GMMs in different Euclidean spaces, using the Gromov-Wasserstein distance.
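For concreteness, here is a minimal Python sketch (not part of the course materials) of one way to compute \(MW_2\), assuming the reduction established by Delon and Desolneux: \(MW_2^2\) coincides with a discrete optimal transport problem between the mixture weights, with the squared \(W_2\) distance between Gaussian components as ground cost. Names and conventions below are illustrative only.

# Minimal sketch: MW_2^2 between two GMMs as a discrete OT problem whose
# ground cost is the squared W_2 distance between Gaussian components.
import numpy as np
from scipy.linalg import sqrtm
from scipy.optimize import linprog

def gaussian_w2_sq(m0, S0, m1, S1):
    """Squared W_2 distance between N(m0, S0) and N(m1, S1)."""
    S0_half = sqrtm(S0)
    cross = sqrtm(S0_half @ S1 @ S0_half)
    return np.sum((m0 - m1) ** 2) + np.trace(S0 + S1 - 2 * np.real(cross))

def mw2_sq(w0, means0, covs0, w1, means1, covs1):
    """Squared MW_2 distance between two GMMs given as (weights, means, covariances)."""
    K0, K1 = len(w0), len(w1)
    C = np.array([[gaussian_w2_sq(means0[i], covs0[i], means1[j], covs1[j])
                   for j in range(K1)] for i in range(K0)])
    # Discrete OT between the weight vectors with cost C, solved as a linear program.
    A_eq, b_eq = [], []
    for i in range(K0):                    # row marginal constraints
        row = np.zeros((K0, K1)); row[i, :] = 1
        A_eq.append(row.ravel()); b_eq.append(w0[i])
    for j in range(K1):                    # column marginal constraints
        col = np.zeros((K0, K1)); col[:, j] = 1
        A_eq.append(col.ravel()); b_eq.append(w1[j])
    res = linprog(C.ravel(), A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=(0, None), method="highs")
    return res.fun

# Toy example: two 2-component GMMs in dimension 1.
w0 = np.array([0.5, 0.5]); means0 = [np.array([0.0]), np.array([3.0])]
covs0 = [np.eye(1), 0.5 * np.eye(1)]
w1 = np.array([0.3, 0.7]); means1 = [np.array([1.0]), np.array([4.0])]
covs1 = [np.eye(1), np.eye(1)]
print(mw2_sq(w0, means0, covs0, w1, means1, covs1))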
webpage of Agnès Desolneux
Andrea Natale (INRIA Saclay)
Metric extrapolation, signed Wasserstein barycenters and applications.
Optimal Transport (OT) provides a powerful framework to compare, in a geometrically meaningful way, data that lack a natural vector space structure. Within this framework, Wasserstein geodesics and Wasserstein barycenters yield natural notions of interpolation and averaging that preserve the structure of the data better than linear methods based on Euclidean embeddings. Yet many applications -- such as the statistical analysis of measure-valued data, the development of numerical methods for density evolution models, or reduced-order models -- often require not only interpolation but also extrapolation of data, which limits the applicability of these concepts.
A natural way to extrapolate data consistently with the OT geometry is to allow for signed weights in the variational definition of Wasserstein barycenters. This modification, however, is far from trivial, as it fundamentally changes the structure of the problem. The key to analyzing and (in some cases) solving these signed barycenter problems lies in their strong connection with Weak Optimal Transport. The main objective of this mini-course is to elucidate this connection, providing a rigorous framework for the study of signed Wasserstein barycenters and practical numerical methods for their computation. We will also illustrate various applications of the developed tools, including the approximation of classical and linearized Wasserstein barycenters, OT clustering and quantization, and the discretization of Wasserstein gradient flows.
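To fix notation (one common convention; the course may use a different normalization), the Wasserstein barycenter of measures \(\mu_1,\dots,\mu_N\) with weights \(\lambda_1,\dots,\lambda_N\) is a minimizer of
\[
\nu \;\longmapsto\; \sum_{i=1}^{N} \lambda_i\, W_2^2(\mu_i,\nu), \qquad \sum_{i=1}^{N}\lambda_i = 1,
\]
with \(\lambda_i \ge 0\) in the classical setting. The signed problem discussed above allows some \(\lambda_i < 0\); for instance, extrapolating the geodesic between \(\mu_0\) and \(\mu_1\) beyond time \(1\) formally corresponds to the weights \((1-t,\,t)\) with \(t>1\).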
webpage of Andrea Natale
Guillaume Carlier (Dauphine - PSL Paris)
Wasserstein barycenters and applications.
The Wasserstein barycenter problem is a way to interpolate between a family of probability measures that has become quite popular in recent years. In this course, I will first present the basic theory for this problem (dual and multi-marginal formulations, connections to Monge-Ampère equations, integral estimates, etc.), then discuss some numerical and stability aspects as well as some variants (entropic barycenters, Wasserstein medians, ...), and finally present some open questions.
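As a pointer to the multi-marginal formulation mentioned above (in the normalization of Agueh and Carlier; the lectures may use different conventions), the barycenter problem with weights \(\lambda_i\) is equivalent to
\[
\min_{\gamma \in \Pi(\mu_1,\dots,\mu_N)} \int \sum_{i=1}^{N} \lambda_i \Big| x_i - \sum_{j=1}^{N} \lambda_j x_j \Big|^2 \, d\gamma(x_1,\dots,x_N),
\]
and the barycenter is obtained as the image of an optimal \(\gamma\) under the weighted-mean map \((x_1,\dots,x_N)\mapsto \sum_j \lambda_j x_j\).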
webpage of Guillaume Carlier
Julio Backhoff
'Most Gaussian' martingales and nonlinear optimal transport.
A martingale is a completely unbiased stochastic process, in the sense that the best estimate of where the process will go in the future is its present position. In Martingale Optimal Transport (MOT) we are interested in determining optimal martingales, with respect to a given cost functional, between prescribed initial and terminal marginals. This problem has its origins in the question of robust pricing and hedging in mathematical finance, in which case the aforementioned cost functional is linear. In more recent times, however, the question of model calibration has gained attention, and this has naturally led to nonlinear cost functionals. The goal of the mini-course is to discuss in detail two modern nonlinear MOT problems, which stand out for their striking similarity to more classical, non-martingale counterparts. In both cases we are motivated by the question:
What is the most Gaussian martingale between an initial marginal \(\mu\) and a terminal marginal \(\nu\)?
In the first interpretation of this question, we understand 'most Gaussian' as 'close to Gaussian in Wasserstein distance'. This leads us to the notion of Bass martingales. If \(\mu\) is the Dirac measure concentrated on the mean of \(\nu\), then the associated Bass martingale is just \((\mathbb{E}[f(Z)],f(Z))\), with \(Z\) standard Gaussian and \(f\) the gradient of a suitably chosen convex function. More generally, we allow for an extra independent random vector \(A\), and the associated Bass martingale is \((f_0(A),f(A+Z))\), with \(f_0\) the convolution between \(f\) and the standard Gaussian measure. Arguably, these objects are the martingale analogues of Brenier maps in optimal transport. In this part of the mini-course we discuss necessary and sufficient conditions for their existence, their optimality, as well as the relation to the dual problem and the concept of irreducibility in MOT. We also discuss computational methods, such as an iterative scheme and a gradient flow formulation.
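A minimal one-dimensional Python sketch (an illustration under my own choices, not course code) of the simplest case described above, where \(\mu\) is the Dirac mass at the mean of \(\nu\): in dimension 1 the gradient of a convex function is a non-decreasing map, and one admissible choice is \(f = F_\nu^{-1}\circ\Phi\), which pushes the standard Gaussian law forward to \(\nu\). The choice of \(\nu\) below is arbitrary.

# Bass martingale in the Dirac-initial-marginal case, in dimension 1.
import numpy as np
from scipy.stats import norm, expon

rng = np.random.default_rng(0)

nu = expon(scale=2.0)            # terminal marginal nu (arbitrary choice for illustration)
Z = rng.standard_normal(10**5)   # standard Gaussian samples

# f(Z) = F_nu^{-1}(Phi(Z)) has law nu, and f is non-decreasing (a 1D Brenier map).
fZ = nu.ppf(norm.cdf(Z))

# The pair (E[f(Z)], f(Z)) is the one-step martingale of the abstract,
# started from the Dirac mass at E[f(Z)], which equals the mean of nu.
M0 = fZ.mean()
print("E[f(Z)] ~", M0, "  mean of nu =", nu.mean())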
In the second interpretation of the above question, we understand 'most Gaussian' as 'close to Gaussian in relative entropy'. This leads now to the notion of Schrödinger martingales. Starting from an auxiliary measure \(\bar{\mu}\), let \((\bar X,Y)\) be the solution to the Entropic Optimal Transport problem (also known as Schrödinger problem) from \(\bar \mu\) to \(\nu\) with quadratic cost function, and define \(X:=\mathbb{E}[Y|\bar{X}]\), the so-called barycentric projection. If \(X\) happens to have law \(\mu\), then \((X,Y)\) is the Schrödinger martingale from \(\mu\) to \(\nu\). In this part of the mini-course we discuss the existence of these martingales, their optimality, the relation to a dual problem and irreducibility, and a natural iterative scheme.
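A minimal discrete Python sketch (again my own illustration, not course code) of the construction just described: solve entropic OT from \(\bar\mu\) to \(\nu\) with quadratic cost via Sinkhorn iterations, then form the barycentric projection \(X=\mathbb{E}[Y|\bar X]\). Finding \(\bar\mu\) such that \(X\) actually has law \(\mu\) is the hard part of the Schrödinger-martingale problem and is not addressed here; the regularization parameter and marginals below are arbitrary.

# Entropic OT (Sinkhorn) between discrete measures, plus barycentric projection.
import numpy as np

def sinkhorn_barycentric_projection(x, a, y, b, eps=0.1, n_iter=2000):
    """Entropic OT between sum_i a_i delta_{x_i} and sum_j b_j delta_{y_j} in 1D."""
    C = 0.5 * (x[:, None] - y[None, :]) ** 2      # quadratic cost
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):                        # Sinkhorn fixed-point iterations
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]                # optimal entropic plan
    X = (P @ y) / a                                # barycentric projection E[Y | X_bar = x_i]
    return P, X

# Toy example: mu_bar uniform on a grid, nu a two-point law.
x = np.linspace(-1.0, 1.0, 50); a = np.full(50, 1 / 50)
y = np.array([-2.0, 2.0]);      b = np.array([0.5, 0.5])
P, X = sinkhorn_barycentric_projection(x, a, y, b)
print(X[:5])   # conditional means of Y given each source point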
The aforementioned nonlinear MOT problems are in fact particular cases of Weak Optimal Transport (WOT). Hence we start the mini-course with a gentle introduction to WOT (a far-reaching nonlinear generalization of classical optimal transport), serving as a backbone for the rest of the course. Since no familiarity with MOT or martingales is assumed, we also present these subjects in a self-contained way from the very beginning.
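For orientation, one common way to write the WOT problem (the course may use different conventions) is
\[
\inf_{\pi \in \Pi(\mu,\nu)} \int C\big(x, \pi_x\big)\, d\mu(x),
\]
where \((\pi_x)_x\) is the disintegration of the coupling \(\pi\) with respect to its first marginal and \(C(x,p)\) may depend nonlinearly on the conditional law \(p\). Classical OT corresponds to \(C(x,p)=\int c(x,y)\,dp(y)\), while martingale constraints can be encoded by setting \(C(x,p)=+\infty\) whenever the mean of \(p\) differs from \(x\).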
webpage of Julio Backhoff