home page of Onno van Gaans

Topics in Analysis -- Gradient flows in metric spaces, Spring 2011

assignments

The third assignment is available. Due: June 10, 2011.

The second assignment is available. Due: April 20, 2011.

The first assignment is available. Due: March 9, 2011.

The schedule of the assignments has been changed. The third assignment will be due on May 4, and the fourth on May 18. There will be no fifth assignment.

course notes

NEW: Updated course notes are available (version of 11 May 2011), including gradient flows, Wasserstein spaces and geodesics. Some material on generalized geodesics and the Fokker-Planck equation is still to be added.

Lecture 1 (Feb 2): Introduction and measures on metric spaces. Read Chapter 1 and 2.1 and 2.2.

Lecture 2 (Feb 9): Convergence of measures, bounded Lipschitz metric. Read 2.3, 2.4.

Lecture 3 (Feb 16): Measures as functionals, Prokhorov's theorem, image measures, marginals, introduction to disintegration. Read 2.5, 2.6.

Next lecture: regular conditional probabilities, proof of disintegration theorem.

topic

A gradient flow is a function y satisfying y'(t)=F(y(t)), where F=-grad H for some (convex) potential function H. Thus y describes a flow in the direction of steepest descent of H. During the last decade a theory of gradient flows in metric spaces has been developed, with many connections to other active fields of research.
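In the simplest Euclidean setting this equation can be approximated numerically. The following sketch (the function name and step sizes are my own choices, not from the course) applies the explicit Euler scheme to y'(t) = -grad H(y(t)):

```python
import numpy as np

def gradient_flow(grad_H, y0, dt=0.01, steps=1000):
    """Approximate the gradient flow y'(t) = -grad H(y(t)) by explicit Euler steps."""
    y = np.asarray(y0, dtype=float)
    for _ in range(steps):
        y = y - dt * grad_H(y)  # move in the direction of steepest descent of H
    return y

# Example: H(y) = |y|^2 / 2, so grad H(y) = y; the flow decays toward the minimizer 0.
y_end = gradient_flow(lambda y: y, y0=[1.0, -2.0])
```

For this convex H the flow converges to the unique minimizer of H, which illustrates the general picture: a gradient flow drives the state toward states of lower potential.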

There are more general settings in which gradient flows can be formulated. Under suitable conditions, a gradient flow differential equation can be formulated in a Hilbert space. It turns out that many partial differential equations can be reformulated as gradient flows in Hilbert spaces, for instance the heat equation. There are also generalizations to Banach spaces and to manifolds. During the last decade a generalization to metric spaces has been developed. Since there is no way to formulate a difference quotient in a general metric space, a gradient flow differential equation cannot be immediately formulated in a metric space. In a Hilbert space, however, there is an equivalent formulation as an Evolutionary Variational Inequality (EVI), which only uses the metric of the Hilbert space. This formulation carries over to an arbitrary metric space, and it turns out that existence and uniqueness of solutions, and some properties similar to the Hilbert space case, can be proved under suitable `convexity' assumptions on the potential function H (Ambrosio, Gigli and Savaré (2005)).

Such gradient flow equations are particularly interesting in spaces of probability measures.

The set of all Borel probability measures on a separable complete metric space can be endowed with a metric which makes it also separable and complete. Convergence with respect to this metric is similar to the concept of `convergence in distribution' from probability theory. If the measures have `finite second moments', one such metric is the Wasserstein metric.

The Wasserstein metric is related to the theory of optimal transportation. The idea of optimal transportation problems is that some distribution of mass in a metric space X has to be moved to another distribution of mass on that metric space. The `cost' of the movement depends on the amount of mass and the distance over which it is moved. There are cheaper and more expensive ways to transport the mass from the initial to the desired end state. Monge posed the problem of finding the optimal way of transporting and the optimal cost. It was Kantorovich who reformulated the problem in measure theoretic terms, allowing for more general `transports' than Monge did. Kantorovich sees a `transport plan' as a measure on X x X such that its first marginal equals the initial mass distribution and its second marginal the desired end distribution. The cost of the transportation is given by the integral of the squared distance d(x,y)^2 with respect to the transport plan. It can be shown that there always exists an optimal transport plan, provided X is separable and complete. The square root of the optimal cost is defined to be the Wasserstein distance between the initial measure and the end measure.
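For finitely supported measures, Kantorovich's problem is a finite-dimensional linear program: minimize the total cost of a transport plan subject to the two marginal constraints. The sketch below (the helper name is hypothetical; it solves the LP with SciPy for measures on the real line) illustrates the definition:

```python
import numpy as np
from scipy.optimize import linprog

def wasserstein2_discrete(x, a, y, b):
    """W2 distance between the discrete measures sum_i a_i delta_{x_i} and
    sum_j b_j delta_{y_j} on the real line, via Kantorovich's linear program."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    m, n = len(x), len(y)
    C = (x[:, None] - y[None, :]) ** 2      # squared-distance cost matrix
    # The transport plan p is an m*n vector; its marginals must equal a and b.
    A_eq = np.zeros((m + n, m * n))
    for i in range(m):
        A_eq[i, i * n:(i + 1) * n] = 1.0    # row sums of the plan = a
    for j in range(n):
        A_eq[m + j, j::n] = 1.0             # column sums of the plan = b
    b_eq = np.concatenate([a, b])
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return np.sqrt(res.fun)                 # square root of the optimal cost

# Two unit point masses at 0 and 1: all mass moves distance 1, so W2 = 1.
d = wasserstein2_discrete([0.0], [1.0], [1.0], [1.0])
```

In the discrete case an optimal plan always exists since the feasible set is a compact polytope; the existence result quoted above extends this to general separable complete metric spaces via Prokhorov's theorem.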

In real life, systems suffer a continuous load of influences from their surroundings, known as `noise'. A standard way of modelling noise is by means of a stochastic process (often Brownian motion, which is a continuous limit of a random walk). If a system is described by a differential equation (or a system of such equations or a partial differential equation), then it is natural to model the effect of the noise on the system by adding a stochastic process to the equation. We then obtain a stochastic differential equation. The solution of a stochastic differential equation is a stochastic process. Hence at each time t the solution is a random variable, whose probability distribution describes with what probability the system will be in which state. These distributions together constitute a curve in the space of probability measures on the state space of the system.
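A standard way to simulate such an equation is the Euler-Maruyama scheme: an Euler step for the deterministic part plus a scaled Gaussian increment for the Brownian noise. The following sketch (function name and parameters are my own, not from the course) simulates dY = b(Y) dt + sigma dW:

```python
import numpy as np

def euler_maruyama(drift, sigma, y0, T=1.0, steps=1000, rng=None):
    """Simulate dY = drift(Y) dt + sigma dW with the Euler-Maruyama scheme."""
    if rng is None:
        rng = np.random.default_rng(0)
    dt = T / steps
    y = float(y0)
    for _ in range(steps):
        # deterministic Euler step plus a Brownian increment of variance dt
        y += drift(y) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return y

# Ornstein-Uhlenbeck-type example: the drift -y pulls each path toward 0,
# while the noise keeps perturbing it; each run yields one sample of Y_T.
samples = [euler_maruyama(lambda y: -y, sigma=0.5, y0=1.0,
                          rng=np.random.default_rng(k)) for k in range(200)]
```

Repeating the simulation, as in the last line, samples the distribution of the solution at time T; letting T vary traces out exactly the curve of distributions described above.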

Jordan, Kinderlehrer and Otto (1998) showed that the curves of distributions of solutions of certain stochastic differential equations are gradient flows in the space of probability measures endowed with the Wasserstein metric. Thus the theories of gradient flows (and partial differential equations), optimal transportation problems, and stochastic differential equations are linked.

The course will first study probability measures on metric spaces and some more advanced topics from measure theory such as Prokhorov's theorem and the disintegration theorem. Then an introduction to optimal transportation theory will be given, including existence of optimal transportation plans, which follows from Prokhorov's theorem. Further we will consider gradient flow differential equations in Hilbert spaces and the formulation of such equations in metric spaces. This will include some aspects of convex analysis. Finally a brief introduction to stochastic differential equations will be given and the relation with the gradient flows in Wasserstein spaces and the Fokker-Planck equation will be explained.

general

Lectures on Wednesdays 11:15-13:00h, Snellius Room 401.

Here is the program of the course with further details.