Additional Colloquium with our Mercator Fellows Sean Meyn and Youssef M. Marzouk
Sean Meyn (University of Florida) and Youssef M. Marzouk (MIT)
Campus Golm, Building 28, Room 0.108
10:30 - 11:30
Our Mercator Fellows Sean Meyn and Youssef M. Marzouk will visit us during the week of June 27th to July 1st. We will use this opportunity to hold an additional SFB Colloquium with two short talks. Please note that we will start at 10:30 am.
A Not So Random Walk around Extremum Seeking Control - Sean Meyn
How can you optimize a function based only on observations, without any way of computing a gradient? Kiefer and Wolfowitz proposed a solution in the early 1950s based on stochastic approximation; already in the 1920s, however, a railway engineer had proposed an entirely deterministic approach that is now known as Extremum Seeking Control (ESC). Once you understand the ESC architecture, you will find that the ideas are very similar. A fundamental difference is that random noise is replaced with sinusoids for exploration.
The punchline is that this observation leads to new architectures for ESC that are far more easily interpreted, are globally stable, and have astonishingly fast rates of convergence. These findings are based on an emerging theory of Quasi-Stochastic Approximation.
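To make the idea concrete, here is a minimal extremum seeking sketch (the objective function, gains, and frequency are illustrative choices, not taken from the talk): an unknown function is maximized using only measured values, with a sinusoidal probe taking the place of random exploration noise.

```python
import numpy as np

def f(u):
    return -(u - 2.0) ** 2   # hypothetical objective; maximizer at u = 2

dt = 1e-3      # Euler integration step
omega = 50.0   # probing frequency (rad/s)
a = 0.2        # probing amplitude
k = 1.0        # adaptation gain

theta = -1.0   # initial estimate of the maximizer
t = 0.0
for _ in range(int(50.0 / dt)):
    u = theta + a * np.sin(omega * t)        # inject the sinusoidal probe
    y = f(u)                                 # only function values are observed
    theta += dt * k * y * np.sin(omega * t)  # demodulate and integrate
    t += dt

print(theta)  # settles near the true maximizer u = 2
```

The demodulated product y·sin(ωt) averages to a scaled gradient of f, so theta performs gradient ascent without a derivative ever being computed.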
Conditional sampling and joint dimension reduction, with applications to data assimilation - Youssef M. Marzouk
In many Bayesian inference problems, evaluations of the likelihood function or prior density are unavailable or intractable; instead one can only simulate (i.e., draw samples from) the associated distributions. I will discuss how transportation of measure can help perform inference in this setting, by first "learning" a joint parameter-data prior and then constructing transport maps that push these prior samples to the desired conditional distribution. These methods have broad utility for inference in stochastic models, including nonlinear filtering and smoothing.
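The "learn a joint prior, then push samples to the conditional" idea can be illustrated in the simplest possible setting, the linear-Gaussian case, where the transport map reduces to a regression (Kalman-type) update. The model and all numbers below are made up for illustration; the talk concerns far more general nonlinear transport maps.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Simulate from the joint parameter-data prior: theta ~ N(0,1),
# y = 2*theta + noise. Only samples are used; no densities are evaluated.
theta = rng.normal(0.0, 1.0, n)
y = 2.0 * theta + rng.normal(0.0, 0.5, n)

# "Learn" the joint distribution through sample moments only.
K = np.cov(theta, y)[0, 1] / np.var(y)   # regression (Kalman) gain

# Linear transport: move each joint sample onto the slice y = y_obs.
y_obs = 1.0
theta_cond = theta + K * (y_obs - y)

# Analytic posterior for comparison: mean 2*y_obs/4.25, variance 0.25/4.25
print(theta_cond.mean(), theta_cond.var())
```

In this Gaussian setting the pushed-forward samples match the exact conditional; the nonlinear maps discussed in the talk extend the same sample-to-sample transport beyond the Gaussian case.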
Scalability of these methods to high dimensions can be aided by dimension reduction: specifically, by joint dimension reduction for parameters and data. To this end, I will describe an information-theoretic perspective on dimension reduction that seeks both an "informed" subspace of the parameters and an "informative" subspace of the data. Inference can then proceed in the associated lower-dimensional coordinates, with control over the associated posterior approximation error.
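As a rough linear stand-in for such joint subspaces, canonical correlation analysis finds a parameter direction and a data direction that are maximally correlated; the sketch below (dimensions and model invented for illustration, and CCA is only a proxy for the information-theoretic criterion in the talk) recovers the one parameter coordinate that actually informs the data.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d_theta, d_y = 20_000, 5, 8

# Only the first parameter coordinate influences the data.
theta = rng.normal(size=(n, d_theta))
A = np.zeros((d_y, d_theta))
A[0, 0] = 3.0
y = theta @ A.T + rng.normal(size=(n, d_y))

def whiten(X):
    """Center X and transform it to have identity sample covariance."""
    X = X - X.mean(0)
    L = np.linalg.cholesky(X.T @ X / len(X))
    return np.linalg.solve(L, X.T).T, L

Zt, Lt = whiten(theta)
Zy, Ly = whiten(y)

# Singular values of the whitened cross-covariance = canonical correlations.
U, s, Vt = np.linalg.svd(Zt.T @ Zy / n)

# Leading "informed" parameter direction, mapped back to original coordinates.
w = np.linalg.solve(Lt.T, U[:, 0])
w /= np.linalg.norm(w)

print("canonical correlations:", np.round(s, 3))
print("leading parameter direction:", np.round(w, 2))
```

The leading canonical correlation is large while the rest are near zero, and the leading direction aligns (up to sign) with the first coordinate axis, i.e., the single informed parameter.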
This is joint work with Ricardo Baptista, Max Ramgraber, and Olivier Zahm.