A06 – Approximative Bayesian inference and model selection for stochastic differential equations (SDEs)

The project is concerned with semi-parametric and fully non-parametric approaches to Bayesian inference for stochastic processes, either in the form of dynamic point processes or in the form of differential equations. The goal of the project is both to advance the theoretical understanding of high-dimensional inference and to develop and analyze novel algorithmic approaches within the settings of Monte Carlo methods and variational inference. The algorithmic advances have also been inspired by a number of in-depth collaborations with several projects from Project Group B of the CRC.

During the first funding period, general properties of posterior distributions and their dependence on the prior, the effective dimension of the problem, and the sample size have been established. In particular, the accuracy of a Gaussian approximation to the posterior distribution has been studied. Continuous-time McKean-Vlasov formulations for combined state and parameter estimation have been derived, allowing posterior contraction rates to be expressed in terms of the selected prior distributions. On the algorithmic side, non-parametric Bayesian inference based on Gaussian processes has been extended to dynamic point processes. In a further line of research, affine-invariant interacting particle systems for sampling posterior distributions, inspired by the ensemble Kalman filter, have been proposed and analyzed. The proposed inference methods have been utilized in projects B03 and B04 to derive new statistical models for seismology and scene viewing, including their inference.
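The affine-invariant interacting particle samplers mentioned above can be illustrated with a minimal sketch. The following Python snippet implements one Euler-Maruyama step of ensemble-preconditioned interacting Langevin dynamics in the spirit of the ALDI sampler of Garbuno-Inigo, Nüsken, and Reich (2020); the function name, step size, and toy Gaussian target in the usage example are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

def aldi_step(X, grad_V, dt, rng):
    """One Euler-Maruyama step of affine-invariant interacting Langevin
    dynamics for an ensemble X of shape (J, d) targeting exp(-V)."""
    J, d = X.shape
    m = X.mean(axis=0)                       # ensemble mean
    A = (X - m) / np.sqrt(J)                 # anomalies; C = A.T @ A
    C = A.T @ A                              # empirical covariance (d, d)
    G = np.stack([grad_V(x) for x in X])     # potential gradients (J, d)
    drift = -G @ C + (d + 1) / J * (X - m)   # preconditioned drift + correction
    xi = rng.standard_normal((J, J))
    noise = np.sqrt(2.0 * dt) * xi @ A       # realizes sqrt(2 C) dW per particle
    return X + dt * drift + noise
```

Preconditioning the gradient and the noise with the empirical ensemble covariance is what renders the scheme invariant under affine transformations of the target, which is the key design choice behind this class of methods.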

The project team will continue exploring non-parametric Bayesian methods for nonlinear inverse problems during the second funding period. In a first line of research, nonlinear inverse problems will be approached via the so-called calming approach, which can be viewed as an appropriately extended and decoupled reformulation of the original problem; it allows for sharper estimates and new algorithmic approaches. In a second line of research, parametric and non-parametric drift estimation for stochastic differential equations will be studied further using random feature maps and gradient-log-density estimators. Interacting particle representations and their mean-field limits will provide key theoretical and algorithmic ingredients. The team will also extend its previous work on dynamic point process models to noisy and incomplete data sets, based on variational approximations to the posterior distribution, as well as its work on posterior contraction rates for combined state and parameter estimation for linear and semi-linear stochastic partial differential equations.
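As a rough illustration of the gradient-log-density ingredient, the sketch below propagates a deterministic interacting-particle approximation of a one-dimensional Fokker-Planck equation, replacing the score ∇ log ρ by a Gaussian-kernel estimate in the spirit of Maoutsa, Reich, and Opper (2020); the bandwidth, step size, and Ornstein-Uhlenbeck drift in the usage example are illustrative assumptions only.

```python
import numpy as np

def kde_score(X, h):
    """Estimate the score grad log rho at the particles themselves,
    using a Gaussian kernel density estimate with bandwidth h.
    X has shape (J,) (scalar particles)."""
    diff = X[:, None] - X[None, :]            # pairwise differences
    K = np.exp(-0.5 * (diff / h) ** 2)        # Gaussian kernel matrix
    # grad_x log(sum_j k(x - x_j)) evaluated at each particle x = X_i
    num = (K * (-diff / h**2)).sum(axis=1)
    return num / K.sum(axis=1)

def fokker_planck_particles(X0, drift, D, dt, n_steps, h=0.3):
    """Deterministic particle approximation of the Fokker-Planck flow
    dX/dt = f(X) - D * grad log rho(X), with the unknown score
    replaced by its kernel density estimate at every step."""
    X = X0.copy()
    for _ in range(n_steps):
        X = X + dt * (drift(X) - D * kde_score(X, h))
    return X
```

For a linear drift f(x) = -x and D = 1, the particles should equilibrate near the standard Gaussian stationary density, up to a bandwidth-dependent bias.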

The project will closely interact with projects A01, A02, A04 and A07 on the theory side as well as with projects B03, B04, B06, B07, B08, and B09 in terms of applications.

  • Gottwald, G., Li, F., Marzouk, Y., and Reich, S. (2024). Stable generative modelling using diffusion maps. arXiv:2401.04372

  • Boys, B., Girolami, M., Pidstrigach, J., Reich, S., Mosca, A., and Akyildiz, O.D. (2023). Tweedie moment projected diffusions for inverse problems. arXiv:2310.06721

  • Chen, Y., Huang, D.Z., Huang, J., Reich, S., and Stuart, A.M. (2023). Sampling via gradient flows in the space of probability measures. arXiv:2310.03597

  • Pidstrigach, J., Marzouk, Y., Reich, S., and Wang, S. (2023). Infinite-dimensional diffusion models. arXiv:2302.10130

  • Liu, S., Reich, S., and Tong, X.T. (2023). Dropout ensemble Kalman inversion for high dimensional inverse problems. arXiv:2308.16784

  • Chen, Y., Huang, D.Z., Huang, J., Reich, S., and Stuart, A.M. (2023). Gradient flows for sampling: Mean-field models, Gaussian approximations and affine invariance. arXiv:2302.11024

  • Reich, S. (2022). Frequentist perspective on robust parameter estimation using the ensemble Kalman filter. arXiv:2201.00611

  • Dietrich, F., Makeev, A., Kevrekidis, G., Evangelou, N., Bertalan, T., Reich, S., and Kevrekidis, I.G. (2021). Learning effective stochastic differential equations from microscopic simulations: combining stochastic numerics and deep learning. arXiv:2106.09004

  • Coghi, M., Nilssen, T., Nüsken, N., and Reich, S. (2022). Rough McKean-Vlasov dynamics for robust ensemble Kalman filtering. arXiv:2107.06621

  • Spokoiny, V. (2019). Bayesian inference for nonlinear inverse problems. arXiv:1912.12694

  • Spokoiny, V., and Panov, M. (2019). Accuracy of Gaussian approximation in nonparametric Bernstein–von Mises theorem. arXiv:1910.06028

  • Avanesov, V. (2019). How to gamble with non-stationary X-armed bandits and have no regrets. arXiv:1908.07636

  • Avanesov, V. (2019). Structural break analysis in high-dimensional covariance structure. arXiv:1803.00508

  • Avanesov, V. (2019). Nonparametric Change Point Detection in Regression. arXiv:1903.02603

  • Pathiraja, S., and van Leeuwen, P.J. (2018). Model uncertainty estimation in data assimilation for multi-scale systems with partially observed resolved variables. Quarterly Journal of the Royal Meteorological Society, under review. arXiv:1807.09621

  • Coghi, M., Nilssen, T., Nüsken, N., and Reich, S. (2023). Rough McKean-Vlasov dynamics for robust ensemble Kalman filtering. The Annals of Applied Probability, Vol. 33, 5693-5752. doi:10.1214/23-AAP1957. arXiv:2107.06621

  • Dietrich, F., Makeev, A., Kevrekidis, G., Evangelou, N., Bertalan, T., Reich, S., and Kevrekidis, I.G. (2023). Learning effective stochastic differential equations from microscopic simulations: combining stochastic numerics and deep learning. Chaos: An Interdisciplinary Journal of Nonlinear Science, Vol. 33, 023121. doi:10.1063/5.0113632. arXiv:2106.09004

  • Pidstrigach, J. (2022). Score-based generative models detect manifolds. In: Advances in Neural Information Processing Systems, Vol. 35. arXiv:2206.01018

  • Pidstrigach, J. (2022). Convergence of preconditioned Hamiltonian Monte Carlo on Hilbert spaces. IMA Journal of Numerical Analysis. doi:10.1093/imanum/drac052. arXiv:2011.08578

  • Reich, S. (2022). Frequentist perspective on robust parameter estimation using the ensemble Kalman filter. In: Chapron, B., Crisan, D., Holm, D., Mémin, E., Radomska, A. (eds), Stochastic Transport in Upper Ocean Dynamics. STUOD 2021. Mathematics of Planet Earth, Vol. 10. Springer, Cham. doi:10.1007/978-3-031-18988-3_15. arXiv:2201.00611

  • Gottwald, G.A., and Reich, S. (2021). Combining machine learning and data assimilation to forecast dynamical systems from noisy partial observations. Chaos: An Interdisciplinary Journal of Nonlinear Science, Vol. 31, 101103. doi:10.1063/5.0066080. arXiv:2108.03561

  • Gottwald, G., and Reich, S. (2021). Supervised learning from noisy observations: Combining machine-learning techniques with data assimilation. Physica D, Vol. 423, 132911. doi:10.1016/j.physd.2021.132911. arXiv:2007.07383

  • Reich, S., and Rozdeba, P.J. (2020). Posterior contraction rates for non-parametric state and drift estimation. Foundations of Data Science, Vol. 2, 333-349. doi:10.3934/fods.2020016. arXiv:2003.09219

  • Maoutsa, D., Reich, S., and Opper, M. (2020). Interacting particle solutions of Fokker-Planck equations through gradient-log-density estimation. Entropy, Vol. 22(8), 802. doi:10.3390/e22080802. arXiv:2006.00702

  • Avanesov, V. (2020). Data-driven confidence bands for distributed nonparametric regression. Proceedings of Machine Learning Research, Vol. 125. PMLR

  • Garbuno-Inigo, A., Nüsken, N., and Reich, S. (2020). Affine invariant interacting Langevin dynamics for Bayesian inference. SIAM J. Appl. Dyn. Syst., Vol. 19(3), 1633-1658. doi:10.1137/19M1304891. arXiv:1912.02859

  • Mariucci, E., Ray, K., and Szabó, B. (2019). A Bayesian nonparametric approach to log-concave density estimation. To appear in Bernoulli. arXiv:1703.09531

  • Götze, F., Naumov, A., Spokoiny, V. and Ulyanov, V. (2019). Gaussian comparison and anti-concentration inequalities for norms of Gaussian random elements, Bernoulli, in print. arXiv:1708.08663

  • Ty, A.J.A., Fang, Z., Gonzales, R.A., Rozdeba, P.J. and Abarbanel, H.D.I. (2019), Machine Learning of Time Series Using Time-delay Embedding and Precision Annealing. Neural Computation Vol. 31(10), 2004-2024. doi:10.1162/neco_a_01224. arXiv:1902.05062

  • Nüsken, N., Reich, S., and Rozdeba, P.J. (2019). State and parameter estimation from observed signal increments. Entropy, Vol. 21(5), 505. doi:10.3390/e21050505. arXiv:1903.10717

  • Opper, M. (2019). Variational inference for stochastic differential equations. Ann. Phys., 531(3):1800233, doi:10.1002/andp.201800233

  • Donner, C. and Opper, M. (2018). Efficient Bayesian Inference of Sigmoidal Gaussian Cox Processes, Journal of Machine Learning Research 19, no 67, 1-34. Open Access

  • Donner, C., and Opper, M. (2018). Efficient Bayesian inference for a Gaussian process density model. In: Proceedings of the Conference on Uncertainty in Artificial Intelligence, 2018. Open Access

  • Silin, I., and Spokoiny, V. (2018). Bayesian inference for spectral projectors of the covariance matrix. Electron. J. Statist., Vol. 12(1), 1948-1987. doi:10.1214/18-EJS1451. arXiv:1711.11532