Random dynamical models & invariant uniform statistics: applications to particle filtering, model assessment and machine learning

Joaquín Miguez, Universidad Carlos III de Madrid
2.29.0.25/0.26, 13:00 - 14:00

We begin with a class of particle filters (PFs) that can automatically tune their performance through the online evaluation of certain predictive statistics. These statistics are proved to be uniformly distributed, invariantly across a broad class of state-space models. To be specific, the performance of PFs depends on the number of Monte Carlo samples (known as particles) that they generate and recursively propagate over time. We describe how to adjust this number online, based on the discrepancy between the sample distribution of the predictive statistics and a discrete uniform distribution. The resulting algorithms are provably consistent, and their theoretical error bounds adapt to the updates in the number of particles. We present a computer simulation study that illustrates the theoretical results and provides further insight into the complexity, performance and behaviour of the new adaptive PFs.
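
For concreteness, here is a minimal Python sketch of the kind of adaptive scheme described above: a bootstrap particle filter for a hypothetical linear-Gaussian model that, at every step, draws K synthetic observations from the predictive distribution, records the rank of the actual observation among them, and periodically compares the empirical distribution of these ranks with the discrete uniform distribution on {0, ..., K} in order to grow or shrink the particle set. The model functions (propagate, sample_obs, log_lik), the window length W, the thresholds and the doubling/halving rule are illustrative assumptions, not the exact algorithm presented in the talk.

```python
import numpy as np

def propagate(x, rng):      # assumed state transition: x_t = 0.9 x_{t-1} + noise
    return 0.9 * x + rng.normal(0.0, 1.0, size=x.shape)

def sample_obs(x, rng):     # assumed observation model: y_t = x_t + noise
    return x + rng.normal(0.0, 0.5, size=x.shape)

def log_lik(y, x):          # log p(y_t | x_t) for the observation model above
    return -0.5 * ((y - x) / 0.5) ** 2

def adaptive_pf(ys, n_init=200, K=7, W=20, n_min=50, n_max=5000, seed=0):
    rng = np.random.default_rng(seed)
    N = n_init
    x = rng.normal(0.0, 1.0, size=N)          # initial particles
    ranks, means, n_history = [], [], []
    for y in ys:
        x = propagate(x, rng)                 # predictive particles
        # K synthetic observations drawn from the predictive distribution
        idx = rng.integers(0, N, size=K)
        y_syn = sample_obs(x[idx], rng)
        ranks.append(int(np.sum(y_syn < y)))  # rank statistic in {0, ..., K}
        # weight, estimate and resample (standard bootstrap PF step)
        logw = log_lik(y, x)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(float(np.dot(w, x)))
        x = rng.choice(x, size=N, p=w)
        # every W steps, compare the empirical pmf of the ranks with the
        # discrete uniform on {0, ..., K} and adapt the number of particles
        if len(ranks) >= W:
            pmf = np.bincount(ranks[-W:], minlength=K + 1) / W
            disc = np.max(np.abs(pmf - 1.0 / (K + 1)))
            if disc > 0.25 and N < n_max:     # poor fit: use more particles
                N = min(2 * N, n_max)
            elif disc < 0.10 and N > n_min:   # good fit: fewer particles suffice
                N = max(N // 2, n_min)
            x = rng.choice(x, size=N)         # resize the particle set
            ranks = []
        n_history.append(N)
    return np.array(means), np.array(n_history)
```

The doubling/halving rule and the fixed thresholds are only one simple way to act on the discrepancy; the point of the sketch is that the adaptation relies solely on the rank statistics, not on the particular state-space model.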

The evaluation of the predictive statistics at the core of the methodology is carried out by generating synthetic observations, i.e., particles in the observation space. When the distribution of the synthetic observations converges to the distribution of the true observations, one can construct a predictive statistic that converges in distribution to a uniform random variable, regardless of the observation model or the signal dynamics. This invariance property can be exploited in applications beyond the adaptation of PFs. In particular, we briefly describe how the proposed methodology can be extended to perform model assessment, to design outlier detection algorithms, and to train implicit generative models for machine learning.
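
As a rough illustration of the last point, the sketch below uses rank statistics of the kind computed in the previous example for two of the mentioned applications: a chi-square goodness-of-fit test against the discrete uniform distribution as a simple model-assessment check, and a rule that flags runs of extreme ranks as potential outliers. The test, the run length and the significance level are assumptions chosen for the example, not the specific procedures developed by the author.

```python
import numpy as np
from scipy.stats import chisquare

def assess_model(ranks, K, alpha=0.05):
    """Chi-square goodness-of-fit test of the ranks against the discrete
    uniform distribution on {0, ..., K}. A small p-value suggests the
    state-space model does not explain the observed data well."""
    counts = np.bincount(ranks, minlength=K + 1)
    stat, pval = chisquare(counts)   # uniform null distribution by default
    return pval >= alpha, pval

def flag_outliers(ranks, K, run=3):
    """Flag time steps where the rank stays at an extreme value (0 or K)
    for `run` consecutive steps, i.e. the actual observation repeatedly
    falls outside the bulk of the synthetic observations."""
    r = np.asarray(ranks)
    extreme = (r == 0) | (r == K)
    flags = np.zeros(len(r), dtype=bool)
    for t in range(run - 1, len(r)):
        flags[t] = extreme[t - run + 1 : t + 1].all()
    return flags
```
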