
PA - C8 - DS-télécom-2 : Visualization and Visual Analytics for Data Science

Description

For the past few years, statistical learning and optimization of complex
dynamical systems with latent data, subject to mechanical stress and random
excitations and prone to be very noisy, have been applied to time series
analysis across a wide range of applied science and engineering domains,
such as signal processing, target tracking, enhancement and segmentation of
speech and audio signals, and inference of ecological networks.
+ Solving Bayesian nonlinear filtering and smoothing problems, i.e. computing
the posterior distributions of the hidden states given a record of
observations, as well as the posterior distributions of the parameters, is
crucial for maximum likelihood estimation and for predicting future states of
partially observed time series. Estimators of these posterior distributions
may be obtained, for instance, with Sequential Monte Carlo (SMC) methods,
also known as particle filtering and smoothing, and with Markov chain Monte
Carlo (MCMC) methods.
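As a toy illustration of the particle filtering approach mentioned above, here is a minimal bootstrap particle filter sketch for a hypothetical linear-Gaussian state-space model. All model parameters, dimensions, and function names below are illustrative assumptions, not course material:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear-Gaussian state-space model (illustrative parameters):
#   x_t = 0.9 * x_{t-1} + N(0, 1),   y_t = x_t + N(0, 0.5^2)
PHI, SIGMA_X, SIGMA_Y = 0.9, 1.0, 0.5
T, N = 100, 500  # number of time steps, number of particles

# Simulate a trajectory of hidden states and noisy observations.
x = np.zeros(T)
for t in range(1, T):
    x[t] = PHI * x[t - 1] + SIGMA_X * rng.normal()
y = x + SIGMA_Y * rng.normal(size=T)

def bootstrap_filter(y, n_particles):
    """Bootstrap particle filter: returns the filtering means E[x_t | y_0..y_t]."""
    particles = rng.normal(0.0, 1.0, size=n_particles)  # initial distribution
    means = np.zeros(len(y))
    for t, obs in enumerate(y):
        if t > 0:
            # Propagate each particle through the state dynamics.
            particles = PHI * particles + SIGMA_X * rng.normal(size=n_particles)
        # Weight particles by the conditional likelihood of the observation.
        logw = -0.5 * ((obs - particles) / SIGMA_Y) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means[t] = np.sum(w * particles)
        # Multinomial resampling to fight weight degeneracy.
        particles = rng.choice(particles, size=n_particles, p=w)
    return means

means = bootstrap_filter(y, N)
```

With enough particles, the filtering means track the hidden trajectory to roughly the accuracy of an exact Kalman filter on this model.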

For massive data sets and complex models, the dynamics of the underlying
latent state or the conditional likelihood of the observations might be
unavailable, or might rely on black-box routines/simulation programs, which
makes the usual approaches unreliable. In this setting, (a) constructing
estimators of these posterior distributions to quantify uncertainty and
(b) simulating from the posterior distribution of the parameters are very
challenging problems.

Each Tuesday (9 a.m. to 12:30 p.m.) from September 17 to October 22.
+ Markovian models (specific focus on observation-driven models).
+ Bayesian inference; consistency and asymptotic normality of the maximum
likelihood estimator.
+ Introduction to Markov chain Monte Carlo algorithms.
+ Some convergence results of Markov chain Monte Carlo algorithms.
+ Particle Gibbs sampling, Particle marginal MCMC.
+ Approximate Bayesian Computation.
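The last topic above, Approximate Bayesian Computation, replaces intractable likelihood evaluations with simulations from the model. A minimal rejection-ABC sketch, assuming a toy Gaussian model with unknown mean, the sample mean as summary statistic, and illustrative prior and tolerance choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Observed data, assumed generated with an unknown mean (true value 2.0 here).
observed = rng.normal(2.0, 1.0, size=200)
obs_stat = observed.mean()  # summary statistic of the observed data

def abc_rejection(n_draws, epsilon):
    """Rejection ABC: keep prior draws whose simulated summary is within
    epsilon of the observed summary; the likelihood is never evaluated."""
    accepted = []
    while len(accepted) < n_draws:
        theta = rng.normal(0.0, 5.0)            # draw from the prior
        sim = rng.normal(theta, 1.0, size=200)  # simulate from the (black-box) model
        if abs(sim.mean() - obs_stat) < epsilon:
            accepted.append(theta)
    return np.array(accepted)

post = abc_rejection(300, epsilon=0.1)
```

Accepted draws approximate the posterior of the mean; shrinking `epsilon` sharpens the approximation at the cost of a lower acceptance rate.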

Master the statistical learning framework and its challenges with dependent
data.
+ Know the inner mechanism of some classical Markovian models with missing
data.
+ Know how to implement (in Python) the most classical Markov chain Monte
Carlo algorithms: Metropolis-Hastings, Gibbs, particle-based MCMC.
+ Understand some of the theoretical tools used to prove convergence
properties of machine learning methods for such models (maximum likelihood
inference, ergodicity of MCMC algorithms).
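A minimal random-walk Metropolis-Hastings sampler of the kind listed above; the target density and tuning constants below are illustrative assumptions, not course material:

```python
import numpy as np

rng = np.random.default_rng(2)

def log_target(theta):
    # Unnormalized log-density of a toy target: Gaussian with mean 3, std 2.
    return -0.5 * ((theta - 3.0) / 2.0) ** 2

def random_walk_mh(n_iter, step=1.0, init=0.0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal."""
    chain = np.empty(n_iter)
    theta, logp = init, log_target(init)
    for i in range(n_iter):
        prop = theta + step * rng.normal()  # symmetric proposal
        logp_prop = log_target(prop)
        # Accept with probability min(1, pi(prop) / pi(theta)).
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = prop, logp_prop
        chain[i] = theta
    return chain

chain = random_walk_mh(20_000, step=2.0)
burned = chain[5_000:]  # discard burn-in before computing summaries
```

After burn-in, the empirical mean and standard deviation of the chain approximate those of the target; the proposal `step` trades off acceptance rate against mixing speed.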
Evaluation
+ Report on a research article (100%).
