Objective
So-called hidden Markov (or state-space) models are time series models involving a "signal" (a Markov process $(X_t)$ describing the state of a system) observed in an imperfect and noisy way through data, e.g. $Y_t = f(X_t) + \epsilon_t$. These models are widely used in many disciplines:
- Finance: stochastic volatility ($X_t$ is the unobserved volatility)…
- Engineering: target tracking ($X_t$ is the position of a moving object whose trajectory we are trying to reconstruct); speech recognition ($X_t$ is a phoneme).
- Biostatistics: ecology ($X_t$ = population size); epidemiology ($X_t$ = number of infected individuals).
The aim of this course is to present modern methods for the sequential analysis of such models, based on particle algorithms (Sequential Monte Carlo, SMC). The problems of filtering, smoothing, prediction, and parameter estimation will be discussed. At the end of the course, we will also briefly discuss the extension of such algorithms to non-sequential problems, notably in Bayesian statistics.
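As an illustration of the signal/observation structure above, the following minimal Python sketch simulates a toy linear Gaussian state-space model; the model and its parameters (rho, sigma_x, sigma_y) are purely illustrative choices, not taken from the course material.

```python
import numpy as np

# Toy linear Gaussian state-space model (illustrative parameters):
#   X_t = rho * X_{t-1} + U_t,    U_t   ~ N(0, sigma_x^2)   (hidden signal)
#   Y_t = X_t + eps_t,            eps_t ~ N(0, sigma_y^2)   (noisy observation)
rho, sigma_x, sigma_y, T = 0.9, 1.0, 0.5, 100

rng = np.random.default_rng(0)
x = np.zeros(T)
y = np.zeros(T)
x[0] = rng.normal(scale=sigma_x)
y[0] = x[0] + rng.normal(scale=sigma_y)
for t in range(1, T):
    x[t] = rho * x[t - 1] + rng.normal(scale=sigma_x)  # hidden state evolves as a Markov chain
    y[t] = x[t] + rng.normal(scale=sigma_y)            # only y is observed
# Filtering means recovering the distribution of x[t] given y[0], ..., y[t].
```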
Prerequisites:
- the 2A course "Simulation and Monte Carlo" or a similar course;
- the 3A courses "Computational Statistics" and "Bayesian Statistics" are recommended but not mandatory.
At the end of the course, the student will be able to:
- state the main properties of HMMs
- implement a particle filter to perform filtering and smoothing for a given HMM
- estimate the parameters of such a model using various methods
Course outline
- Introduction: definition of HMMs (hidden Markov models), main properties, notions of filtering, smoothing and prediction, forward-backward formulas.
- Discrete HMMs, the Baum-Petrie algorithm
- Linear Gaussian HMMs, the Kalman filter
- SMC algorithms for filtering in HMMs
- Parameter estimation in HMMs
- Introduction to non-sequential applications of SMC algorithms
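To give a flavour of the SMC filtering topic listed above, here is a minimal sketch of a bootstrap particle filter for the toy linear Gaussian model simulated earlier; the function name and all settings (N particles, multinomial resampling at every step) are illustrative assumptions, not the specific algorithms covered in class.

```python
import numpy as np
from scipy.stats import norm

def bootstrap_filter(y, rho=0.9, sigma_x=1.0, sigma_y=0.5, N=1000, seed=1):
    """Bootstrap particle filter for the toy model
    X_t = rho * X_{t-1} + N(0, sigma_x^2),  Y_t = X_t + N(0, sigma_y^2).
    Returns estimates of the filtering means E[X_t | Y_{0:t}]."""
    rng = np.random.default_rng(seed)
    T = len(y)
    means = np.zeros(T)
    particles = rng.normal(scale=sigma_x, size=N)  # particles drawn from the X_0 prior
    for t in range(T):
        if t > 0:
            # Propagate particles through the state transition (proposal = prior)
            particles = rho * particles + rng.normal(scale=sigma_x, size=N)
        # Reweight particles by the likelihood of the current observation
        w = norm.pdf(y[t], loc=particles, scale=sigma_y)
        w /= w.sum()
        means[t] = np.sum(w * particles)           # filtering estimate at time t
        # Multinomial resampling at every step (simplest scheme)
        particles = rng.choice(particles, size=N, p=w)
    return means
```

Running `bootstrap_filter(y)` on the simulated observations `y` returns Monte Carlo estimates of the filtering means $E[X_t \mid Y_{0:t}]$, which in this linear Gaussian toy model can be checked against the exact output of the Kalman filter.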
Evaluation: project-based (groups of three students).
References
Del Moral, P. (2004). Feynman-Kac Formulae. Springer.
Chopin, N. and Papaspiliopoulos, O. (2020). An Introduction to Sequential Monte Carlo. Springer.
- Instructor: Nicolas Chopin