The course is mostly based on the book "Probabilistic Machine Learning: An Introduction" by Kevin Murphy. The following topics will be discussed, over a total of 28 sessions of 2 hours each.

General theory (4 sessions)

- Introduction to machine learning

- Elements of statistical learning

Linear methods (8 sessions)

- Linear methods for regression (4 sessions): least squares linear regression; regularization to avoid overfitting: ridge and lasso; model selection

- Linear methods for classification (3 sessions): linear discriminant analysis; logistic regression, including stochastic gradient descent (SGD) for maximum likelihood estimation

- Linear methods for unsupervised learning (1 session): principal component analysis (PCA) and an introduction to factor analysis

Nonlinear methods for supervised learning (10 sessions)

- Decision trees and ensemble methods (3 sessions): classification and regression trees; ensemble learning, including random forests; boosting

- Nonparametric methods (3 sessions): Mercer kernels; Gaussian processes; support vector machines

- Neural networks (4 sessions): feedforward neural networks; training neural networks: computing gradients by backpropagation, preconditioned gradient methods, regularization strategies, etc.

Nonlinear methods for unsupervised learning (6 sessions)

- Clustering methods (3 sessions): k-nearest neighbors; k-means; hierarchical clustering; Gaussian mixture models

- Nonlinear dimension reduction techniques (3 sessions): autoencoders; manifold learning

- Teaching coordinators: Eric Moulines, Gabriel Stoltz