
Syllabus: Many modern learning problems require analyzing the structure of a high-dimensional matrix with remarkable properties. In recommender systems, this could be a column-sparse or a low-rank matrix, but more sophisticated structures can be considered by combining several notions of sparsity. In graph analysis, popular spectral techniques for detecting cliques are based on the analysis of the Laplacian matrix, which has a specific sparse/low-rank structure. In this course, we will review several mathematical tools useful for developing statistical analysis methods and for studying their performance. Such tools include concentration inequalities, convex optimization, perturbation theory, and minimax theory.
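
As a concrete illustration of the spectral idea mentioned above (an illustration only, not part of the course material), the following minimal Python/NumPy sketch clusters a planted two-community graph using the sign of the second eigenvector of its Laplacian. The graph size, the edge probabilities, and the choice of the unnormalized Laplacian are all illustrative assumptions, not prescriptions from the course.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative planted-partition graph: two communities of 50 nodes each,
    # edge probability 0.20 within a community and 0.02 across communities.
    n, p_in, p_out = 100, 0.20, 0.02
    labels_true = np.repeat([0, 1], n // 2)
    same_block = labels_true[:, None] == labels_true[None, :]
    probs = np.where(same_block, p_in, p_out)
    upper = np.triu(rng.random((n, n)) < probs, k=1)
    A = (upper + upper.T).astype(float)  # symmetric adjacency matrix, no self-loops

    # Unnormalized graph Laplacian L = D - A.
    L = np.diag(A.sum(axis=1)) - A

    # Eigenvector of the second-smallest eigenvalue (Fiedler vector): its sign
    # pattern estimates the two communities when the signal is strong enough.
    _, eigenvectors = np.linalg.eigh(L)
    fiedler = eigenvectors[:, 1]
    labels_hat = (fiedler > 0).astype(int)

    # Agreement with the planted labels, up to relabeling of the two communities.
    agreement = max(np.mean(labels_hat == labels_true),
                    np.mean(labels_hat == 1 - labels_true))
    print(f"Fraction of correctly clustered nodes: {agreement:.2f}")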

 

Numerus Clausus: 30

 

Class Time: P2 Wednesday morning

 

Grading – 2.5 ECTS:

Written Exam

Article

 

Topics covered:

  1. Principal Component Analysis
  2. Spectral clustering
  3. Matrix completion
  4. Robust Statistics
  5. Phase Retrieval
  6. Optimal Transport

 

Textbook and articles:

  1. R. Vershynin. High-Dimensional Probability. Cambridge University Press, 2018.
  2. D. Gross. Recovering low-rank matrices from few coefficients in any basis, 2011. arXiv:0910.1879.
  3. O. Guédon and R. Vershynin. Community detection in sparse networks via Grothendieck's inequality. Probability Theory and Related Fields, 165(3–4):1025–1049, 2016.
  4. J. Ma, R. Dudeja, J. Xu, A. Maleki, and X. Wang. Spectral Method for Phase Retrieval: An Expectation Propagation Perspective, 2019. arXiv:1903.02505.
  5. M. Kouw and M. Loog. An introduction to domain adaptation and transfer learning, 2018. arXiv:1812.11806.



