Description
Syllabus: Many modern data learning problems require analyzing the structure of a high-dimensional matrix with remarkable properties. In recommender systems, this could be a column-sparse matrix or a low-rank matrix, but more sophisticated structures can be considered by combining several notions of sparsity. In graph analysis, popular spectral techniques for detecting cliques are based on the analysis of the Laplacian matrix, which has a specific sparse/low-rank structure. In this course, we will review several mathematical tools useful for developing statistical analysis methods and studying their performance. Such tools include concentration inequalities, convex optimization, perturbation theory, and minimax theory.
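As a small illustration of the spectral idea mentioned above (a minimal sketch, not part of the official course material), the following Python snippet builds a toy two-community random graph and recovers the communities from the sign pattern of the Fiedler vector of the graph Laplacian. The graph model and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stochastic block model: two planted communities of 20 nodes each.
# Edges appear with probability 0.6 inside a community, 0.05 across.
n, p_in, p_out = 20, 0.6, 0.05
labels = np.repeat([0, 1], n)
P = np.where(labels[:, None] == labels[None, :], p_in, p_out)
A = (rng.random((2 * n, 2 * n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                       # symmetric adjacency matrix, no self-loops

# Unnormalized graph Laplacian L = D - A.
L = np.diag(A.sum(axis=1)) - A

# The eigenvector of the second-smallest eigenvalue (the Fiedler vector)
# should approximately separate the two communities by its sign pattern.
eigvals, eigvecs = np.linalg.eigh(L)
fiedler = eigvecs[:, 1]
estimated = (fiedler > 0).astype(int)

# Up to a global label swap, the estimate should match the planted communities.
accuracy = max(np.mean(estimated == labels), np.mean(estimated != labels))
print(f"recovered community labels with accuracy {accuracy:.2f}")
```

With communities this well separated, the sign split of the Fiedler vector typically recovers the planted partition exactly; the course studies when and why such spectral methods succeed.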
Numerus Clausus: 30
Class Time: P2 Wednesday morning
Grading – 2.5 ECTS:
Written Exam
Article
Topics covered:
- Principal Component Analysis
- Spectral clustering
- Matrix completion
- Robust Statistics
- Phase Retrieval
- Optimal Transport
Textbook and references:
- R. Vershynin. High-Dimensional Probability: An Introduction with Applications in Data Science. Cambridge University Press, 2018.
- D. Gross. Recovering low-rank matrices from few coefficients in any basis, 2011. arXiv:0910.1879
- O. Guédon and R. Vershynin. Community detection in sparse networks via Grothendieck's inequality. Probability Theory and Related Fields, 165(3-4):1025–1049, 2016.
- J. Ma, R. Dudeja, J. Xu, A. Maleki, X. Wang. Spectral Method for Phase Retrieval: an Expectation Propagation Perspective, 2019. arXiv:1903.02505
- W. M. Kouw, M. Loog. An introduction to domain adaptation and transfer learning, 2018. arXiv:1812.11806
Grade format: numeric out of 20; letter / reduced grade
For students in the M2 Data Science degree: a resit exam is allowed (the higher of the two grades is kept). ECTS credits awarded: 3 ECTS