
Scientific courses - MAP566A: Introduction to Machine Learning

Field > Applied Mathematics.

Description

The lecture will mostly be based on the book "Probabilistic Machine Learning: An Introduction" by Kevin Murphy. Lecture notes are provided. The following topics will be discussed:
- Introduction to machine learning: Motivations, elements of statistical learning, K-nearest neighbors as a first learning method
- Least squares regression: Simple regression, regularization to avoid overfitting (Ridge and LASSO)
- Classification with logistic regression
- Stochastic gradient methods
- Principal component analysis
- Support vector machines, including an introduction to kernel methods
- Trees and ensemble methods, including boosting
- Neural networks
- Clustering methods
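As a flavor of the first topic, a K-nearest-neighbors classifier can be written from scratch in a few lines of NumPy. This is an illustrative sketch only; the toy data and the function name `knn_predict` are invented here and not taken from the course material:

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Predict labels by majority vote among the k nearest training points."""
    # Pairwise Euclidean distances between test and training points
    dists = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    # Indices of the k closest training points for each test point
    nearest = np.argsort(dists, axis=1)[:, :k]
    # Majority vote over the neighbors' labels
    return np.array([np.bincount(y_train[idx]).argmax() for idx in nearest])

rng = np.random.default_rng(0)
# Two Gaussian clusters: class 0 around (0, 0), class 1 around (3, 3)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Points at the cluster centers should receive that cluster's label
pred = knn_predict(X, y, np.array([[0.0, 0.0], [3.0, 3.0]]), k=5)
print(pred)  # [0 1]
```

The brute-force distance computation scales poorly with the training-set size, which is one motivation for the more structured methods covered later in the course.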

Learning objectives

The lecture is mostly based on the book "Probabilistic Machine Learning: An Introduction" by Kevin Murphy; lecture notes and exercise sheets are provided.

This lecture provides an introduction to various concepts and algorithms in Machine Learning -- a field where data is used to gain experience or make predictions (as opposed to fields where data is produced as a result of modeling, e.g. when modeling natural phenomena using partial differential equations). Tasks include classification, regression, ranking, clustering, and dimensionality reduction. These tasks can be tackled in a supervised or unsupervised manner, depending on whether the data is labelled or not.

From a mathematical viewpoint, Machine Learning is a very broad field, which draws on linear algebra and optimization to devise numerical methods, and on scientific computing and computer science to implement efficient algorithms. From a more abstract perspective, elements of statistical learning theory provide rigorous foundations for the various methods at hand, by formalizing concepts such as risk minimization, the bias-complexity tradeoff, overfitting, cross-validation, etc.

Hands-on sessions in the form of Jupyter notebooks complement the lecture. They allow students to assess the successes and limitations of the most popular methods, first on toy synthetic data (for ease of visualization and a complete understanding of the behavior of the algorithms), and then on more realistic datasets such as MNIST. From a practical perspective, this will be done from scratch for simple methods, and using Scikit-learn for more advanced techniques, except for neural network models, for which PyTorch will be used.
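For instance, the Scikit-learn part of the workflow might look as follows -- a hedged sketch combining dimensionality reduction and clustering on synthetic data; the dataset parameters are invented and not taken from the official notebooks:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.decomposition import PCA

# Three synthetic Gaussian clusters in 5 dimensions
X, y_true = make_blobs(n_samples=150, n_features=5, centers=3, random_state=0)

# Project onto the two leading principal components (eases visualization)
X2 = PCA(n_components=2).fit_transform(X)

# Cluster in the reduced 2-D space
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X2)

print(X2.shape, sorted(set(labels.tolist())))  # (150, 2) [0, 1, 2]
```

The same fit/transform/predict pattern carries over to the other Scikit-learn estimators used in the course, which is what makes it convenient for the more advanced techniques.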

56 hours of in-person teaching (28 blocks or time slots)

Degree(s) concerned

Saclay domains

SDG 15: Life on land.

For students of the M1 Mathématiques Appliquées et Statistiques degree

Prerequisites: Basic courses in probability and statistics.

Grading format

Numerical grade out of 20

Letter grade / reduced grade scale

For students of the M1 Mathématiques Appliquées et Statistiques degree

Resits are allowed (the resit grade is retained)
    The course unit is validated if the final grade is >= 10
    • ECTS credits earned: 6 ECTS

    Detailed programme

    The following topics will be discussed:
    - Introduction to machine learning: Motivations, elements of statistical learning, K-nearest neighbors as a first learning method
    - Least squares regression: Simple regression, regularization to avoid overfitting (Ridge and LASSO)
    - Classification with logistic regression, with a discussion on convexification of the loss
    - Stochastic gradient methods (deterministic gradient methods, momentum methods, stochastic gradients, Adam)
    - Principal component analysis (derivation from the perspective of reconstruction error, statistical interpretation)
    - Support vector machines, including an introduction to kernel methods and reproducing kernel Hilbert spaces
    - Trees and ensemble methods, including bagging and boosting
    - Neural networks for supervised learning (architectures, training algorithms) and unsupervised learning (autoencoders)
    - Clustering methods (K-means, hierarchical clustering, mixture models, density-based clustering, spectral methods)
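To illustrate the second item, the ridge estimator admits a closed form, (X^T X + alpha I)^{-1} X^T y, whose shrinkage effect can be checked directly in NumPy. This is an illustrative sketch; the toy data and the helper `ridge_fit` are invented for this example:

```python
import numpy as np

def ridge_fit(X, y, alpha):
    """Closed-form ridge estimator: solve (X^T X + alpha I) beta = X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
# Toy linear model with Gaussian noise
X = rng.normal(size=(20, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(0, 0.1, 20)

ols = ridge_fit(X, y, alpha=0.0)    # alpha = 0 recovers ordinary least squares
reg = ridge_fit(X, y, alpha=10.0)   # positive alpha shrinks the coefficients

# The norm of the ridge solution decreases as alpha grows
print(np.linalg.norm(reg) < np.linalg.norm(ols))  # True
```

The shrinkage of the coefficient norm with increasing alpha is exactly the mechanism by which ridge regularization avoids overfitting.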
