
Science courses - APM_5OD14_TA: Cooperative Optimization for Data Science

Field > Applied Mathematics.

Description

The course presents continuous optimization techniques developed to cope with the increasing amount of data. In particular, we look at optimization problems that involve large-scale datasets, spatially distributed data, and locally held private data.

We will focus on three different aspects: (1) the development of algorithms that decompose the problem into smaller problems solvable with some degree of coordination; (2) the trade-off between cooperation and local computation; (3) how to design algorithms that ensure the privacy of sensitive data.

This course is open to students of the M2 "Data Sciences" program.

Learning objectives

Understand the challenges in cooperative optimization for large-scale data applications.

21 hours of in-class instruction

33 hours of estimated personal work for the student.

Degree(s) concerned

Grading format

Numeric grade out of 20

Letter grade / reduced scale

For students in the M2 DS - Data Science degree program

The course unit (UE) is validated if the final grade is >= 10
  • ECTS credits earned: 3 ECTS

Detailed program

Tentative plan:

#1 Class. Introduction: recap of convex models and algorithms. A model for a network of communicating and computing nodes. Parallel methods in optimization: the Gauss-Seidel method, the Jacobi method, incremental methods. The consensus optimization problem.
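To make the Jacobi/Gauss-Seidel contrast concrete, here is a minimal sketch (not course material, just an assumed toy quadratic): Jacobi updates all coordinates in parallel from the previous iterate, while Gauss-Seidel sweeps them sequentially, reusing fresh values.

```python
import numpy as np

# Minimize f(x) = 0.5 * x^T A x - b^T x by coordinate updates.
# A is made strictly diagonally dominant so both schemes converge.
rng = np.random.default_rng(0)
n = 5
B = rng.standard_normal((n, n))
A = B + B.T
A += np.diag(np.abs(A).sum(axis=1) + 1.0)   # force strict diagonal dominance
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)
d = np.diag(A)

def jacobi_step(x):
    # x_i <- (b_i - sum_{j != i} A_ij x_j) / A_ii, all coordinates at once
    return (b - A @ x + d * x) / d

def gauss_seidel_step(x):
    # same update, but sequential: later coordinates see fresh values
    x = x.copy()
    for i in range(n):
        x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]
    return x

xj = np.zeros(n)
xg = np.zeros(n)
for _ in range(300):
    xj = jacobi_step(xj)
    xg = gauss_seidel_step(xg)
```

Jacobi's fully parallel update is what maps naturally onto a network of computing nodes; Gauss-Seidel is inherently sequential but typically needs fewer sweeps.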

#2 Class. Distributed optimization (I): primal methods (distributed gradient descent and gradient tracking). Communication vs. computation trade-off; network scaling.
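A minimal sketch of the primal approach, under assumed toy data: decentralized gradient descent on a ring of four nodes, each holding a private quadratic, where every iteration mixes with neighbors through a doubly stochastic matrix and then takes a local gradient step.

```python
import numpy as np

# Each node i holds f_i(x) = 0.5 * (x - c_i)^2; the minimizer of
# sum_i f_i is the mean of the c_i (assumed values below).
c = np.array([1.0, 3.0, -2.0, 6.0])
n = len(c)

# Doubly stochastic mixing matrix for a ring (self + two neighbors).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

x = np.zeros(n)          # one scalar estimate per node
step = 0.05
for k in range(2000):
    grad = x - c         # purely local gradients
    x = W @ x - step * grad   # mix with neighbors, then descend
```

With a constant step size, the nodes agree only up to a bias of order `step`; this residual disagreement is one motivation for the gradient-tracking corrections mentioned in the class title.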

#3 Class. Distributed optimization (II): dual methods (dual decomposition, ADMM); primal-dual methods and networked problems. A possible example from large-scale smart grids.
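The decomposition idea behind ADMM can be sketched on the consensus problem min_z sum_i f_i(z), rewritten as min sum_i f_i(x_i) subject to x_i = z; the toy quadratics below are assumed so that every x-update has a closed form.

```python
import numpy as np

# Consensus ADMM sketch with f_i(x) = 0.5 * (x - c_i)^2.
c = np.array([1.0, 3.0, -2.0, 6.0])   # local data (assumed values)
rho = 1.0                              # penalty parameter
x = np.zeros_like(c)
u = np.zeros_like(c)                   # scaled dual variables
z = 0.0
for k in range(100):
    x = (c + rho * (z - u)) / (1 + rho)   # local, fully parallel x-updates
    z = np.mean(x + u)                     # coordination (averaging) step
    u = u + x - z                          # dual ascent on the consensus constraint
```

Only `z` and the scaled duals couple the nodes, so each x-update runs in parallel on private data while the coordination step is a single average; for these quadratics the iterates converge to the mean of the `c_i`.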

#4 Class. Federated optimization (I): the setting and the problem, its relation to distributed optimization and the main differences. Federated averaging and other momentum-based first-order algorithms.
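A minimal sketch of federated averaging (FedAvg), with assumed toy per-client objectives: each round, clients run several local gradient steps on their own data and the server averages the resulting models.

```python
import numpy as np

# K clients, each holding f_i(w) = 0.5 * (w - c_i)^2 (assumed values);
# the model w is a scalar for simplicity.
c = np.array([1.0, 3.0, -2.0, 6.0])
K = len(c)
w = 0.0
step, local_steps, rounds = 0.1, 5, 100
for r in range(rounds):
    local = np.full(K, w)              # broadcast the global model
    for _ in range(local_steps):
        local -= step * (local - c)    # local gradient steps, no communication
    w = local.mean()                   # server aggregates
```

For these equal-curvature quadratics the fixed point is exactly the mean of the client optima; with heterogeneous objectives, multiple local steps generally introduce client drift, one of the main differences from classical distributed gradient methods.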

#5 Class. Federated optimization (II): robustness; communication vs. computation trade-off; network scaling; relaxations; acceleration; personalization. A possible example with real data.

#6 Class. Privacy issues in optimization: the concept of privacy and how to enforce it. Differential privacy in distributed optimization and federated optimization. Data attacks, robustness.
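One standard way to enforce privacy in gradient-based methods, sketched here under assumed hyperparameters (`clip`, `noise_mult` are illustrative, not values from the course): bound each sample's influence by clipping its gradient, then add Gaussian noise calibrated to the clipping norm.

```python
import numpy as np

rng = np.random.default_rng(1)

def private_grad(per_sample_grads, clip=1.0, noise_mult=1.0):
    # Clip each per-sample gradient to L2 norm <= clip (bounded sensitivity).
    norms = np.linalg.norm(per_sample_grads, axis=1, keepdims=True)
    scale = np.minimum(1.0, clip / np.maximum(norms, 1e-12))
    clipped = per_sample_grads * scale
    # Add Gaussian noise with std proportional to the sensitivity, then average.
    noise = rng.normal(0.0, noise_mult * clip, size=per_sample_grads.shape[1])
    return (clipped.sum(axis=0) + noise) / len(per_sample_grads)

g = rng.standard_normal((100, 3))      # assumed batch of per-sample gradients
```

The clipping norm and noise multiplier together determine the differential-privacy guarantee; larger noise gives stronger privacy at the cost of noisier optimization steps.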

Keywords

Convex optimization; distributed, parallel, and federated algorithms; privacy