

Leçons J.-L. Lions 2021 D. Slepčev

November 16, 2021 @ 12:00 - November 19, 2021 @ 15:00

Click here for the pdf version of the program of the Leçons Jacques-Louis Lions 2021 (Dejan Slepčev)
Click here for a flyer with the dates and VENUES of the Leçons Jacques-Louis Lions 2021 (Dejan Slepčev)
Click here for the jpg version (0.8 MB) of the poster of the Leçons Jacques-Louis Lions 2021 (Dejan Slepčev)
Click here for the pdf version (17.7 MB) of the poster of the Leçons Jacques-Louis Lions 2021 (Dejan Slepčev)

Given by Dejan Slepčev (Carnegie Mellon University, Pittsburgh) from Tuesday, November 16 to Friday, November 19, 2021, the Leçons Jacques-Louis Lions 2021 will consist of:

— a mini-course
Variational problems and PDE on random structures: analysis and applications to data science
3 sessions, on Tuesday, November 16, Wednesday, November 17, and Thursday, November 18, 2021, from 12:00 to 13:15,

— and a colloquium
Machine learning meets calculus of variations
on Friday, November 19, 2021, from 14:00 to 15:00.

All talks will be broadcast live via Zoom.

The 3 mini-course sessions will be given in person in the seminar room of the Laboratoire Jacques-Louis Lions:
Campus Jussieu, Sorbonne Université, 4 place Jussieu, Paris 5ème,
corridor 15-16, 3rd floor, room 09 (15-16-3-09).

Please note the exceptional venue for the colloquium:
The colloquium will be given in person in the Amphithéâtre Durand, Esclangon building (*), Jussieu slab level, first door on the left immediately after the narrow staircase that descends along the wall toward the Esclangon basements.
(*) The Esclangon building is at the south-east corner of the Campus Jussieu: turn right after entering the campus through the main entrance on place Jussieu; the entrance to the building is just past tower 66.

Abstract of the mini-course
Variational problems and PDE on random structures: analysis and applications to data science
Many machine learning tasks, such as clustering, regression, classification, and dimensionality reduction, are commonly described as optimization problems. Namely, these tasks are modeled by introducing functionals (defined using the available random sample) that specify the desired properties of the object sought. While the data are often high dimensional, they usually have an intrinsic low-dimensional structure that makes the learning tasks feasible. This intrinsic geometric structure is often encoded by a graph created by connecting nearby data points. We will introduce mathematical tools used to study variational problems and PDE-based models posed on random data samples. In particular, we will discuss the passage from discrete optimization problems on random samples to their continuum limits. This will be used to establish the asymptotic consistency of several important machine learning algorithms.
We will cover the basic elements of the background material on the calculus of variations and optimal transportation. Furthermore, we will develop connections to nonlocal functionals, which serve as intermediate objects between the discrete functionals and their continuum limits. We will also consider approaches based on dynamics on graphs and connect these with the evolution equations describing the continuum limits.
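The basic objects of the abstract — a graph connecting nearby random samples and a discrete functional defined on it — can be sketched as follows. This is a minimal illustration only: the ε-neighborhood graph and the normalization constant below are common choices in this literature, not necessarily the specific setup used in the lectures.

```python
import numpy as np

def eps_graph_weights(points, eps):
    """Adjacency weights of an epsilon-neighborhood graph: connect
    pairs of samples at distance less than eps (no self-loops)."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    w = (d < eps).astype(float)
    np.fill_diagonal(w, 0.0)
    return w

def graph_dirichlet_energy(w, f, eps):
    """Discrete Dirichlet energy sum_ij w_ij (f_i - f_j)^2 / (n^2 eps^2),
    a discrete analogue of the continuum energy int |grad f|^2; the
    normalization is one common convention (constants vary by author)."""
    n = len(f)
    diff = f[:, None] - f[None, :]
    return (w * diff**2).sum() / (n**2 * eps**2)

rng = np.random.default_rng(0)
x = rng.uniform(size=(300, 2))   # 300 random samples in the unit square
w = eps_graph_weights(x, eps=0.15)
f = x[:, 0]                      # a test function evaluated at the samples
e = graph_dirichlet_energy(w, f, eps=0.15)
```

The discrete-to-continuum question discussed in the course then asks in what sense such energies converge, as the number of samples grows and ε shrinks at a suitable rate, to a weighted continuum functional; a k-nearest-neighbor graph is another common construction with analogous limits.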

Abstract of the colloquium
Machine learning meets calculus of variations
Modern data-acquisition technology produces a wealth of data about the world we live in. The goal of machine learning is to extract and interpret the information these data sets contain. This leads to a variety of learning tasks, many of which seek to optimize a functional defined on the available random sample.
The functionals take the available data samples as input, yet we seek to draw conclusions about the true distribution of the data. To compare the outcomes based on finite data with the ideal outcomes one would obtain if full information were available, we study the asymptotic properties of discrete optimization problems based on finite random samples. We will discuss how the calculus of variations and partial differential equations provide tools to compare the discrete and continuum descriptions for many relevant problems. Furthermore, we will discuss how insights from analysis can be used to guide the design of the functionals used in machine learning.
