

Leçons Jacques-Louis Lions 2017 - Emmanuel Candès

14 March 2017 @ 11:30 am - 17 March 2017 @ 3:00 pm

Click here for the jpg version (0.4 MB) of the poster for the Leçons Jacques-Louis Lions 2017 (Emmanuel Candès)
Click here for the pdf version (16 MB) of the poster for the Leçons Jacques-Louis Lions 2017 (Emmanuel Candès)

Given by Emmanuel Candès (Stanford University) from 14 to 17 March 2017, the Leçons Jacques-Louis Lions 2017 consisted of

— a mini-course
Statistics for the big data era
3 sessions, Tuesday 14, Wednesday 15 and Thursday 16 March 2017, from 11:30 am to 1:00 pm,
Jussieu Campus, Université Pierre et Marie Curie, 4 place Jussieu, Paris 5ème.
(pdf of Mini-course 1 – 5.5 MB)
(pdf of Mini-course 2 – 9.5 MB)
(pdf of Mini-course 3 – 3.5 MB)

The first session (Tuesday 14 March, 11:30 am to 1:00 pm) took place in lecture hall 56 B
The second session (Wednesday 15 March, 11:30 am to 1:00 pm) took place in lecture hall 55 A
The third session (Thursday 16 March, 11:30 am to 1:00 pm) took place in the seminar room of the Laboratoire Jacques-Louis Lions (15-16-3-09) (video broadcast in room 15-16-1-01)

— and a colloquium
Around the reproducibility of scientific research in the big data era: what can statistics offer?
Friday 17 March 2017, from 2:00 pm to 3:00 pm, lecture hall 44.
(pdf of the colloquium – 14.5 MB)

Abstract of the mini-course
Statistics for the big data era
For a long time, science has operated as follows: a scientific theory can only be empirically tested, and only after it has been advanced. Predictions are deduced from the theory and compared with the results of decisive experiments so that they can be falsified or corroborated. This principle, formulated by Karl Popper and operationalized by Ronald Fisher, has guided the development of scientific research and statistics for nearly a century. We have, however, entered a new world where large data sets are available prior to the formulation of scientific theories. Researchers mine these data relentlessly in search of new discoveries, and it has been observed that we have run into the problem of irreproducibility. Consider the April 23, 2013 Nature editorial: “Over the past year, Nature has published a string of articles that highlight failures in the reliability and reproducibility of published research.” The field of statistics needs to re-invent itself to adapt to the new reality where scientific hypotheses/theories are generated by data snooping. We will make the case that statistical science is taking on this great challenge and discuss exciting achievements.

Abstract of the colloquium
Around the reproducibility of scientific research in the big data era: what can statistics offer?
The big data era has created a new scientific paradigm: collect data first, ask questions later. When the universe of scientific hypotheses being examined simultaneously is not taken into account, inferences are likely to be false. The consequence is that follow-up studies are likely to be unable to reproduce earlier reported findings or discoveries. This reproducibility failure bears a substantial cost, and this talk is about new statistical tools to address this issue. In the last two decades, statisticians have developed many techniques for addressing this look-everywhere effect, whose proper use would help alleviate the problems discussed above. This lecture will discuss some of these proposed solutions, including the Benjamini-Hochberg procedure for false discovery rate (FDR) control and the knockoff filter, a method which reliably selects which of the many potentially explanatory variables of interest (e.g. the presence or absence of a mutation) are indeed truly associated with the response under study (e.g. the log-fold increase in HIV drug resistance).
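
To make the multiple-testing point concrete, here is a minimal Python sketch of the Benjamini-Hochberg procedure for FDR control; the data, the target level q = 0.10, and the function name are illustrative assumptions, not the implementation discussed in the lecture.

import numpy as np

def benjamini_hochberg(p_values, q=0.10):
    # Sort the m p-values, find the largest k such that p_(k) <= k*q/m,
    # and reject the hypotheses with the k smallest p-values.
    p = np.asarray(p_values, dtype=float)
    m = p.size
    order = np.argsort(p)
    below = p[order] <= q * np.arange(1, m + 1) / m
    rejected = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()   # last sorted index under the BH line
        rejected[order[:k + 1]] = True   # reject the k+1 smallest p-values
    return rejected

# Hypothetical example: 950 null hypotheses and 50 true signals.
rng = np.random.default_rng(0)
p = np.concatenate([rng.uniform(size=50) * 1e-3,   # small p-values (true effects)
                    rng.uniform(size=950)])        # uniform p-values (nulls)
print(benjamini_hochberg(p, q=0.10).sum(), "discoveries at FDR level 0.10")

Under independence of the p-values, this rule controls the expected proportion of false discoveries among the rejections at level q; the knockoff filter mentioned above addresses the related variable-selection problem with FDR control without requiring p-values.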
