Anthony Christopher Davison

Anthony Davison has published on a wide range of topics in statistical theory and methods, and on environmental, biological and financial applications. His main research interests are statistics of extremes, likelihood asymptotics, bootstrap and other resampling methods, and statistical modelling, currently with a particular focus on the first.

Statistics of extremes concerns rare events such as storms, high winds and tides, extreme pollution episodes, sporting records, and the like. The subject has a long history, but under the impact of engineering and environmental problems it has been an area of intense development over the past 20 years. Davison's PhD work was in this area, in a joint project between the Departments of Mathematics and Mechanical Engineering at Imperial College, with the aim of modelling potential high exposures to radioactivity due to releases from nuclear installations. The key tools, developed jointly with Richard Smith, were regression models for exceedances over high thresholds, which generalized earlier work by hydrologists and formed the basis of some important later developments. This has led to an ongoing interest in extremes, and in particular their application to environmental and financial data. A major current interest is the development of suitable methods for modelling rare spatio-temporal events, particularly but not only in the context of climate change.

Likelihood asymptotics too have undergone very substantial development since 1980. Key tools here have been saddlepoint and related approximations, which can give remarkably accurate approximate distribution and density functions even for very small sample sizes. These approximations can be used for wide classes of parametric models, but also for certain bootstrap and resampling problems. The literature on these methods can seem arcane, but they are potentially widely applicable, and Davison wrote a book jointly with Nancy Reid and Alessandra Brazzale intended to promote their use in applications.

Bootstrap methods are now used in many areas of application, where they can provide a researcher with accurate inferences tailor-made to the data available, rather than relying on large-sample or other approximations of doubtful validity. The key idea is to replace analytical calculations of biases, variances, confidence and prediction intervals, and other measures of uncertainty with computer simulation from a suitable statistical model. In a nonparametric situation this model consists of the data themselves, and the simulation simply involves resampling from the existing data, while in a parametric case it involves simulation from a suitable parametric model; a short illustrative sketch appears at the end of this entry. There is a wide range of possibilities between these extremes, and the book by Davison and Hinkley explores them for many data examples, with the aim of showing how and when resampling methods succeed and why they can fail.

He was Editor of Biometrika (2008-2017), Joint Editor of the Journal of the Royal Statistical Society, Series B (2000-2003), Editor of the IMS Lecture Notes Monograph Series (2007), Associate Editor of Biometrika (1987-1999), and Associate Editor of the Brazilian Journal of Probability and Statistics (1987-2006). He is currently on the editorial board of the Annual Review of Statistics and Its Application. He has served on committees of the Royal Statistical Society and of the Institute of Mathematical Statistics.
He is an elected Fellow of the American Statistical Association and of the Institute of Mathematical Statistics, an elected member of the International Statistical Institute, and a Chartered Statistician. In 2009 he was awarded a laurea honoris causa in Statistical Science by the University of Padova, in 2011 he held a Francqui Chair at Hasselt University, and in 2012 he was Mitchell Lecturer at the University of Glasgow. In 2015 he received the Guy Medal in Silver of the Royal Statistical Society and in 2018 was a Medallion Lecturer of the Institute of Mathematical Statistics.
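As an illustration of the nonparametric resampling idea described above, the following short Python sketch estimates the bias, standard error, and a percentile confidence interval for the mean of a small sample by resampling the data themselves. It is a minimal, generic sketch under assumed hypothetical data and a hypothetical choice of statistic, not code taken from Davison and Hinkley's book.

```python
# Minimal sketch of the nonparametric bootstrap: the "model" is the observed
# sample itself, and measures of uncertainty are obtained by resampling from it
# rather than by analytical approximation.
# (Illustrative only; the sample values and the choice of the mean as the
# statistic are hypothetical, not taken from the biography.)
import numpy as np

rng = np.random.default_rng(seed=1)

# A small observed sample (hypothetical data).
data = np.array([3.1, 2.4, 5.9, 4.2, 3.8, 6.1, 2.9, 4.7, 5.2, 3.5])

def statistic(x):
    return x.mean()  # any statistic of interest could be used here

B = 2000  # number of bootstrap resamples
boot = np.empty(B)
for b in range(B):
    resample = rng.choice(data, size=data.size, replace=True)
    boot[b] = statistic(resample)

estimate = statistic(data)
bias = boot.mean() - estimate                 # bootstrap estimate of bias
std_error = boot.std(ddof=1)                  # bootstrap standard error
ci_lower, ci_upper = np.percentile(boot, [2.5, 97.5])  # simple percentile interval

print(f"estimate = {estimate:.3f}, bias = {bias:.3f}, SE = {std_error:.3f}")
print(f"95% percentile interval: ({ci_lower:.3f}, {ci_upper:.3f})")
```

In a parametric variant, the resampling line would instead simulate new samples from a fitted parametric model; the rest of the calculation is unchanged.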
Bernard Moret

Bernard M.E. Moret was born in Vevey, Switzerland, received baccalauréats in Latin-Greek and Latin-Mathematics, then completed a Diploma in Electrical Engineering at EPFL. After working for two years for Omega and Swiss Timing on the development of real-time operating systems for sports applications, he left for the US. He received his PhD in Electrical Engineering from the University of Tennessee in 1980 and joined the Department of Computer Science at the University of New Mexico (UNM) that fall. He served as Chairman of the department from 1991 to 1993 and eventually retired in summer 2006 to join the School of Computer and Communication Sciences at EPFL. (You can read about his work at UNM on his archived personal and laboratory web pages at UNM.) He was appointed group leader for phylogenetics at the Swiss Institute of Bioinformatics (SIB). From 2009 until his retirement, he was also in charge of the BS and MS programs in Computer Science and Associate Dean for Education.

He founded the ACM Journal of Experimental Algorithmics (JEA) and served as its Editor-in-Chief for 7 years; he also helped found the IEEE/ACM Transactions on Computational Biology and Bioinformatics (TCBB), where he served as Associate Editor until 2008. He founded the annual Workshop on Algorithms in Bioinformatics (WABI) and chairs its steering committee, and he serves on the steering committee of the Workshop on Algorithm Engineering and Experiments (ALENEX). Until summer 2008, he chaired the Biodata Management and Analysis (BDMA) study section of the US National Institutes of Health (NIH); he is now a charter member of the NIH College of Reviewers. He led a team of over 50 biologists, computer scientists, and mathematicians in the CIPRES (Cyber Infrastructure for Phylogenetic Research) project, funded by the US National Science Foundation (NSF) for US$ 12 million over 5 years. He has published nearly 150 papers in computational biology, under funding from the US NSF, the Alfred P. Sloan Foundation, the IBM Corporation, the US NIH, the Swiss NSF, and SystemsX.ch. He is a Fellow of the ISCB (International Society for Computational Biology). His Erdős number is 2 and (as of 2020) his h-index is 48.
Pierre Dillenbourg

A former elementary-school teacher, Pierre Dillenbourg graduated in educational science (University of Mons, Belgium). He started his research on learning technologies in 1984, and in 1986 he was one of the first in the world to apply machine learning to develop a self-improving teaching system. He obtained a PhD in computer science from the University of Lancaster (UK) in the domain of artificial intelligence applications for education. He was an assistant professor at the University of Geneva and joined EPFL in 2002. He has been the director of the Center for Research and Support on Learning and its Technologies, then academic director of the Center for Digital Education, which implements the MOOC strategy of EPFL (over 2 million registrations). He is a full professor in learning technologies in the School of Computer & Communication Sciences, where he heads the CHILI Lab: "Computer-Human Interaction for Learning & Instruction". He is the director of the leading house DUAL-T, which develops technologies for dual vocational education systems (carpenters, florists, ...). With EPFL colleagues, he launched the Swiss EdTech Collider in 2017, an incubator with 80 start-ups in learning technologies. He has (co-)founded four start-ups, carries out consulting missions in the corporate world, and has joined the boards of several companies and institutions. In 2018, he co-founded LEARN, the EPFL Center of Learning Sciences, which brings together the local initiatives in educational innovation. He is a fellow of the International Society of the Learning Sciences. He is currently the Associate Vice-President for Education at EPFL.