Anthony Christopher Davison

Anthony Davison has published on a wide range of topics in statistical theory and methods, and on environmental, biological and financial applications. His main research interests are statistics of extremes, likelihood asymptotics, bootstrap and other resampling methods, and statistical modelling, currently with a particular focus on the first of these.

Statistics of extremes concerns rare events such as storms, high winds and tides, extreme pollution episodes, sporting records, and the like. The subject has a long history, but under the impact of engineering and environmental problems it has been an area of intense development in the past 20 years. Davison's PhD work was in this area, in a joint project between the Departments of Mathematics and Mechanical Engineering at Imperial College, with the aim of modelling potential high exposures to radioactivity due to releases from nuclear installations. The key tools developed, jointly with Richard Smith, were regression models for exceedances over high thresholds, which generalized earlier work by hydrologists and formed the basis of some important later developments. This has led to an ongoing interest in extremes, and in particular their application to environmental and financial data. A major current interest is the development of suitable methods for modelling rare spatio-temporal events, particularly but not only in the context of climate change.

Likelihood asymptotics too have undergone very substantial development since 1980. Key tools here have been saddlepoint and related approximations, which can give remarkably accurate approximate distribution and density functions even for very small sample sizes. These approximations can be used for wide classes of parametric models, but also for certain bootstrap and resampling problems.
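The peaks-over-threshold idea behind the exceedance models mentioned above can be sketched in a few lines. The synthetic data, the threshold choice and the simple moment estimators below are purely illustrative and are not taken from Davison's work:

```python
import random
import statistics

# Toy peaks-over-threshold analysis. The synthetic series has exponential
# tails, so the true generalized Pareto shape parameter xi is 0.
random.seed(42)
data = [random.expovariate(1.0) for _ in range(5000)]

u = sorted(data)[int(0.95 * len(data))]    # high threshold: empirical 95% quantile
excesses = [x - u for x in data if x > u]  # exceedances over the threshold

# Moment estimators for the GPD(sigma, xi): mean = sigma/(1-xi) and
# var = sigma^2 / ((1-xi)^2 (1-2*xi)), so mean^2/var = 1 - 2*xi.
m = statistics.mean(excesses)
v = statistics.variance(excesses)
xi = 0.5 * (1.0 - m * m / v)               # shape (tail index) estimate
sigma = 0.5 * m * (1.0 + m * m / v)        # scale estimate
```

In practice maximum likelihood, rather than these moment estimators, is the usual fitting method, and the threshold is chosen by diagnostic plots rather than a fixed quantile.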
The literature on these methods can seem arcane, but they are potentially widely applicable, and Davison wrote a book jointly with Nancy Reid and Alessandra Brazzale intended to promote their use in applications.

Bootstrap methods are now used in many areas of application, where they can provide a researcher with accurate inferences tailor-made to the data available, rather than relying on large-sample or other approximations of doubtful validity. The key idea is to replace analytical calculations of biases, variances, confidence and prediction intervals, and other measures of uncertainty with computer simulation from a suitable statistical model. In a nonparametric situation this model consists of the data themselves, and the simulation simply involves resampling from the existing data, while in a parametric case it involves simulation from a suitable parametric model. There is a wide range of possibilities between these extremes, and the book by Davison and Hinkley explores these for many data examples, with the aim of showing how and when resampling methods succeed and why they can fail.

He was Editor of Biometrika (2008-2017), Joint Editor of the Journal of the Royal Statistical Society, Series B (2000-2003), Editor of the IMS Lecture Notes Monograph Series (2007), Associate Editor of Biometrika (1987-1999), and Associate Editor of the Brazilian Journal of Probability and Statistics (1987-2006). Currently he is on the editorial board of the Annual Review of Statistics and Its Application. He has served on committees of the Royal Statistical Society and of the Institute of Mathematical Statistics. He is an elected Fellow of the American Statistical Association and of the Institute of Mathematical Statistics, an elected member of the International Statistical Institute, and a Chartered Statistician.
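The nonparametric bootstrap described above can be sketched very compactly. The data and the choice of the median as the statistic of interest are invented for illustration and are not taken from Davison and Hinkley's book:

```python
import random
import statistics

# Nonparametric bootstrap of the sample median (illustrative data only).
random.seed(1)
data = [2.1, 3.4, 1.9, 5.6, 4.2, 3.3, 2.8, 6.1, 3.9, 4.7]

B = 2000                                          # number of bootstrap resamples
boot_medians = []
for _ in range(B):
    resample = random.choices(data, k=len(data))  # resample WITH replacement
    boot_medians.append(statistics.median(resample))

se = statistics.stdev(boot_medians)               # bootstrap standard error
boot_medians.sort()
# Simple percentile confidence interval for the median
ci = (boot_medians[int(0.025 * B)], boot_medians[int(0.975 * B)])
```

A parametric bootstrap would replace `random.choices(data, ...)` with simulation from a model fitted to the data; more refined intervals (BCa, studentized) improve on the plain percentile method used here.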
In 2009 he was awarded a laurea honoris causa in Statistical Science by the University of Padova, in 2011 he held a Francqui Chair at Hasselt University, and in 2012 he was Mitchell Lecturer at the University of Glasgow. In 2015 he received the Guy Medal in Silver of the Royal Statistical Society and in 2018 was a Medallion Lecturer of the Institute of Mathematical Statistics.
Jean-Yves Le Boudec

Jean-Yves Le Boudec is a full professor at EPFL and a Fellow of the IEEE. He graduated from the Ecole Normale Supérieure de Saint-Cloud, Paris, where he obtained the Agrégation in Mathematics in 1980 (rank 4), and received his doctorate in 1984 from the University of Rennes, France. From 1984 to 1987 he was with INSA/IRISA, Rennes. In 1987 he joined Bell Northern Research, Ottawa, Canada, as a member of scientific staff in the Network and Product Traffic Design Department. In 1988 he joined the IBM Zurich Research Laboratory, where he was manager of the Customer Premises Network Department. In 1994 he joined EPFL as an associate professor.

His interests are in the performance and architecture of communication systems. In 1984, he developed analytical models of multiprocessor, multiple-bus computers. In 1990 he invented the concept called "MAC emulation", which later became the ATM Forum LAN emulation project, and developed the first ATM control point based on OSPF. He also launched public-domain software for the interworking of ATM and TCP/IP under Linux. In 1998 he proposed the first solution to the failure propagation that arises from common infrastructures in the Internet. He contributed to network calculus, a set of developments that forms a foundation for many traffic control concepts in the Internet.

He earned the Infocom 2005 Best Paper Award, with Milan Vojnovic, for elucidating the perfect simulation and stationarity of mobility models; the 2008 IEEE Communications Society William R. Bennett Prize in the Field of Communications Networking, with Bozidar Radunovic, for the analysis of max-min fairness; and the 2009 ACM Sigmetrics Best Paper Award, with Augustin Chaintreau and Nikodin Ristanovic, for the mean field analysis of the age of information in gossiping protocols.
He is or has been on the program committee or editorial board of many conferences and journals, including Sigcomm, Sigmetrics, Infocom, Performance Evaluation and IEEE/ACM Transactions on Networking. He co-authored the book "Network Calculus" (2001) with Patrick Thiran and is the author of the book "Performance Evaluation of Computer and Communication Systems" (2010).
Michele Ceriotti

Michele Ceriotti received his Ph.D. in Physics from ETH Zürich in 2010. He spent three years in Oxford as a Junior Research Fellow at Merton College. Since 2013 he has led the Laboratory for Computational Science and Modeling in the Institute of Materials at EPFL. His research revolves around the atomic-scale modelling of materials, based on the sampling of quantum and thermal fluctuations and on the use of machine learning to predict and rationalize structure-property relations. He has been awarded the IBM Research Forschungspreis in 2010, the Volker Heine Young Investigator Award in 2013, an ERC Starting Grant in 2016, and the IUPAP C10 Young Scientist Prize in 2018.
François Maréchal

PhD in engineering; chemical process engineer.
Researcher and lecturer in the field of computer-aided process and energy systems engineering.
Lecturer in mechanical engineering, electrical engineering and environmental sciences engineering at EPFL.
I'm responsible for the Minor in Energy at EPFL and I'm involved in three projects of the Competence Center in Energy and Mobility (2nd-generation biofuel, wood SOFC, and gas turbine development with CO2 mitigation), in which I'm contributing to the energy conversion system design and optimisation.
Short summary of my scientific career
After graduating in chemical engineering from the University of Liège, I obtained a PhD from the University of Liège in the LASSC laboratory of Prof. Kalitventzeff (former president of the European Working Party on Computer Aided Process Engineering). This laboratory was one of the pioneering laboratories in the field of Computer Aided Process Engineering.
In the group of Professor Kalitventzeff, I worked on the development and application of data reconciliation, process modelling and optimisation techniques in the chemical process industry; my experience ranges from nuclear power stations to chemical plants. In the LASSC, I was responsible for developments in the field of rational use of energy in industry. My first research topic was the methodological development of process integration techniques, combining pinch-based methods and mathematical programming: for example, the design of multiperiod heat exchanger networks, or mixed-integer non-linear programming techniques for the optimal management of utility systems.

Confronted with applications in industry, my work then concentrated mainly on the optimal integration of utility systems, considering not only the energy requirements but also their cost and the energy conversion systems. I developed methods for analysing and integrating the utility system, steam networks, combustion (including waste fuel), gas turbines and other advanced energy conversion systems (cogeneration, refrigeration and heat). The techniques applied use operations research tools such as mixed-integer linear programming (MILP) and exergy analysis. In order to evaluate the results of the utility integration, a new graphical method for representing the integration of utility systems has been developed. By the use of MILP techniques, the method developed for utility integration has been extended to handle site-scale problems, to incorporate environmental constraints and to reduce water usage. This method (the Effect Modelling and Optimisation method) has been successfully applied in the chemical industry, the pulp and paper industry and power plants. Instead of focusing on academic problems, I mainly developed my research on the basis of industrial applications, which led to valuable and applicable patented results.
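As a toy illustration of the kind of utility-selection problem such MILP formulations address, the sketch below chooses on/off decisions for a few candidate units so as to meet a heat demand at minimum cost. All unit names, costs and capacities are invented, and the binary program is solved here by brute-force enumeration rather than by a real MILP solver:

```python
from itertools import product

# Candidate utility units: (name, fixed cost, unit cost per kWh, max heat in kW).
# These numbers are purely illustrative.
units = [("boiler",           100.0, 0.06, 500.0),
         ("gas_turbine_hrsg", 250.0, 0.04, 800.0),
         ("heat_pump",         80.0, 0.05, 300.0)]
demand = 900.0  # process heat demand, kW

best = None
for on in product([0, 1], repeat=len(units)):        # binary on/off decisions
    cap = sum(u[3] for u, y in zip(units, on) if y)
    if cap < demand:
        continue                                     # infeasible combination
    # Continuous (LP) part: dispatch the cheapest running units first.
    running = sorted((u for u, y in zip(units, on) if y), key=lambda u: u[2])
    cost = sum(u[1] for u, y in zip(units, on) if y) # fixed costs of units on
    left = demand
    for name, fix, var, q in running:
        load = min(q, left)
        cost += var * load
        left -= load
    if best is None or cost < best[0]:
        best = (cost, on)
```

A real formulation would hand the binary variables and linear dispatch constraints to an MILP solver, and would add heat-cascade (pinch) constraints linking the utilities to the process streams.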
More recently, the methods developed have been extended to the thermo-economic optimisation of integrated systems such as fuel cells. My present R&D work concerns the application of multi-objective optimisation strategies to the design of processes and integrated energy conversion systems.
Since 2001, I have been working in the Industrial Energy Systems Laboratory (LENI) of the Ecole Polytechnique Fédérale de Lausanne (EPFL), where I lead the R&D activities in the field of computer-aided analysis and design of industrial energy systems, with a major focus on sustainable energy conversion system development using thermo-economic optimisation methodologies. Apart from the application and development of process integration techniques, which remain my major field of expertise, the applications concern:
Rational use of water and energy in industrial processes and industrial production sites: projects with NESTLE, EDF, VEOLIA and Borregaard (pulp and paper).
Energy conversion and process design: biofuels from waste biomass (with GASNAT, EGO and PSI), water desalination and waste water treatment plants (VEOLIA), power plant design (ALSTOM), and energy conversion from geothermal sources (BFE).
Integrated energy systems in urban areas: together with SCANE and SIG (GE) and IEA Annex 42 for micro-cogeneration systems.
I also contributed to the definition of the 2000-Watt society and to studies concerning the emergence of green technologies on the market in the framework of the Alliance for Global Sustainability.
Devis Tuia

I come from Ticino and studied in Lausanne, between UNIL and EPFL. After my PhD at UNIL in remote sensing, I was a postdoc in Valencia (Spain), Boulder (CO) and at EPFL, working on model adaptation and prior knowledge integration in machine learning. In 2014 I became Research Assistant Professor at the University of Zurich, where I started the 'multimodal remote sensing' group. In 2017, I joined Wageningen University (NL), where I was professor in the GeoInformation Science and Remote Sensing Laboratory. In 2020, I joined EPFL Valais to start the ECEO lab, working at the interface between Earth observation, machine learning and environmental sciences.
Olivier Schneider

After defending his thesis in particle physics in 1989 at the University of Lausanne, Olivier Schneider joined LBL, the Lawrence Berkeley Laboratory (California), to work on the CDF experiment at the Tevatron at Fermilab (Illinois), first as a research fellow supported by the Swiss National Science Foundation, and later as a post-doc at LBL. He participated in the construction and commissioning of the first silicon vertex detector to operate successfully at a hadron collider; this detector enabled the discovery of the sixth quark, named "top". In 1994, he came back to Europe and joined the ALEPH experiment at CERN's Large Electron-Positron Collider, as a CERN fellow and then as CERN scientific staff, specializing in heavy flavour physics. In 1998, he became associate professor at the University of Lausanne, then extraordinary professor at the Swiss Federal Institute of Technology in Lausanne (EPFL) in 2003, and finally full professor at EPFL in 2010. Having worked since 1997 on the preparation of the LHCb experiment at CERN's Large Hadron Collider, which started operation in 2009, he is now analyzing the first data. Since 2001 he has also contributed to the exploitation of the data recorded by the Belle experiment (KEK laboratory, Tsukuba, Japan). These two experiments study mainly the decays of hadrons containing a b quark, as well as CP violation, i.e. the non-invariance under the symmetry between matter and antimatter.