Wulfram Gerstner is Director of the Laboratory of Computational Neuroscience (LCN) at EPFL. His research in computational neuroscience concentrates on models of spiking neurons and spike-timing-dependent plasticity, on the problem of neuronal coding in single neurons and populations, and on the link between biologically plausible learning rules and behavioral manifestations of learning. He teaches courses for physicists, computer scientists, mathematicians, and life scientists at EPFL. After studies of physics in Tübingen and at the Ludwig-Maximilians-University Munich (Master 1989), Wulfram Gerstner spent a year as a visiting researcher in Berkeley. He received his PhD in theoretical physics from the Technical University of Munich in 1993 with a thesis on associative memory and dynamics in networks of spiking neurons. After short postdoctoral stays at Brandeis University and the Technical University of Munich, he joined EPFL in 1996 as an assistant professor. Promoted to Associate Professor with tenure in February 2001, he has been, since August 2006, a full professor with a double appointment in the School of Computer and Communication Sciences and the School of Life Sciences. Wulfram Gerstner has been an invited speaker at numerous international conferences and workshops. He has served on the editorial boards of the Journal of Neuroscience, Network: Computation in Neural Systems, the Journal of Computational Neuroscience, and Science.
I am a Senior Researcher at EPFL-CVLab, and, since May 2020, an Artificial Intelligence Engineer at ClearSpace (50%). Previously, I was a Senior Researcher and Research Leader in NICTA's computer vision research group. Prior to this, from Sept. 2010 to Jan 2012, I was a Research Assistant Professor at TTI-Chicago, and, from Feb. 2009 to Aug. 2010, a postdoctoral fellow at ICSI and EECS at UC Berkeley under the supervision of Prof. Trevor Darrell. I obtained my PhD in Jan. 2009 from EPFL under the supervision of Prof. Pascal Fua.
Martin Jaggi is a Tenure Track Assistant Professor at EPFL, heading the Machine Learning and Optimization Laboratory. Before that, he was a post-doctoral researcher at ETH Zurich, at the Simons Institute in Berkeley, and at École Polytechnique in Paris. He earned his PhD in Machine Learning and Optimization from ETH Zurich in 2011, and an MSc in Mathematics, also from ETH Zurich.
Anthony Davison has published on a wide range of topics in statistical theory and methods, and on environmental, biological and financial applications. His main research interests are statistics of extremes, likelihood asymptotics, bootstrap and other resampling methods, and statistical modelling, currently with a particular focus on the first of these. Statistics of extremes concerns rare events such as storms, high winds and tides, extreme pollution episodes, sporting records, and the like. The subject has a long history, but under the impact of engineering and environmental problems it has been an area of intense development in the past 20 years. Davison's PhD work was in this area, in a joint project between the Departments of Mathematics and Mechanical Engineering at Imperial College, with the aim of modelling potential high exposures to radioactivity due to releases from nuclear installations. The key tools developed, jointly with Richard Smith, were regression models for exceedances over high thresholds, which generalized earlier work by hydrologists and formed the basis of some important later developments. This has led to an ongoing interest in extremes, and in particular in their application to environmental and financial data. A major current interest is the development of suitable methods for modelling rare spatio-temporal events, particularly but not only in the context of climate change. Likelihood asymptotics too have undergone very substantial development since 1980. Key tools here have been saddlepoint and related approximations, which can give remarkably accurate approximate distribution and density functions even for very small sample sizes. These approximations can be used for wide classes of parametric models, but also for certain bootstrap and resampling problems.
The literature on these methods can seem arcane, but they are potentially widely applicable, and Davison wrote a book jointly with Nancy Reid and Alessandra Brazzale intended to promote their use in applications. Bootstrap methods are now used in many areas of application, where they can provide a researcher with accurate inferences tailor-made to the data available, rather than relying on large-sample or other approximations of doubtful validity. The key idea is to replace analytical calculations of biases, variances, confidence and prediction intervals, and other measures of uncertainty with computer simulation from a suitable statistical model. In a nonparametric situation this model consists of the data themselves, and the simulation simply involves resampling from the existing data, while in a parametric case it involves simulation from a suitable parametric model. There is a wide range of possibilities between these extremes, and the book by Davison and Hinkley explores these for many data examples, with the aim of showing how and when resampling methods succeed and why they can fail. He was Editor of Biometrika (2008-2017), Joint Editor of the Journal of the Royal Statistical Society, Series B (2000-2003), Editor of the IMS Lecture Notes Monograph Series (2007), Associate Editor of Biometrika (1987-1999), and Associate Editor of the Brazilian Journal of Probability and Statistics (1987-2006). Currently he is on the editorial board of the Annual Review of Statistics and Its Application. He has served on committees of the Royal Statistical Society and of the Institute of Mathematical Statistics. He is an elected Fellow of the American Statistical Association and of the Institute of Mathematical Statistics, an elected member of the International Statistical Institute, and a Chartered Statistician.
In 2009 he was awarded a laurea honoris causa in Statistical Science by the University of Padova, in 2011 he held a Francqui Chair at Hasselt University, and in 2012 he was Mitchell Lecturer at the University of Glasgow. In 2015 he received the Guy Medal in Silver of the Royal Statistical Society and in 2018 was a Medallion Lecturer of the Institute of Mathematical Statistics.
Henry Markram started a dual scientific and medical career at the University of Cape Town in South Africa. His scientific work in the 1980s revealed the polymodal receptive fields of pontomedullary reticular formation neurons in vivo and how acetylcholine re-organized these sensory maps.
He moved to Israel in 1988 and obtained his PhD at the Weizmann Institute, where he discovered a link between acetylcholine and memory mechanisms: he was the first to show, in in vitro studies, that acetylcholine modulates the NMDA receptor and thereby gates which synapses can undergo synaptic plasticity. He was also the first to characterize the electrical and anatomical properties of the cholinergic neurons in the medial septum diagonal band.
He carried out a first postdoctoral study as a Fulbright Scholar at the NIH, on the biophysics of ion channels on synaptic vesicles, using sub-fractionation methods to isolate synaptic vesicles and patch-clamp recordings to characterize the ion channels. He carried out a second postdoctoral study at the Max Planck Institute, as a Minerva Fellow, where he discovered that individual action potentials propagating back into dendrites cause a pulsed influx of Ca2+ into the dendrites, and found that sub-threshold activity could also activate a low-threshold Ca2+ channel. He developed a model to show how different types of electrical activity can divert Ca2+ to activate different intracellular targets depending on the speed of Ca2+ influx, an insight that helps explain how Ca2+ acts as a universal second messenger. His best-known discovery is that of the millisecond watershed, marked by the back-propagating action potential, for judging the relevance of communication between neurons. This phenomenon is now called Spike-Timing-Dependent Plasticity (STDP); many laboratories around the world have subsequently found it in multiple brain regions, and many theoreticians have incorporated it as a learning rule. At the Max Planck Institute he also started exploring the micro-anatomical and physiological principles of the different neurons of the neocortex and of the mono-synaptic connections that they form - the first step towards a systematic reverse engineering of the neocortical microcircuitry, aimed at deriving the blueprints of the cortical column in a manner that would allow computer model reconstruction.
He received a tenure-track position at the Weizmann Institute, where he continued the reverse-engineering studies and also discovered a number of core principles of structural and functional organization, such as differential signaling onto different neurons, models of dynamic synapses (with Misha Tsodyks), the computational functions of dynamic synapses, and how GABAergic neurons map onto interneurons and pyramidal neurons. A major contribution during this period was his discovery of Redistribution of Synaptic Efficacy (RSE), in which he showed that co-activation of neurons alters not only synaptic strength but also the dynamics of transmission. At the Weizmann, he also found the tabula rasa principle, whereby structural connectivity between pyramidal neurons is random while functional connectivity is non-random due to target selection. Markram also developed, with Wolfgang Maass, a novel computational framework, called liquid computing or high-entropy computing, to account for the impact of multiple time constants in neurons and synapses on information processing.
In 2002, he was appointed Full Professor at EPFL, where he founded and directed the Brain Mind Institute. During this time Markram continued his reverse-engineering approaches and developed a series of new technologies to allow large-scale multi-neuron patch-clamp studies. Markram's lab discovered a novel microcircuit plasticity phenomenon in which connections are formed and eliminated in a Darwinian manner, as opposed to synapses being strengthened or weakened as found for LTP. This was the first demonstration that neural circuits are constantly being re-wired and that excitation can boost the rate of re-wiring.
At EPFL he also completed much of the reverse-engineering studies on the neocortical microcircuitry, revealing deeper insight into the circuit design, and built databases of the blueprint of the cortical column. In 2005 he used these databases to launch the Blue Brain Project. The BBP used IBM's most advanced supercomputers to reconstruct a detailed computer model of the neocortical column, composed of 10,000 neurons of more than 340 different types, distributed according to a layer-based recipe of composition and interconnected with 30 million synapses (of 6 different types) according to synaptic mapping recipes. The Blue Brain team built dozens of applications that now allow automated reconstruction, simulation, visualization, analysis and calibration of detailed microcircuits. With this proof of concept completed, Markram's lab has now set the agenda towards whole-brain and molecular modeling.
With an in-depth understanding of the neocortical microcircuit, Markram set a path to determine how the neocortex changes in autism. He found hyper-reactivity due to hyper-connectivity in the circuitry, and hyper-plasticity due to hyper-expression of NMDA receptors. Similar findings in the amygdala, together with behavioral evidence that the animal model of autism expressed hyper-fear, led to a novel theory of autism called the Intense World Syndrome, proposed by Henry and Kamila Markram. The Intense World Syndrome holds that the autistic brain is hyper-sensitive and hyper-plastic, which renders the world painfully intense and the brain overly autonomous. The theory is rapidly gaining recognition, and many new studies have extended the findings to other brain regions and to other models of autism.
Markram aims eventually to build detailed computer models of the brains of mammals, pioneering simulation-based research in neuroscience. Such a facility could serve to aggregate, integrate, unify and validate our knowledge of the brain, and could be used as a new tool to explore the emergence of intelligence and higher cognitive functions, as well as hypotheses about diseases and their treatments.
Volkan Cevher received the B.Sc. (valedictorian) in electrical engineering from Bilkent University in Ankara, Turkey, in 1999 and the Ph.D. in electrical and computer engineering from the Georgia Institute of Technology in Atlanta, GA in 2005. He was a Research Scientist with the University of Maryland, College Park from 2006 to 2007, and with Rice University in Houston, TX, from 2008 to 2009. Currently, he is an Associate Professor at the Swiss Federal Institute of Technology Lausanne (EPFL) and a Faculty Fellow in the Electrical and Computer Engineering Department at Rice University. His research interests include machine learning, signal processing theory, optimization theory and methods, and information theory. Dr. Cevher is an ELLIS fellow and was the recipient of a Google Faculty Research Award in 2018, the IEEE Signal Processing Society Best Paper Award in 2016, a Best Paper Award at CAMSAP in 2015, and a Best Paper Award at SPARS in 2009, as well as an ERC Consolidator Grant in 2016 and an ERC Starting Grant in 2011.