Publication

Learning new physics from an imperfect machine

Abstract

We show how to deal with uncertainties on the Standard Model predictions in an agnostic new physics search strategy that exploits artificial neural networks. Our approach builds directly on the specific Maximum Likelihood ratio treatment of uncertainties as nuisance parameters for hypothesis testing that is routinely employed in high-energy physics. After presenting the conceptual foundations of our method, we first illustrate all aspects of its implementation and extensively study its performance on a toy one-dimensional problem. We then show how to implement it in a multivariate setup by studying the impact of two typical sources of experimental uncertainties in two-body final states at the LHC.
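As a hedged sketch of the profile likelihood ratio treatment of nuisance parameters that the abstract refers to (the notation below is generic and illustrative, not taken from the paper): for a parameter of interest μ and nuisance parameters ν encoding the Standard Model uncertainties, the test statistic routinely used in high-energy physics can be written as

\[
  t_\mu \;=\; -2\,\ln \frac{\sup_{\nu}\,\mathcal{L}(\mu,\nu \mid \text{data})}{\sup_{\mu',\,\nu}\,\mathcal{L}(\mu',\nu \mid \text{data})},
\]

where the numerator profiles the nuisance parameters at fixed μ, the denominator maximizes over all parameters, and large values of t_μ disfavour the hypothesis μ.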

Related concepts (32)
Likelihood function
In statistical inference, the likelihood function quantifies the plausibility of parameter values characterizing a statistical model in light of observed data. Its most typical usage is to compare possible parameter values (under a fixed set of observations and a particular model), where higher values of likelihood are preferred because they correspond to more probable parameter values.
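As a hedged, generic illustration (notation not tied to the publication above): for independent observations x_1, …, x_n drawn from a density f(x | θ), the likelihood function and the maximum likelihood estimate are

\[
  \mathcal{L}(\theta \mid x_1,\dots,x_n) \;=\; \prod_{i=1}^{n} f(x_i \mid \theta),
  \qquad
  \hat{\theta} \;=\; \arg\max_{\theta}\,\mathcal{L}(\theta \mid x_1,\dots,x_n),
\]

so parameter values that make the observed data more probable receive higher likelihood.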
Statistical hypothesis testing
A statistical hypothesis test is a method of statistical inference used to decide whether the data at hand sufficiently support a particular hypothesis. Hypothesis testing allows us to make probabilistic statements about population parameters. While hypothesis testing was popularized early in the 20th century, early forms were used in the 1700s. The first use is credited to John Arbuthnot (1710), followed by Pierre-Simon Laplace (1770s), in analyzing the human sex ratio at birth.
Null hypothesis
In scientific research, the null hypothesis (often denoted H0) is the claim that no relationship exists between two sets of data or variables being analyzed. The null hypothesis is that any experimentally observed difference is due to chance alone, and an underlying causative relationship does not exist, hence the term "null". In addition to the null hypothesis, an alternative hypothesis is also developed, which claims that a relationship does exist between two variables.
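As a hedged, generic example (not drawn from the publication above): to test whether a coin is fair from n tosses yielding k heads, one may set

\[
  H_0:\ p = \tfrac{1}{2} \qquad \text{vs.} \qquad H_1:\ p \neq \tfrac{1}{2},
\]

and reject H_0 only if the observed k would be sufficiently improbable under H_0, for example if the two-sided p-value falls below a chosen significance level α.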
Related publications (34)

Learnable latent embeddings for joint behavioural and neural analysis

Mackenzie Mathis, Steffen Schneider, Jin Hwa Lee

Mapping behavioural actions to neural activity is a fundamental goal of neuroscience. As our ability to record large neural and behavioural data increases, there is growing interest in modelling neural dynamics during adaptive behaviours to probe neural re ...
Nature Portfolio, 2023

Partial Information Sharing Over Social Learning Networks

Ali H. Sayed, Virginia Bordignon

This work addresses the problem of sharing partial information within social learning strategies. In social learning, agents solve a distributed multiple hypothesis testing problem by performing two operations at each instant: first, agents incorporate inf ...
IEEE, 2023

Higher Order Asymptotics: Applications to Satellite Conjunction and Boundary Problems

Soumaya Elkantassi

Higher-order asymptotics provide accurate approximations for use in parametric statistical modelling. In this thesis, we investigate using higher-order approximations in two specific settings, with a particular emphasis on the tangent exponential model. Th ...
EPFL, 2023
Related MOOCs (4)
Neuronal Dynamics - Computational Neuroscience of Single Neurons
The activity of neurons in the brain and the code used by these neurons are described by mathematical neuron models at different levels of detail.
Selected Topics on Discrete Choice
Discrete choice models are used extensively in many disciplines where it is important to predict human behavior at a disaggregate level. This course is a follow-up to the online course “Introduction t
