
Bayesian probability

Summary

Bayesian probability (ˈbeɪziən or ˈbeɪʒən) is an interpretation of the concept of probability in which, instead of the frequency or propensity of some phenomenon, probability is interpreted as a reasonable expectation representing a state of knowledge, or as quantification of a personal belief.
The Bayesian interpretation of probability can be seen as an extension of propositional logic that enables reasoning with hypotheses; that is, with propositions whose truth or falsity is unknown. In the Bayesian view, a probability is assigned to a hypothesis, whereas under frequentist inference, a hypothesis is typically tested without being assigned a probability.
Bayesian probability belongs to the category of evidential probabilities: to evaluate the probability of a hypothesis, the Bayesian probabilist specifies a prior probability, which is then updated to a posterior probability in the light of new, relevant data (evidence). The Bayesian interpretation provides a standard set of procedures and formulae to perform this calculation.
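The prior-to-posterior update described above can be sketched numerically. The scenario below (a test's sensitivity, false-positive rate, and the prior on the hypothesis) is an illustrative assumption, not taken from this page:

```python
# Sketch of a Bayesian update: prior -> posterior via Bayes' theorem.

def posterior(prior, likelihood, likelihood_given_not_h):
    """P(H | E) = P(E | H) P(H) / P(E), with the evidence P(E)
    expanded by the law of total probability over H and not-H."""
    evidence = likelihood * prior + likelihood_given_not_h * (1.0 - prior)
    return likelihood * prior / evidence

# Illustrative example: a test with 90% sensitivity and a 5% false-positive
# rate, applied to a hypothesis with prior probability 0.1.
p = posterior(prior=0.1, likelihood=0.9, likelihood_given_not_h=0.05)
print(round(p, 3))  # 0.667
```

Even with a 90%-sensitive test, the posterior stays well below certainty because the prior was low; this dependence on the prior is exactly what distinguishes the Bayesian view sketched above.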

Official source

This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.


Related people (9)

Related publications (71)

Related units (8)

Related concepts (49)

Statistics

Statistics (from German: Statistik, "description of a state, a country") is the discipline that concerns the collection, organization, analysis, interpretation, and presentation of data.

Probability

Probability is the branch of mathematics concerning numerical descriptions of how likely an event is to occur, or how likely it is that a proposition is true. The probability of an event is a number between 0 and 1, where, roughly speaking, 0 indicates impossibility and 1 indicates certainty.

Bayesian inference

Bayesian inference (ˈbeɪziən or ˈbeɪʒən) is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available.

Related courses (47)

MGT-418: Convex optimization

This course introduces the theory and application of modern convex optimization from an engineering perspective.

MATH-517: Statistical computation and visualisation

The course will provide the opportunity to tackle real world problems requiring advanced computational skills and visualisation techniques to complement statistical thinking. Students will practice proposing efficient solutions, and effectively communicating the results with stakeholders.

PHYS-467: Machine learning for physicists

Machine learning and data analysis are becoming increasingly central in sciences including physics. In this course, fundamental principles and methods of machine learning will be introduced and practised.

Romain Essy Théo Gratier De Saint-Louis

In this thesis, we assess a new framework called UMIN on a data-driven optimization problem. Such problems recur in real life and can quickly become difficult to model when the input is high-dimensional, as with images for instance. From the architecture of aircraft to the design of proteins, a great number of different techniques have already been explored. Building on former solutions, this work introduces a brand new Bayesian approach that updates previous frameworks. Former model architectures use generative adversarial networks on one side and a forward model on the other side to improve the accuracy of the results. However, employing a Bayesian network allows us to leverage its uncertainty estimates to enhance the accuracy of the results and also to reduce unrealistic samples output by the generator. By creating new experiments on a modern MNIST dataset and by reproducing former works taken as a baseline, we show that the framework introduced in this work outperforms the previous method. The whole code is available at the following url: https://github.com/RomainGratier/Black-box_Optimization_via_Deep_Generative-Exploratory_Networks.

2020

This thesis is a contribution to financial statistics. One of the principal concerns of investors is the evaluation of portfolio risk. The notion of risk is vague, but in finance it is always linked to possible losses. In this thesis, we present some measures allowing the valuation of risk with the help of Bayesian methods. An exploratory analysis of data is presented to describe the sampling properties of financial time series. This analysis allows us to understand the origins of the daily returns studied in this thesis. Moreover, a discussion of different models is presented. These models make strong assumptions on investor behaviour, which are not always satisfied. This exploratory analysis shows some differences between the behaviour anticipated under equilibrium models, and that of real data. The Bayesian approach has been chosen because it allows one to incorporate all the variability, in particular that associated with model choice. The models studied in this thesis allow one to take heteroskedasticity into account, as well as particular shapes of the tails of returns. ARCH type models and models based on extreme value theory are studied. One original aspect of this thesis is its use of Bayesian analysis to detect change points in financial time series. We suppose that a market has two phases, and that it switches from one state to the other at random. Another new contribution is a model integrating heteroskedasticity and time dependence of extreme values, by superposition of the model proposed by Bortot and Coles (2003) and a GARCH process. This thesis uses simulation intensively for the estimation of risk measures. The drawback of simulation is the amount of time needed to obtain accurate estimates. However, simulation allows one to produce results when direct calculation is not feasible. For example, simulation allows one to compute risk estimates for time horizons greater than one day.
The methods presented in this thesis are illustrated on simulated data, and on real data from European and American markets. This thesis involved the construction of a library containing C and S code to perform risk analysis using GARCH and extreme value theory models. The results show that model uncertainty can be incorporated, and that risk measures for time horizons greater than one day can be obtained by simulation. The methods presented in this thesis have a natural representation involving conditioning. Thus, they permit the computation of both conditional and unconditional risk estimates. Three methods are described: the GARCH method; the two-state GARCH method; and the HBC method. Unconditional risk estimation using the GARCH method is satisfactory on data which seem stationary, but not reliable on data which are non-stationary, such as data with change points. The two-state GARCH model does a little better, but gives very satisfactory results when the risk is estimated conditionally on time. The HBC method does not give satisfactory results.
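The simulation-based risk estimation described in this abstract can be illustrated with a minimal sketch: Monte Carlo estimation of a multi-day Value-at-Risk under a GARCH(1,1) model with Gaussian innovations. The thesis used a library of C and S code; this Python sketch, and every parameter value in it (`omega`, `alpha`, `beta`, the horizon, the confidence level), is an illustrative assumption, not the thesis's method or data:

```python
import math
import random

def simulate_garch_var(mu=0.0, omega=1e-6, alpha=0.05, beta=0.9,
                       horizon=10, n_paths=20000, level=0.99, seed=0):
    """Monte Carlo estimate of the `level` Value-at-Risk of the cumulative
    return over `horizon` days, under a GARCH(1,1) volatility process with
    Gaussian innovations. All parameter values are illustrative."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n_paths):
        var_t = omega / (1 - alpha - beta)  # start at the unconditional variance
        cum_ret = 0.0
        for _ in range(horizon):
            z = rng.gauss(0.0, 1.0)
            r = mu + math.sqrt(var_t) * z
            cum_ret += r
            # GARCH(1,1) variance recursion
            var_t = omega + alpha * (r - mu) ** 2 + beta * var_t
        losses.append(-cum_ret)
    losses.sort()
    return losses[int(level * n_paths)]  # empirical quantile of the loss

var_10d = simulate_garch_var()
```

The accuracy/cost trade-off the abstract mentions shows up directly in `n_paths`: the empirical quantile converges slowly, which is why long-horizon risk estimates by simulation are time-consuming.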

The present work is related to recent research topics in hydrology devoted to the integration of field knowledge into hydrological modelling. The study catchment is the Haute-Mentue experimental basin (12.5 km2) located in western Switzerland, in the Plateau region. In order to complete the existing knowledge about the hydrological behaviour of the study catchment, a field experimental approach has been conducted at two scales: catchment (environmental tracing) and local scale (TDR soil moisture measurements). The environmental tracing application led to the same conclusion as previous research: hydrological behaviour is strongly influenced by the catchment antecedent conditions and by the rainfall duration and intensity. The geological characteristics (moraine or molasse) explain the main differences in hydrological behaviour observed so far. As environmental tracing does not allow easy identification of the mechanisms responsible for runoff generation, TDR equipment was installed on two hillslopes with different geological characteristics, which allowed monitoring of the soil moisture at different depths along the hillslope during two intensive campaigns in 2002 and 2003. Combining environmental tracing with the TDR technique finally allowed the conceptual model of two head sub-catchments of the Haute-Mentue catchment to be refined. The second part of the research is devoted to hydrological modelling. A simple conceptual model (TOPMODEL) has been considered an appropriate representation of the hydrological processes on the Haute-Mentue catchment. In order to estimate TOPMODEL parameters and to take into account the uncertainty associated with estimated parameters and model output, a Bayesian approach has been proposed and two Bayesian techniques have been compared: GLUE (Generalized Likelihood Uncertainty Estimation) and MCMC (Markov chain Monte Carlo).
The role of the statistical corrections on the resulting parameter and model output uncertainty has been assessed. In the last part of the present research, the Bayesian methodology has been extended to the case of multi-response calibration. Previously acquired field knowledge (i.e. soil storage saturation deficit, stream water silica and calcium concentrations) has been used to constrain the parametrization of the classical and of a modified version of TOPMODEL. In both cases, multi-calibration led to trade-off behaviour of the efficiencies of the simulated responses. The total modelling uncertainty of the newly introduced responses was considerably reduced at the expense of an increase in the total modelling uncertainty of the simulated discharges.
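The abstract above compares GLUE with MCMC for sampling the posterior of TOPMODEL parameters. The basic building block of such an MCMC calibration, a random-walk Metropolis sampler, can be sketched as follows; the Gaussian toy target and all tuning values are illustrative assumptions, not the thesis's actual likelihood or parameters:

```python
import math
import random

def metropolis(log_post, x0, step=0.5, n=5000, seed=1):
    """Random-walk Metropolis sampler: draws from a distribution known only
    up to a normalizing constant through its log-density `log_post`."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)        # symmetric proposal
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept with prob min(1, ratio)
            x, lp = prop, lp_prop
        samples.append(x)                      # rejected moves repeat the state
    return samples

# Toy target: a standard normal posterior (illustrative only).
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0)
post_mean = sum(draws) / len(draws)
```

In a calibration setting, `log_post` would combine the prior on the model parameters with the likelihood of the observed responses; the chain's samples then quantify the parameter and predictive uncertainty the abstract discusses.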

Related lectures (68)