
Lecture: Bayesian Estimation

Description

This lecture covers the fundamentals of Bayesian estimation, focusing on the application of Bayes' Theorem in scalar estimation. Topics include Gaussian distributions, posterior probability, and normalization. The lecture also discusses evidence and point estimation.



In course

PHYS-512: Statistical physics of computation

This course covers the statistical physics approach to computer science problems, ranging from graph theory and constraint satisfaction to inference and machine learning. In particular, the replica and …

Related concepts (84)

Bayes' theorem

In probability theory and statistics, Bayes' theorem (pronounced /beɪz/; alternatively Bayes' law or Bayes' rule, and occasionally Bayes's theorem), named after Thomas Bayes, describes the probability of an event based on prior knowledge of conditions that might be related to the event. For example, if the risk of developing health problems is known to increase with age, Bayes' theorem allows the risk to an individual of a known age to be assessed more accurately by conditioning it on their age, rather than simply assuming that the individual is typical of the population as a whole.
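As a minimal numeric sketch of the age/risk example above, the posterior can be computed directly from Bayes' theorem. All probabilities here are made-up illustrative values, not figures from the lecture:

```python
def posterior(p_b_given_a, p_a, p_b):
    """Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Illustrative (assumed) numbers for the age/risk example:
p_risk = 0.01              # P(risk): prior prevalence of the health problem
p_old_given_risk = 0.50    # P(old | risk): share of affected people who are older
p_old = 0.20               # P(old): share of older individuals overall

p_risk_given_old = posterior(p_old_given_risk, p_risk, p_old)
# Conditioning on age raises the 1% prior to about 2.5% for this group.
print(p_risk_given_old)
```

The point of the example is only that conditioning on an observed attribute (age) reweights the population-wide prior.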

Gaussian function

In mathematics, a Gaussian function, often simply referred to as a Gaussian, is a function of the base form f(x) = exp(−x²) and with parametric extension f(x) = a · exp(−(x − b)² / (2c²)) for arbitrary real constants a, b and non-zero c. It is named after the mathematician Carl Friedrich Gauss. The graph of a Gaussian is a characteristic symmetric "bell curve" shape. The parameter a is the height of the curve's peak, b is the position of the center of the peak, and c (the standard deviation, sometimes called the Gaussian RMS width) controls the width of the "bell".
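A quick sketch of the parametric form, using illustrative values of a, b, and c to check the properties stated above (peak height a at x = b, symmetry about b):

```python
import math

def gaussian(x, a, b, c):
    """Parametric Gaussian f(x) = a * exp(-(x - b)^2 / (2 * c^2))."""
    return a * math.exp(-((x - b) ** 2) / (2 * c ** 2))

# Illustrative constants: peak height 3 at x = 2, width parameter 0.5.
peak = gaussian(2.0, a=3.0, b=2.0, c=0.5)   # equals a, since x == b
left = gaussian(1.0, a=3.0, b=2.0, c=0.5)
right = gaussian(3.0, a=3.0, b=2.0, c=0.5)  # symmetry: equals `left`
```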

Bayes estimator

In estimation theory and decision theory, a Bayes estimator or a Bayes action is an estimator or decision rule that minimizes the posterior expected value of a loss function (i.e., the posterior expected loss). Equivalently, it maximizes the posterior expectation of a utility function. An alternative way of formulating an estimator within Bayesian statistics is maximum a posteriori estimation. Suppose an unknown parameter θ is known to have a prior distribution π.
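The posterior-expected-loss definition above can be written out symbolically; this is a standard-notation sketch, not a formula taken from the lecture:

```latex
% Bayes estimator as the minimizer of posterior expected loss:
\hat{\theta}_{\mathrm{Bayes}}(x)
  = \arg\min_{\hat{\theta}} \, \mathbb{E}\!\left[ L(\theta, \hat{\theta}) \mid x \right]
  = \arg\min_{\hat{\theta}} \int L(\theta, \hat{\theta}) \, \pi(\theta \mid x) \, d\theta
% Under squared-error loss L(\theta, \hat{\theta}) = (\theta - \hat{\theta})^2,
% the minimizer is the posterior mean \mathbb{E}[\theta \mid x].
```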

Multivariate normal distribution

In probability theory and statistics, the multivariate normal distribution, multivariate Gaussian distribution, or joint normal distribution is a generalization of the one-dimensional (univariate) normal distribution to higher dimensions. One definition is that a random vector is said to be k-variate normally distributed if every linear combination of its k components has a univariate normal distribution. Its importance derives mainly from the multivariate central limit theorem.
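The defining property above (every linear combination of the components is univariate normal, with mean wᵀμ and variance wᵀΣw) can be checked numerically. A sketch with illustrative μ, Σ, and weights w, assuming NumPy:

```python
import numpy as np

# Illustrative (assumed) 2-variate normal parameters:
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])   # symmetric positive definite
w = np.array([0.5, 0.5])         # weights of the linear combination

mean_wx = w @ mu                 # analytic mean of w @ X
var_wx = w @ Sigma @ w           # analytic variance of w @ X

rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mu, Sigma, size=200_000) @ w
# Empirical moments of the combined scalar should match the analytic ones.
print(mean_wx, var_wx, samples.mean(), samples.var())
```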

Gaussian blur

In image processing, a Gaussian blur (also known as Gaussian smoothing) is the result of blurring an image by a Gaussian function (named after mathematician and scientist Carl Friedrich Gauss). It is a widely used effect in graphics software, typically to reduce image noise and reduce detail. The visual effect of this blurring technique is a smooth blur resembling that of viewing the image through a translucent screen, distinctly different from the bokeh effect produced by an out-of-focus lens or the shadow of an object under usual illumination.
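A minimal sketch of a Gaussian blur, assuming NumPy, a simple truncated kernel, and the standard separability trick (blur rows, then columns, with the same 1-D kernel). This illustrates the idea, not any particular graphics library's implementation:

```python
import numpy as np

def gaussian_kernel_1d(sigma, radius=None):
    """Discrete, normalized 1-D Gaussian kernel (sums to 1)."""
    if radius is None:
        radius = int(3 * sigma)     # common truncation choice (assumed here)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

# Toy image: a single bright pixel in the middle of a 9x9 grid.
img = np.zeros((9, 9))
img[4, 4] = 1.0

k = gaussian_kernel_1d(sigma=1.0)
# Separable 2-D blur: convolve each row, then each column.
rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
blur = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, rows)
# The impulse spreads into a symmetric bell shape centered on the pixel.
```

Separability is why Gaussian blur is cheap: two 1-D passes replace one full 2-D convolution.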

Related lectures (143)

Bayesian Inference: Estimation & Demystification

Covers the concepts of demystification, estimation, and Bayesian inference in the context of Bayesian statistics.

Bayesian Inference: Gaussian Variables

Explores Bayesian inference for Gaussian random variables, covering joint distribution, marginal pdfs, and the Bayes classifier.

Monte Carlo: Markov Chains

Covers unsupervised learning, dimensionality reduction, SVD, low-rank estimation, PCA, and Monte Carlo Markov Chains.

Bayesian Inference: Optimal Estimation

Explores optimal Bayesian inference, denoising, scalar estimation, and phase transitions.

Probability and Statistics

Covers p-quantile, normal approximation, joint distributions, and exponential families in probability and statistics.