In probability theory and statistics, the normal-gamma distribution (or Gaussian-gamma distribution) is a bivariate four-parameter family of continuous probability distributions. It is the conjugate prior of a normal distribution with unknown mean and precision.
For a pair of random variables $(X,T)$, suppose that the conditional distribution of $X$ given $T$ is given by

$$X \mid T \sim N\!\left(\mu, (\lambda T)^{-1}\right),$$

meaning that the conditional distribution is a normal distribution with mean $\mu$ and precision $\lambda T$, or equivalently with variance $1/(\lambda T)$.
Suppose also that the marginal distribution of $T$ is given by

$$T \mid \alpha, \beta \sim \operatorname{Gamma}(\alpha, \beta),$$

where this means that $T$ has a gamma distribution with shape $\alpha$ and rate $\beta$. Here $\lambda$, $\alpha$ and $\beta$ are parameters of the joint distribution.
Then $(X,T)$ has a normal-gamma distribution, and this is denoted by

$$(X,T) \sim \operatorname{NormalGamma}(\mu, \lambda, \alpha, \beta).$$

The joint probability density function of $(X,T)$ is

$$f(x,\tau \mid \mu, \lambda, \alpha, \beta) = \frac{\beta^{\alpha}\sqrt{\lambda}}{\Gamma(\alpha)\sqrt{2\pi}}\,\tau^{\alpha-\frac{1}{2}}\,e^{-\beta\tau}\exp\!\left(-\frac{\lambda\tau(x-\mu)^2}{2}\right), \qquad \tau > 0.$$
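As a quick illustration, the density above can be evaluated numerically. The following Python sketch (the function name and argument layout are mine, not from the source) computes the joint log-density directly from the formula:

```python
import math

def normal_gamma_logpdf(x, tau, mu, lam, alpha, beta):
    """Joint log-density of (x, tau) under NormalGamma(mu, lam, alpha, beta).

    Illustrative sketch only; argument names mirror the symbols in the text.
    """
    if tau <= 0:
        return float("-inf")
    # Normalising constant: beta^alpha * sqrt(lam) / (Gamma(alpha) * sqrt(2*pi))
    log_norm = (alpha * math.log(beta) + 0.5 * math.log(lam)
                - math.lgamma(alpha) - 0.5 * math.log(2.0 * math.pi))
    # Kernel: tau^(alpha - 1/2) * exp(-beta*tau) * exp(-lam*tau*(x - mu)^2 / 2)
    log_kernel = ((alpha - 0.5) * math.log(tau) - beta * tau
                  - 0.5 * lam * tau * (x - mu) ** 2)
    return log_norm + log_kernel
```

For example, `normal_gamma_logpdf(0.0, 1.0, 0.0, 1.0, 1.0, 1.0)` evaluates the log-density at the point $(x,\tau)=(0,1)$ for unit parameters.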
By construction, the marginal distribution of $\tau$ is a gamma distribution, and the conditional distribution of $x$ given $\tau$ is a Gaussian distribution. The marginal distribution of $x$ is a three-parameter non-standardized Student's t-distribution with parameters $(\nu, \mu, \sigma^2) = \left(2\alpha,\, \mu,\, \tfrac{\beta}{\lambda\alpha}\right)$.
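This marginal can be checked by simulation. The sketch below (the parameter values are arbitrary choices for illustration) draws from the normal-gamma construction and compares the $x$-samples against the stated Student's t-distribution with a Kolmogorov–Smirnov test:

```python
import numpy as np
from scipy import stats

# Illustrative parameter values (not from the source).
mu, lam, alpha, beta = 0.5, 2.0, 3.0, 1.5
rng = np.random.default_rng(0)
n = 200_000

# Draw (x, tau): tau ~ Gamma(alpha, rate beta), then x | tau ~ N(mu, 1/(lam*tau)).
tau = rng.gamma(shape=alpha, scale=1.0 / beta, size=n)
x = rng.normal(loc=mu, scale=1.0 / np.sqrt(lam * tau))

# Marginal of x should be a non-standardized Student's t with
# nu = 2*alpha, location mu, and scale sqrt(beta / (lam * alpha)).
t_marginal = stats.t(df=2 * alpha, loc=mu, scale=np.sqrt(beta / (lam * alpha)))
print(stats.kstest(x, t_marginal.cdf))  # a large p-value indicates agreement
```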
The normal-gamma distribution is a four-parameter exponential family with natural parameters $\left(\alpha - \tfrac{1}{2},\; -\beta - \tfrac{\lambda\mu^2}{2},\; \lambda\mu,\; -\tfrac{\lambda}{2}\right)$ and natural statistics $\left(\ln\tau,\; \tau,\; \tau x,\; \tau x^2\right)$.
The following moments can be easily computed using the moment generating function of the sufficient statistic:

$$\operatorname{E}[\ln T] = \psi(\alpha) - \ln\beta, \qquad \operatorname{E}[T] = \frac{\alpha}{\beta}, \qquad \operatorname{E}[TX] = \mu\,\frac{\alpha}{\beta}, \qquad \operatorname{E}[TX^2] = \frac{1}{\lambda} + \mu^2\,\frac{\alpha}{\beta},$$

where $\psi(\alpha)$ is the digamma function.
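These four expectations are easy to sanity-check by Monte Carlo. The following sketch (the parameter values are arbitrary illustrations) compares sample averages of the natural statistics against the closed forms above:

```python
import numpy as np
from scipy.special import digamma

# Illustrative parameter values (not from the source).
mu, lam, alpha, beta = 1.5, 2.0, 3.0, 4.0
rng = np.random.default_rng(0)
n = 1_000_000

# Sample (x, tau): tau ~ Gamma(alpha, rate beta), x | tau ~ N(mu, 1/(lam*tau)).
tau = rng.gamma(shape=alpha, scale=1.0 / beta, size=n)
x = rng.normal(loc=mu, scale=1.0 / np.sqrt(lam * tau))

print(np.mean(np.log(tau)), digamma(alpha) - np.log(beta))    # E[ln T]
print(np.mean(tau), alpha / beta)                              # E[T]
print(np.mean(tau * x), mu * alpha / beta)                     # E[T X]
print(np.mean(tau * x**2), 1 / lam + mu**2 * alpha / beta)     # E[T X^2]
```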
If $(X,T) \sim \operatorname{NormalGamma}(\mu, \lambda, \alpha, \beta)$, then for any $b > 0$, $(bX, bT)$ is distributed as $\operatorname{NormalGamma}\!\left(b\mu,\, \lambda/b^{3},\, \alpha,\, \beta/b\right)$.
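The parameter transformation can be seen directly from the two building blocks; a short derivation sketch, using the definitions above:

$$\begin{aligned}
bT &\sim \operatorname{Gamma}\!\left(\alpha, \tfrac{\beta}{b}\right)
&& \text{(scaling a gamma variable by } b \text{ divides its rate by } b\text{)},\\
bX \mid bT = s &\sim N\!\left(b\mu, \tfrac{b^{3}}{\lambda s}\right)
&& \text{(since } X \mid T = \tfrac{s}{b} \sim N\!\left(\mu, \tfrac{b}{\lambda s}\right)\text{)},
\end{aligned}$$

so the conditional precision of $bX$ given $bT = s$ is $(\lambda/b^{3})\,s$, which is exactly the normal-gamma structure with the stated parameters.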
Assume that $x$ is distributed according to a normal distribution with unknown mean $\mu$ and precision $\tau$,

$$x \sim N(\mu, \tau^{-1}),$$

and that the prior distribution on $\mu$ and $\tau$, $(\mu, \tau)$, has a normal-gamma distribution

$$(\mu, \tau) \sim \operatorname{NormalGamma}(\mu_0, \lambda_0, \alpha_0, \beta_0),$$

for which the density $\pi$ satisfies

$$\pi(\mu, \tau) \propto \tau^{\alpha_0 - \frac{1}{2}} \exp(-\beta_0 \tau) \exp\!\left(-\frac{\lambda_0 \tau (\mu - \mu_0)^2}{2}\right).$$
Suppose

$$x_1, \ldots, x_n \mid \mu, \tau \overset{\text{i.i.d.}}{\sim} N(\mu, \tau^{-1}),$$

i.e. the components of $\mathbf{X} = (x_1, \ldots, x_n)$ are conditionally independent given $\mu, \tau$, and the conditional distribution of each of them given $\mu, \tau$ is normal with expected value $\mu$ and variance $1/\tau$. The posterior distribution of $\mu$ and $\tau$ given this dataset $\mathbf{X}$ can be analytically determined by Bayes' theorem. Explicitly,

$$\mathbf{P}(\tau, \mu \mid \mathbf{X}) \propto \mathbf{L}(\mathbf{X} \mid \tau, \mu)\, \pi(\tau, \mu),$$
where $\mathbf{L}(\mathbf{X} \mid \tau, \mu)$ is the likelihood of the parameters given the data.

Since the data are i.i.d., the likelihood of the entire dataset is equal to the product of the likelihoods of the individual data samples:

$$\mathbf{L}(\mathbf{X} \mid \tau, \mu) = \prod_{i=1}^{n} \mathbf{L}(x_i \mid \tau, \mu).$$
This expression can be simplified as follows:

$$\begin{aligned}
\mathbf{L}(\mathbf{X} \mid \tau, \mu) &\propto \prod_{i=1}^{n} \tau^{1/2} \exp\!\left(-\frac{\tau}{2}(x_i - \mu)^2\right) \\
&\propto \tau^{n/2} \exp\!\left(-\frac{\tau}{2}\sum_{i=1}^{n}(x_i - \mu)^2\right) \\
&\propto \tau^{n/2} \exp\!\left(-\frac{\tau}{2}\sum_{i=1}^{n}\left((x_i - \bar{x})^2 + (\bar{x} - \mu)^2\right)\right) \\
&\propto \tau^{n/2} \exp\!\left(-\frac{\tau}{2}\left(n s + n(\bar{x} - \mu)^2\right)\right),
\end{aligned}$$

where the third line uses the fact that the cross terms $2(x_i - \bar{x})(\bar{x} - \mu)$ sum to zero, and where $\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i$ is the mean of the data samples and $s = \frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})^2$ is their sample variance.
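The key algebraic step, $\sum_i (x_i - \mu)^2 = n s + n(\bar{x} - \mu)^2$, is easy to verify numerically; the sketch below uses arbitrary random data and an arbitrary value of $\mu$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1000)   # arbitrary data, for illustration only
mu = 0.3                    # arbitrary value of the mean parameter

n = len(x)
xbar = x.mean()
s = np.mean((x - xbar) ** 2)  # biased (1/n) sample variance, as defined above

lhs = np.sum((x - mu) ** 2)
rhs = n * s + n * (xbar - mu) ** 2
print(lhs, rhs, np.isclose(lhs, rhs))  # the two sides agree up to rounding
```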