In probability theory, a normalizing constant or normalizing factor is used to reduce any probability function to a probability density function with total probability of one. For example, a Gaussian function can be normalized into a probability density function, which gives the standard normal distribution. In Bayes' theorem, a normalizing constant is used to ensure that the sum of the posterior probabilities over all possible hypotheses equals 1. Other uses of normalizing constants include making the value of a Legendre polynomial at 1 equal to 1, and scaling orthogonal functions so that they are orthonormal. A similar concept has been used in areas other than probability, such as for polynomials.

In probability theory, a normalizing constant is a constant by which an everywhere non-negative function must be multiplied so the area under its graph is 1, e.g., to make it a probability density function or a probability mass function.

If we start from the simple Gaussian function

$$p(x) = e^{-x^2/2}, \quad x \in (-\infty, \infty),$$

we have the corresponding Gaussian integral

$$\int_{-\infty}^{\infty} e^{-x^2/2} \, dx = \sqrt{2\pi}.$$

Now if we use the latter's reciprocal value as a normalizing constant for the former, defining a function $\varphi(x)$ as

$$\varphi(x) = \frac{1}{\sqrt{2\pi}}\, p(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}$$

so that its integral is unity,

$$\int_{-\infty}^{\infty} \varphi(x) \, dx = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-x^2/2} \, dx = 1,$$

then the function $\varphi(x)$ is a probability density function. This is the density of the standard normal distribution. (Standard, in this case, means the expected value is 0 and the variance is 1.) The constant $\frac{1}{\sqrt{2\pi}}$ is the normalizing constant of the function $p(x)$.

Similarly,

$$\sum_{n=0}^{\infty} \frac{\lambda^n}{n!} = e^{\lambda},$$

and consequently

$$f(n) = \frac{\lambda^n e^{-\lambda}}{n!}$$

is a probability mass function on the set of all nonnegative integers. This is the probability mass function of the Poisson distribution with expected value $\lambda$; here the normalizing constant is $e^{-\lambda}$. (A short numerical check of both normalizing constants is sketched at the end of this section.)

Note that if the probability density function is a function of various parameters, so too will be its normalizing constant. The parametrised normalizing constant for the Boltzmann distribution plays a central role in statistical mechanics. In that context, the normalizing constant is called the partition function.

Bayes' theorem says that the posterior probability measure is proportional to the product of the prior probability measure and the likelihood function. "Proportional to" means that the product must be divided by a normalizing constant so that the whole space receives measure 1, i.e., so that the result is a probability measure. In the simple discrete case this reads

$$P(H_i \mid D) = \frac{P(H_i)\, P(D \mid H_i)}{\sum_j P(H_j)\, P(D \mid H_j)},$$

where the denominator, the total probability of the data $D$, is the normalizing constant that makes the posterior probabilities sum to 1.
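To make the role of this constant concrete, here is a minimal sketch in Python; the two-hypothesis setup and all numerical values are illustrative assumptions, not taken from the source.

```python
# Minimal sketch: normalizing a discrete Bayesian posterior.
# The two hypotheses and all numbers below are illustrative assumptions.

priors = {"H0": 0.7, "H1": 0.3}        # prior probability of each hypothesis
likelihoods = {"H0": 0.2, "H1": 0.9}   # assumed P(data | hypothesis)

# Unnormalized posterior: prior times likelihood for each hypothesis.
unnormalized = {h: priors[h] * likelihoods[h] for h in priors}

# The normalizing constant is the total probability of the data, P(D).
normalizing_constant = sum(unnormalized.values())

# Dividing by it yields posterior probabilities that sum to 1.
posterior = {h: p / normalizing_constant for h, p in unnormalized.items()}

print(posterior)                # {'H0': 0.341..., 'H1': 0.658...}
print(sum(posterior.values()))  # 1.0
```

The division by `normalizing_constant` is the only step that turns the proportionality into an actual probability measure; the relative weights of the hypotheses are already fixed by the products of priors and likelihoods.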
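As promised above, here is a short numerical check of the Gaussian and Poisson normalizing constants, a sketch using only the Python standard library; the integration grid, bounds, truncation point, and the value of lambda are arbitrary choices.

```python
import math

# Check the Gaussian normalizing constant 1/sqrt(2*pi) by integrating
# e^(-x^2/2) with a midpoint rule on [-10, 10] (grid and bounds are
# arbitrary choices; the tails beyond |x| = 10 are negligible).
n, a, b = 100_000, -10.0, 10.0
dx = (b - a) / n
integral = sum(math.exp(-((a + (i + 0.5) * dx) ** 2) / 2) * dx for i in range(n))
print(integral)                           # ~2.5066, i.e. sqrt(2*pi)
print(integral / math.sqrt(2 * math.pi))  # ~1.0: phi(x) integrates to 1

# Check the Poisson normalizing constant e^(-lambda): the masses
# lambda^k e^(-lambda) / k! should sum to 1 over k = 0, 1, 2, ...
lam = 3.0  # arbitrary rate parameter
total = sum(lam ** k * math.exp(-lam) / math.factorial(k) for k in range(100))
print(total)                              # ~1.0 (series truncated at k = 100)
```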