The q-Gaussian is a probability distribution arising from the maximization of the Tsallis entropy under appropriate constraints. It is one example of a Tsallis distribution. The q-Gaussian is a generalization of the Gaussian in the same way that Tsallis entropy is a generalization of standard Boltzmann–Gibbs entropy or Shannon entropy. The normal distribution is recovered as q → 1.

The q-Gaussian has been applied to problems in the fields of statistical mechanics, geology, anatomy, astronomy, economics, finance, and machine learning. The distribution is often favored for its heavy tails in comparison to the Gaussian for 1 < q < 3. For q < 1, the q-Gaussian is the PDF of a bounded random variable, which makes it more suitable than the Gaussian distribution for modeling the effect of external stochasticity in biology and other domains.

A generalized q-analog of the classical central limit theorem was proposed in 2008, in which the independence constraint for the i.i.d. variables is relaxed to an extent defined by the q parameter, with independence being recovered as q → 1. However, a proof of such a theorem is still lacking.

In the heavy-tail region 1 < q < 3, the distribution is equivalent to the Student's t-distribution, with a direct mapping between q and the degrees of freedom, ν = (3 − q)/(q − 1). A practitioner using one of these distributions can therefore parameterize the same distribution in two different ways. The choice of the q-Gaussian form may arise if the system is non-extensive, or if there is a lack of a connection to small sample sizes.

The standard q-Gaussian has the probability density function

\[ f(x) = \frac{\sqrt{\beta}}{C_q}\, e_q(-\beta x^2), \]

where

\[ e_q(x) = \bigl[1 + (1-q)x\bigr]_+^{\frac{1}{1-q}} \]

is the q-exponential and the normalization factor \( C_q \) is given by

\[ C_q = \begin{cases} \dfrac{2\sqrt{\pi}\,\Gamma\!\left(\frac{1}{1-q}\right)}{(3-q)\sqrt{1-q}\,\Gamma\!\left(\frac{3-q}{2(1-q)}\right)}, & -\infty < q < 1, \\[2ex] \sqrt{\pi}, & q = 1, \\[1ex] \dfrac{\sqrt{\pi}\,\Gamma\!\left(\frac{3-q}{2(q-1)}\right)}{\sqrt{q-1}\,\Gamma\!\left(\frac{1}{q-1}\right)}, & 1 < q < 3. \end{cases} \]

Note that for q < 1 the q-Gaussian distribution is the PDF of a bounded random variable, supported on \( |x| < 1/\sqrt{\beta(1-q)} \).

For 1 < q < 3, the cumulative distribution function is

\[ F(x) = \frac{1}{2} + \frac{\sqrt{q-1}\,\Gamma\!\left(\frac{1}{q-1}\right)}{\sqrt{\pi}\,\Gamma\!\left(\frac{3-q}{2(q-1)}\right)}\, \sqrt{\beta}\, x \; {}_2F_1\!\left(\tfrac{1}{2}, \tfrac{1}{q-1}; \tfrac{3}{2}; -(q-1)\beta x^2\right), \]

where \( {}_2F_1(a, b; c; z) \) is the hypergeometric function. As the hypergeometric series converges only for |z| < 1 but the argument −(q − 1)βx² is unbounded in x, the Pfaff transformation can be used to map it into the unit interval.
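To make the formulas above concrete, the following is a minimal Python/SciPy sketch of the standard q-Gaussian PDF and CDF. The function names (q_exponential, normalization, q_gaussian_pdf, q_gaussian_cdf) are illustrative choices rather than an established API; the CDF applies the Pfaff transformation mentioned above so that the hypergeometric argument stays in [0, 1).

```python
# Minimal sketch of the standard q-Gaussian PDF and CDF (illustrative names, not a library API).
import numpy as np
from scipy.special import gamma, hyp2f1

def q_exponential(x, q):
    """q-exponential e_q(x) = [1 + (1-q)x]_+^{1/(1-q)}; ordinary exp at q = 1."""
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = np.maximum(1.0 + (1.0 - q) * np.asarray(x, dtype=float), 0.0)  # [.]_+ cutoff
    return base ** (1.0 / (1.0 - q))

def normalization(q):
    """Normalization factor C_q, defined piecewise for q < 1, q = 1 and 1 < q < 3."""
    if q < 1:
        return (2.0 * np.sqrt(np.pi) * gamma(1.0 / (1.0 - q))
                / ((3.0 - q) * np.sqrt(1.0 - q) * gamma((3.0 - q) / (2.0 * (1.0 - q)))))
    if np.isclose(q, 1.0):
        return np.sqrt(np.pi)
    if q < 3:
        return (np.sqrt(np.pi) * gamma((3.0 - q) / (2.0 * (q - 1.0)))
                / (np.sqrt(q - 1.0) * gamma(1.0 / (q - 1.0))))
    raise ValueError("q must be < 3 for a normalizable density")

def q_gaussian_pdf(x, q, beta=1.0):
    """Density f(x) = sqrt(beta)/C_q * e_q(-beta x^2)."""
    return np.sqrt(beta) / normalization(q) * q_exponential(-beta * np.asarray(x) ** 2, q)

def q_gaussian_cdf(x, q, beta=1.0):
    """CDF for 1 < q < 3 via the Gauss hypergeometric function.

    The argument z = -(q-1)*beta*x^2 is mapped into [0, 1) with the Pfaff
    transformation 2F1(a, b; c; z) = (1-z)^(-b) * 2F1(c-a, b; c; z/(z-1)),
    since the hypergeometric series converges only for |z| < 1 while x is unbounded.
    """
    x = np.asarray(x, dtype=float)
    a, b, c = 0.5, 1.0 / (q - 1.0), 1.5
    z = -(q - 1.0) * beta * x ** 2
    f21 = (1.0 - z) ** (-b) * hyp2f1(c - a, b, c, z / (z - 1.0))
    coef = (np.sqrt(q - 1.0) * gamma(1.0 / (q - 1.0))
            / (np.sqrt(np.pi) * gamma((3.0 - q) / (2.0 * (q - 1.0)))))
    return 0.5 + coef * np.sqrt(beta) * x * f21

if __name__ == "__main__":
    # Sanity checks: q -> 1 recovers the normal density with variance 1/(2*beta),
    # and for 1 < q < 3 the CDF matches a Student's t with nu = (3-q)/(q-1)
    # degrees of freedom and scale 1/sqrt(beta*(3-q)).
    from scipy.stats import norm, t as student_t
    x = np.linspace(-4.0, 4.0, 9)
    print(np.allclose(q_gaussian_pdf(x, 1.0, 0.5), norm.pdf(x)))
    q, beta = 2.0, 1.0
    nu = (3.0 - q) / (q - 1.0)
    print(np.allclose(q_gaussian_cdf(x, q, beta),
                      student_t.cdf(x, nu, scale=1.0 / np.sqrt(beta * (3.0 - q)))))
```

The two checks at the bottom exercise the limits discussed in the text: at q = 1 the PDF reduces to the ordinary Gaussian, and for 1 < q < 3 the CDF agrees with the equivalent Student's t parameterization.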