In probability theory, it is possible to approximate the moments of a function $f$ of a random variable $X$ using Taylor expansions, provided that $f$ is sufficiently differentiable and that the moments of $X$ are finite.

Given $\mu_X$ and $\sigma^2_X$, the mean and the variance of $X$, respectively, a Taylor expansion of the expected value of $f(X)$ can be found via

\[
\begin{aligned}
\operatorname{E}[f(X)] &= \operatorname{E}\left[f\left(\mu_X + \left(X - \mu_X\right)\right)\right] \\
&\approx \operatorname{E}\left[f(\mu_X) + f'(\mu_X)\left(X - \mu_X\right) + \tfrac{1}{2} f''(\mu_X)\left(X - \mu_X\right)^2\right] \\
&= f(\mu_X) + f'(\mu_X)\operatorname{E}\left[X - \mu_X\right] + \tfrac{1}{2} f''(\mu_X)\operatorname{E}\left[\left(X - \mu_X\right)^2\right].
\end{aligned}
\]

Since $\operatorname{E}[X - \mu_X] = 0$, the second term vanishes. Also, $\operatorname{E}[(X - \mu_X)^2]$ is $\sigma^2_X$. Therefore,

\[
\operatorname{E}[f(X)] \approx f(\mu_X) + \frac{f''(\mu_X)}{2} \sigma^2_X.
\]

It is possible to generalize this to functions of more than one variable using multivariate Taylor expansions. For example,

\[
\operatorname{E}\left[\frac{X}{Y}\right] \approx \frac{\operatorname{E}[X]}{\operatorname{E}[Y]} - \frac{\operatorname{Cov}[X, Y]}{\operatorname{E}[Y]^2} + \frac{\operatorname{E}[X]}{\operatorname{E}[Y]^3} \operatorname{Var}[Y].
\]

Similarly,

\[
\operatorname{Var}[f(X)] \approx \left(f'(\operatorname{E}[X])\right)^2 \operatorname{Var}[X] = \left(f'(\mu_X)\right)^2 \sigma^2_X - \frac{1}{4} \left(f''(\mu_X)\right)^2 \sigma^4_X.
\]

The above is obtained using a second-order approximation, following the method used in estimating the first moment. It will be a poor approximation in cases where $f$ is highly non-linear. This is a special case of the delta method.

Indeed, we take $\operatorname{E}[f(X)] \approx f(\mu_X) + \frac{f''(\mu_X)}{2} \sigma^2_X$. With $g(x) = f(x)^2$, so that $g''(x) = 2\left(f'(x)^2 + f(x) f''(x)\right)$, we get

\[
\operatorname{E}\left[f(X)^2\right] \approx f(\mu_X)^2 + \left(f'(\mu_X)^2 + f(\mu_X) f''(\mu_X)\right) \sigma^2_X.
\]

The variance is then computed using the formula $\operatorname{Var}[f(X)] = \operatorname{E}\left[f(X)^2\right] - \left(\operatorname{E}[f(X)]\right)^2$. An example is

\[
\operatorname{Var}\left[\frac{X}{Y}\right] \approx \frac{\operatorname{E}[X]^2}{\operatorname{E}[Y]^2} \left[\frac{\operatorname{Var}[X]}{\operatorname{E}[X]^2} - 2\, \frac{\operatorname{Cov}[X, Y]}{\operatorname{E}[X] \operatorname{E}[Y]} + \frac{\operatorname{Var}[Y]}{\operatorname{E}[Y]^2}\right].
\]

The second-order approximation, when $X$ follows a normal distribution, is

\[
\operatorname{Var}[f(X)] \approx \left(f'(\operatorname{E}[X])\right)^2 \operatorname{Var}[X] + \frac{\left(f''(\operatorname{E}[X])\right)^2}{2} \left(\operatorname{Var}[X]\right)^2,
\]

since for a normal distribution the third central moment vanishes and the fourth central moment equals $3 \sigma^4_X$.

To find a second-order approximation for the covariance of functions of two random variables (with the same function applied to both), one can proceed as follows. First, note that

\[
\operatorname{Cov}[f(X), f(Y)] = \operatorname{E}[f(X) f(Y)] - \operatorname{E}[f(X)] \operatorname{E}[f(Y)].
\]

Since a second-order expansion for $\operatorname{E}[f(X)]$ has already been derived above, it only remains to find $\operatorname{E}[f(X) f(Y)]$. Treating $f(X) f(Y)$ as a two-variable function, the second-order Taylor expansion is as follows:

\[
\begin{aligned}
f(X) f(Y) \approx{}& f(\mu_X) f(\mu_Y) + \left(X - \mu_X\right) f'(\mu_X) f(\mu_Y) + \left(Y - \mu_Y\right) f(\mu_X) f'(\mu_Y) \\
&+ \tfrac{1}{2} \Big[\left(X - \mu_X\right)^2 f''(\mu_X) f(\mu_Y) + 2 \left(X - \mu_X\right)\left(Y - \mu_Y\right) f'(\mu_X) f'(\mu_Y) \\
&\qquad + \left(Y - \mu_Y\right)^2 f(\mu_X) f''(\mu_Y)\Big].
\end{aligned}
\]

Taking expectation of the above and simplifying, making use of the identities $\operatorname{E}[(X - \mu_X)^2] = \operatorname{Var}[X]$ and $\operatorname{E}[(X - \mu_X)(Y - \mu_Y)] = \operatorname{Cov}[X, Y]$, leads to

\[
\operatorname{E}[f(X) f(Y)] \approx f(\mu_X) f(\mu_Y) + f'(\mu_X) f'(\mu_Y) \operatorname{Cov}[X, Y] + \tfrac{1}{2} f''(\mu_X) f(\mu_Y) \operatorname{Var}[X] + \tfrac{1}{2} f(\mu_X) f''(\mu_Y) \operatorname{Var}[Y].
\]

Hence,

\[
\begin{aligned}
\operatorname{Cov}[f(X), f(Y)] &\approx \operatorname{E}[f(X) f(Y)] - \left(f(\mu_X) + \tfrac{1}{2} f''(\mu_X) \sigma^2_X\right) \left(f(\mu_Y) + \tfrac{1}{2} f''(\mu_Y) \sigma^2_Y\right) \\
&\approx f'(\mu_X) f'(\mu_Y) \operatorname{Cov}[X, Y] - \tfrac{1}{4} f''(\mu_X) f''(\mu_Y)\, \sigma^2_X \sigma^2_Y.
\end{aligned}
\]

If $X$ is a random vector, the approximations for the mean and variance of $f(X)$ are given by

\[
\operatorname{E}[f(X)] \approx f(\mu_X) + \frac{1}{2} \operatorname{trace}\left(H_f(\mu_X)\, \Sigma_X\right)
\]

\[
\operatorname{Var}[f(X)] \approx \nabla f(\mu_X)^\mathsf{T}\, \Sigma_X\, \nabla f(\mu_X) + \frac{1}{2} \operatorname{trace}\left(H_f(\mu_X)\, \Sigma_X\, H_f(\mu_X)\, \Sigma_X\right).
\]

Here $\nabla f$ and $H_f$ denote the gradient and the Hessian matrix respectively, and $\Sigma_X$ is the covariance matrix of $X$.
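As a quick sanity check of the scalar formulas, the sketch below compares the second-order approximations with exact values. The choice $f = \exp$ with a normal $X$ is purely illustrative (chosen because the exact lognormal moments are available in closed form); the normal-case variance formula is used since $X$ is normal here.

```python
import math

def mean_var_approx(f, df, d2f, mu, sigma2):
    """Second-order Taylor approximations of E[f(X)] and Var[f(X)]
    for X with mean mu and variance sigma2 (X assumed normal, so the
    normal-case variance formula applies)."""
    # E[f(X)] ~ f(mu) + f''(mu)/2 * sigma^2
    mean = f(mu) + 0.5 * d2f(mu) * sigma2
    # Var[f(X)] ~ (f'(mu))^2 sigma^2 + (f''(mu))^2 / 2 * sigma^4
    var = df(mu) ** 2 * sigma2 + 0.5 * d2f(mu) ** 2 * sigma2 ** 2
    return mean, var

# For f = exp and X ~ N(mu, sigma^2) the exact moments are known:
# E[e^X] = exp(mu + sigma^2/2),  Var[e^X] = exp(2mu + sigma^2)(exp(sigma^2) - 1).
mu, sigma2 = 0.5, 0.01  # illustrative values
m_approx, v_approx = mean_var_approx(math.exp, math.exp, math.exp, mu, sigma2)
m_exact = math.exp(mu + sigma2 / 2)
v_exact = math.exp(2 * mu + sigma2) * (math.exp(sigma2) - 1)
print(f"mean: approx={m_approx:.6f} exact={m_exact:.6f}")
print(f"var:  approx={v_approx:.6f} exact={v_exact:.6f}")
```

With a small $\sigma^2_X$ the approximation is tight; inflating $\sigma^2$ shows the error growing, consistent with the warning above about highly non-linear $f$.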
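For the random-vector mean formula, a minimal sketch: when $f$ is quadratic, $f(x) = x^\mathsf{T} A x$ with symmetric $A$, the second-order expansion is exact, and $\tfrac{1}{2}\operatorname{trace}(H_f \Sigma) = \operatorname{trace}(A \Sigma)$ recovers the known identity $\operatorname{E}[X^\mathsf{T} A X] = \mu^\mathsf{T} A \mu + \operatorname{trace}(A \Sigma)$. The matrices and vectors below are arbitrary illustrative values, and the linear algebra is done in plain Python to keep the example self-contained.

```python
# Check E[f(X)] ~ f(mu) + (1/2) trace(H_f(mu) Sigma) for quadratic
# f(x) = x^T A x, where the approximation is exact.

def matmul(P, Q):
    """Multiply two small dense matrices given as lists of rows."""
    return [[sum(P[i][k] * Q[k][j] for k in range(len(Q)))
             for j in range(len(Q[0]))] for i in range(len(P))]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

A = [[2.0, 0.5], [0.5, 1.0]]      # symmetric, so the Hessian of x^T A x is 2A
mu = [1.0, -2.0]                  # mean vector of X (illustrative)
Sigma = [[0.3, 0.1], [0.1, 0.2]]  # covariance matrix of X (illustrative)

f_mu = sum(mu[i] * A[i][j] * mu[j] for i in range(2) for j in range(2))
H = [[2.0 * A[i][j] for j in range(2)] for i in range(2)]  # Hessian = 2A

approx = f_mu + 0.5 * trace(matmul(H, Sigma))   # formula from the text
exact = f_mu + trace(matmul(A, Sigma))          # E[X^T A X] identity
print(approx, exact)
```

For a non-quadratic $f$ the trace term is only the leading correction, and the agreement above would hold only approximately.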