Product rule

In calculus, the product rule (or Leibniz rule or Leibniz product rule) is a formula used to find the derivatives of products of two or more functions. For two functions, it may be stated in Lagrange's notation as $(u \cdot v)' = u' \cdot v + u \cdot v'$ or in Leibniz's notation as $\frac{d}{dx}(u \cdot v) = \frac{du}{dx} \cdot v + u \cdot \frac{dv}{dx}$. The rule may be extended or generalized to products of three or more functions, to a rule for higher-order derivatives of a product, and to other contexts. Discovery of this rule is credited to Gottfried Leibniz, who demonstrated it using differentials. (However, J. M. Child, a translator of Leibniz's early papers, argues that it is due to Isaac Barrow.)
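As a worked illustration (an example added here, not part of the original extract), take $u(x) = x^2$ and $v(x) = \sin x$:

$$\frac{d}{dx}\left(x^2 \sin x\right) = \frac{d}{dx}\left(x^2\right) \cdot \sin x + x^2 \cdot \frac{d}{dx}(\sin x) = 2x \sin x + x^2 \cos x.$$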
Quotient rule

In calculus, the quotient rule is a method of finding the derivative of a function that is the ratio of two differentiable functions. Let $h(x) = \frac{f(x)}{g(x)}$, where both f and g are differentiable and $g(x) \neq 0$. The quotient rule states that the derivative of h(x) is $h'(x) = \frac{f'(x)\,g(x) - f(x)\,g'(x)}{(g(x))^2}$. It is provable in many ways by using other derivative rules. Given $h(x) = \frac{e^x}{x^2}$, let $f(x) = e^x$ and $g(x) = x^2$; then, using the quotient rule, $h'(x) = \frac{e^x \cdot x^2 - e^x \cdot 2x}{(x^2)^2} = \frac{e^x (x - 2)}{x^3}$. The quotient rule can be used to find the derivative of $\tan x = \frac{\sin x}{\cos x}$ as follows: $\frac{d}{dx}\tan x = \frac{(\cos x)(\cos x) - (\sin x)(-\sin x)}{\cos^2 x} = \frac{\cos^2 x + \sin^2 x}{\cos^2 x} = \frac{1}{\cos^2 x} = \sec^2 x$.

Reciprocal rule

The reciprocal rule is a special case of the quotient rule in which the numerator $f(x) = 1$.
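Setting $f(x) = 1$ (so $f'(x) = 0$) in the quotient rule formula above makes the special case explicit; this short derivation is supplied here for completeness:

$$\left(\frac{1}{g(x)}\right)' = \frac{0 \cdot g(x) - 1 \cdot g'(x)}{(g(x))^2} = \frac{-g'(x)}{(g(x))^2}.$$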
Covariance matrix

In probability theory and statistics, a covariance matrix (also known as auto-covariance matrix, dispersion matrix, variance matrix, or variance–covariance matrix) is a square matrix giving the covariance between each pair of elements of a given random vector. Any covariance matrix is symmetric and positive semi-definite, and its main diagonal contains variances (i.e., the covariance of each element with itself). Intuitively, the covariance matrix generalizes the notion of variance to multiple dimensions.
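In symbols (a standard definition, added here to make the entries concrete; $\mathbf{X}$ denotes a random vector with mean $\boldsymbol{\mu} = \operatorname{E}[\mathbf{X}]$):

$$K_{\mathbf{X}\mathbf{X}} = \operatorname{E}\!\left[(\mathbf{X} - \boldsymbol{\mu})(\mathbf{X} - \boldsymbol{\mu})^{\mathsf{T}}\right], \qquad (K_{\mathbf{X}\mathbf{X}})_{ij} = \operatorname{cov}(X_i, X_j),$$

so the diagonal entries $(K_{\mathbf{X}\mathbf{X}})_{ii} = \operatorname{var}(X_i)$ are the variances mentioned above.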
Chain rule

In calculus, the chain rule is a formula that expresses the derivative of the composition of two differentiable functions f and g in terms of the derivatives of f and g. More precisely, if $h = f \circ g$ is the function such that $h(x) = f(g(x))$ for every x, then the chain rule is, in Lagrange's notation, $h'(x) = f'(g(x))\, g'(x)$ or, equivalently, $h' = (f \circ g)' = (f' \circ g) \cdot g'$. The chain rule may also be expressed in Leibniz's notation. If a variable z depends on the variable y, which itself depends on the variable x (that is, y and z are dependent variables), then z depends on x as well, via the intermediate variable y. In this case, the chain rule reads $\frac{dz}{dx} = \frac{dz}{dy} \cdot \frac{dy}{dx}$.
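For a concrete instance (an illustrative example added here), take $z = \sin y$ and $y = x^2$, so that $z = \sin(x^2)$:

$$\frac{dz}{dx} = \frac{dz}{dy} \cdot \frac{dy}{dx} = \cos(y) \cdot 2x = 2x \cos\!\left(x^2\right).$$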
Covariance

In probability theory and statistics, covariance is a measure of the joint variability of two random variables. If the greater values of one variable mainly correspond with the greater values of the other variable, and the same holds for the lesser values (that is, the variables tend to show similar behavior), the covariance is positive. In the opposite case, when the greater values of one variable mainly correspond to the lesser values of the other (that is, the variables tend to show opposite behavior), the covariance is negative.
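The sign behavior described above follows directly from the defining formula (standard notation, supplied here since the extract omits it):

$$\operatorname{cov}(X, Y) = \operatorname{E}\!\left[(X - \operatorname{E}[X])(Y - \operatorname{E}[Y])\right],$$

which averages the product of the two variables' deviations from their means: the product is positive when the deviations share a sign and negative when they do not.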