Hessian matrix
In mathematics, the Hessian matrix, Hessian or (less commonly) Hesse matrix is a square matrix of second-order partial derivatives of a scalar-valued function, or scalar field. It describes the local curvature of a function of many variables. The Hessian matrix was developed in the 19th century by the German mathematician Ludwig Otto Hesse and later named after him. Hesse originally used the term "functional determinants".
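As a concrete illustration (not from the article), the sketch below computes a Hessian symbolically with SymPy; the scalar field f is a hypothetical example.

```python
# A minimal sketch, assuming SymPy: sp.hessian returns the matrix of
# second-order partial derivatives of a scalar-valued expression.
import sympy as sp

x, y = sp.symbols("x y")
f = x**3 - 2*x*y - y**6          # hypothetical scalar field

H = sp.hessian(f, (x, y))
print(H)                         # Matrix([[6*x, -2], [-2, -30*y**4]])
```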
Jacobi polynomials
In mathematics, Jacobi polynomials (occasionally called hypergeometric polynomials) are a class of classical orthogonal polynomials. They are orthogonal with respect to the weight $(1-x)^\alpha (1+x)^\beta$ on the interval $[-1, 1]$. The Gegenbauer polynomials, and thus also the Legendre, Zernike and Chebyshev polynomials, are special cases of the Jacobi polynomials. The Jacobi polynomials were introduced by Carl Gustav Jacob Jacobi. The Jacobi polynomials are defined via the hypergeometric function as follows:

$$P_n^{(\alpha,\beta)}(z) = \frac{(\alpha+1)_n}{n!} \, {}_2F_1\!\left(-n,\, 1+\alpha+\beta+n;\, \alpha+1;\, \tfrac{1}{2}(1-z)\right),$$

where $(\alpha+1)_n$ is Pochhammer's symbol (for the rising factorial).
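To make the orthogonality claim concrete, here is a hedged numerical check using SciPy's eval_jacobi; the parameter values alpha and beta are arbitrary illustrative choices.

```python
# A sketch checking orthogonality of two Jacobi polynomials against the
# weight (1 - x)^alpha * (1 + x)^beta on [-1, 1].
import numpy as np
from scipy.special import eval_jacobi
from scipy.integrate import quad

alpha, beta = 0.5, 1.5           # illustrative parameter choices

def weighted_product(m, n):
    integrand = lambda x: (eval_jacobi(m, alpha, beta, x)
                           * eval_jacobi(n, alpha, beta, x)
                           * (1 - x)**alpha * (1 + x)**beta)
    value, _ = quad(integrand, -1, 1)
    return value

print(weighted_product(2, 3))    # ~0: distinct degrees are orthogonal
print(weighted_product(3, 3))    # > 0: squared norm of P_3^(alpha,beta)
```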
Characteristic polynomial
In linear algebra, the characteristic polynomial of a square matrix is a polynomial which is invariant under matrix similarity and has the eigenvalues as roots. It has the determinant and the trace of the matrix among its coefficients. The characteristic polynomial of an endomorphism of a finite-dimensional vector space is the characteristic polynomial of the matrix of that endomorphism over any basis (that is, the characteristic polynomial does not depend on the choice of a basis).
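As a small illustration (not part of the article), NumPy's np.poly returns the coefficients of the characteristic polynomial det(tI − A), whose roots are the eigenvalues:

```python
# A minimal sketch, assuming NumPy: np.poly(A) gives the coefficients of
# det(tI - A); np.roots recovers the eigenvalues of A from them.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

coeffs = np.poly(A)          # [1., -4., 3.]  ->  t^2 - 4t + 3
print(coeffs)                # note: -4 = -trace(A), 3 = det(A)
print(np.roots(coeffs))      # [3., 1.] -- matches np.linalg.eigvals(A)
```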
The Matrix (franchise)
The Matrix is an American cyberpunk media franchise consisting of four feature films, beginning with The Matrix (1999) and continuing with three sequels, The Matrix Reloaded, The Matrix Revolutions (both 2003), and The Matrix Resurrections (2021). The first three films were written and directed by the Wachowskis and produced by Joel Silver. The screenplay for the fourth film was written by Lana Wachowski, David Mitchell, and Aleksandar Hemon; the film was directed by Lana Wachowski and produced by Grant Hill, James McTeigue, and Lana Wachowski.
Cubic Hermite spline
In numerical analysis, a cubic Hermite spline or cubic Hermite interpolator is a spline where each piece is a third-degree polynomial specified in Hermite form, that is, by its values and first derivatives at the end points of the corresponding domain interval. Cubic Hermite splines are typically used for interpolation of numeric data specified at given argument values $x_1, x_2, \ldots, x_n$, to obtain a continuous function. The data should consist of the desired function value and derivative at each $x_k$.
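A brief sketch, assuming SciPy's CubicHermiteSpline: given values and derivatives at the points x, it builds the piecewise-cubic interpolant. The sine function here is only an illustrative data source.

```python
# A minimal sketch: interpolate sin(x) from its values and derivatives
# at three sample points using a cubic Hermite spline.
import numpy as np
from scipy.interpolate import CubicHermiteSpline

x = np.array([0.0, 1.0, 2.0])
y = np.sin(x)            # desired function values at the x_k
dydx = np.cos(x)         # desired derivatives at the x_k

spline = CubicHermiteSpline(x, y, dydx)
print(spline(0.5), np.sin(0.5))   # interpolant vs. the true function
```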
System of linear equations
In mathematics, a system of linear equations (or linear system) is a collection of one or more linear equations involving the same variables. For example,

$$\begin{cases} 3x + 2y - z = 1 \\ 2x - 2y + 4z = -2 \\ -x + \tfrac{1}{2}y - z = 0 \end{cases}$$

is a system of three equations in the three variables x, y, z. A solution to a linear system is an assignment of values to the variables such that all the equations are simultaneously satisfied. A solution to the system above is given by the ordered triple $(x, y, z) = (1, -2, -2)$, since it makes all three equations valid. The word "system" indicates that the equations should be considered collectively, rather than individually.
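The example system above can be checked numerically; a minimal sketch with NumPy's np.linalg.solve:

```python
# Solving the 3x3 example system: A is the coefficient matrix, b the
# right-hand side; the solution should be (1, -2, -2).
import numpy as np

A = np.array([[ 3.0,  2.0, -1.0],
              [ 2.0, -2.0,  4.0],
              [-1.0,  0.5, -1.0]])
b = np.array([1.0, -2.0, 0.0])

print(np.linalg.solve(A, b))   # [ 1. -2. -2.]
```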
Logarithm of a matrix
In mathematics, a logarithm of a matrix is another matrix such that the matrix exponential of the latter matrix equals the original matrix. It is thus a generalization of the scalar logarithm and in some sense an inverse function of the matrix exponential. Not all matrices have a logarithm, and those matrices that do have a logarithm may have more than one logarithm. The study of logarithms of matrices leads to Lie theory, since when a matrix has a logarithm then it is an element of a Lie group and the logarithm is the corresponding element of the vector space of the Lie algebra.
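A short sketch, assuming SciPy: logm computes a matrix logarithm and expm inverts it. The rotation matrix below is an illustrative choice that also hints at the Lie-theoretic picture (plane rotations form a Lie group; skew-symmetric matrices form its Lie algebra).

```python
# logm(R) for a rotation by pi/4 yields a skew-symmetric generator, and
# expm recovers the original rotation.
import numpy as np
from scipy.linalg import logm, expm

theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

L = logm(R)
print(L)          # approximately [[0, -pi/4], [pi/4, 0]]
print(expm(L))    # recovers R up to rounding error
```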
Chebyshev polynomials
The Chebyshev polynomials are two sequences of polynomials related to the cosine and sine functions, notated as $T_n(x)$ and $U_n(x)$. They can be defined in several equivalent ways, one of which starts with trigonometric functions: The Chebyshev polynomials of the first kind are defined by

$$T_n(\cos\theta) = \cos(n\theta).$$

Similarly, the Chebyshev polynomials of the second kind are defined by

$$U_n(\cos\theta)\,\sin\theta = \sin\big((n+1)\theta\big).$$

That these expressions define polynomials in $\cos\theta$ may not be obvious at first sight, but follows by rewriting $\cos(n\theta)$ and $\sin\big((n+1)\theta\big)$ using de Moivre's formula or by using the angle sum formulas for $\cos$ and $\sin$ repeatedly.
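The defining identity for the first kind can be verified numerically; a minimal sketch assuming NumPy's polynomial module:

```python
# Chebyshev.basis(n) is T_n as a polynomial object; the identity
# T_n(cos(theta)) == cos(n*theta) is checked at several angles.
import numpy as np
from numpy.polynomial import Chebyshev

n = 5
T5 = Chebyshev.basis(n)

theta = np.linspace(0.0, np.pi, 7)
print(np.allclose(T5(np.cos(theta)), np.cos(n * theta)))   # True
```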
Diagonal matrix
In linear algebra, a diagonal matrix is a matrix in which the entries outside the main diagonal are all zero; the term usually refers to square matrices. Elements of the main diagonal can either be zero or nonzero. An example of a 2×2 diagonal matrix is $\left[\begin{smallmatrix} 3 & 0 \\ 0 & 2 \end{smallmatrix}\right]$, while an example of a 3×3 diagonal matrix is $\left[\begin{smallmatrix} 6 & 0 & 0 \\ 0 & 5 & 0 \\ 0 & 0 & 4 \end{smallmatrix}\right]$. An identity matrix of any size, or any multiple of it (a scalar matrix), is a diagonal matrix. A diagonal matrix is sometimes called a scaling matrix, since matrix multiplication with it results in changing scale (size).
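A minimal sketch (illustrative, using NumPy) of the "scaling matrix" behavior: multiplying by a diagonal matrix rescales each coordinate independently.

```python
# np.diag builds a diagonal matrix from its main-diagonal entries.
import numpy as np

D = np.diag([3.0, 2.0])     # 2x2 diagonal (scaling) matrix
v = np.array([1.0, 1.0])

print(D)
print(D @ v)                # [3. 2.] -- each axis scaled independently
```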
Symmetric matrix
In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose. Formally, $A$ is symmetric if $A = A^{\mathsf{T}}$. Because equal matrices have equal dimensions, only square matrices can be symmetric. The entries of a symmetric matrix are symmetric with respect to the main diagonal. So if $a_{ij}$ denotes the entry in the $i$th row and $j$th column, then $a_{ij} = a_{ji}$ for all indices $i$ and $j$. Every square diagonal matrix is symmetric, since all off-diagonal elements are zero. Similarly, in characteristic different from 2, each diagonal element of a skew-symmetric matrix must be zero, since each is its own negative.
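The defining condition is easy to test in code; a minimal sketch assuming NumPy:

```python
# A matrix is symmetric exactly when it equals its transpose.
import numpy as np

A = np.array([[1.0, 7.0, 3.0],
              [7.0, 4.0, 5.0],
              [3.0, 5.0, 6.0]])

print(np.array_equal(A, A.T))   # True: A is symmetric
```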