Matrix multiplication

In mathematics, particularly in linear algebra, matrix multiplication is a binary operation that produces a matrix from two matrices. For matrix multiplication, the number of columns in the first matrix must be equal to the number of rows in the second matrix. The resulting matrix, known as the matrix product, has the number of rows of the first matrix and the number of columns of the second. The product of matrices A and B is denoted as AB.
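To make the sizing rule concrete, here is a minimal Python sketch of the entry-wise definition (the helper name matmul is illustrative; in practice NumPy's built-in @ operator computes the same product):

    import numpy as np

    def matmul(A, B):
        # A is m x n and B is n x p, so the product C = AB is m x p.
        m, n = A.shape
        n2, p = B.shape
        assert n == n2, "columns of A must equal rows of B"
        C = np.zeros((m, p))
        for i in range(m):
            for j in range(p):
                # Entry (i, j) is the dot product of row i of A with column j of B.
                C[i, j] = sum(A[i, k] * B[k, j] for k in range(n))
        return C

    A = np.array([[1, 2, 3], [4, 5, 6]])        # 2 x 3
    B = np.array([[7, 8], [9, 10], [11, 12]])   # 3 x 2
    print(matmul(A, B))                         # equals A @ B, a 2 x 2 matrix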
Symmetry

Symmetry in everyday language refers to a sense of harmonious and beautiful proportion and balance. In mathematics, the term has a more precise definition and is usually used to refer to an object that is invariant under some transformations, such as translation, reflection, rotation, or scaling. Although these two meanings of the word can sometimes be told apart, they are intricately related, and hence are discussed together in this article.
Matrix exponential

In mathematics, the matrix exponential is a matrix function on square matrices analogous to the ordinary exponential function. It is used to solve systems of linear differential equations. In the theory of Lie groups, the matrix exponential gives the exponential map between a matrix Lie algebra and the corresponding Lie group. Let X be an n×n real or complex matrix. The exponential of X, denoted by e^X or exp(X), is the n×n matrix given by the power series

    e^X = Σ_{k=0}^∞ (1/k!) X^k,

where X^0 is defined to be the identity matrix I with the same dimensions as X.
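A small sketch of this power series, assuming NumPy and SciPy are available (exp_series and the truncation count are illustrative; scipy.linalg.expm is the standard library routine):

    import numpy as np
    from scipy.linalg import expm

    def exp_series(X, terms=30):
        # Truncated power series: sum of X^k / k! for k = 0 .. terms-1.
        result = np.eye(X.shape[0])   # X^0 is the identity
        term = np.eye(X.shape[0])
        for k in range(1, terms):
            term = term @ X / k       # builds X^k / k! incrementally
            result = result + term
        return result

    X = np.array([[0.0, 1.0], [-1.0, 0.0]])   # generator of plane rotations
    print(exp_series(X))
    print(expm(X))                            # SciPy's reference implementation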
Matrix decomposition

In the mathematical discipline of linear algebra, a matrix decomposition or matrix factorization is a factorization of a matrix into a product of matrices. There are many different matrix decompositions; each finds use among a particular class of problems. In numerical analysis, different decompositions are used to implement efficient matrix algorithms. For instance, when solving a system of linear equations Ax = b, the matrix A can be decomposed via the LU decomposition.
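As a sketch of this use, SciPy exposes the LU factors directly, so A can be factored once and the factors reused for any right-hand side (the numbers below are arbitrary):

    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    A = np.array([[4.0, 3.0],
                  [6.0, 3.0]])
    b = np.array([10.0, 12.0])

    # Factor once (PA = LU), then solve by cheap triangular substitutions.
    lu, piv = lu_factor(A)
    x = lu_solve((lu, piv), b)
    print(x, np.allclose(A @ x, b))   # solution and a residual check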
Bilinear form

In mathematics, a bilinear form is a bilinear map V × V → K on a vector space V (the elements of which are called vectors) over a field K (the elements of which are called scalars). In other words, a bilinear form is a function B : V × V → K that is linear in each argument separately:

    B(u + v, w) = B(u, w) + B(v, w) and B(λu, v) = λB(u, v)
    B(u, v + w) = B(u, v) + B(u, w) and B(u, λv) = λB(u, v)

The dot product on R^n is an example of a bilinear form.
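On R^n, any n×n matrix M defines a bilinear form B(u, v) = uᵀMv, with M = I recovering the dot product. The sketch below checks the linearity properties numerically (M and the test vectors are arbitrary choices):

    import numpy as np

    M = np.array([[2.0, 1.0], [1.0, 3.0]])   # any n x n matrix works

    def B(u, v):
        return u @ M @ v   # the bilinear form u^T M v

    rng = np.random.default_rng(0)
    u, v, w = rng.random(2), rng.random(2), rng.random(2)
    lam = 0.7
    print(np.isclose(B(u + v, w), B(u, w) + B(v, w)))   # additive in the first slot
    print(np.isclose(B(lam * u, v), lam * B(u, v)))     # homogeneous in the first slot
    print(np.isclose(B(u, v + w), B(u, v) + B(u, w)))   # additive in the second slot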
Symmetric polynomial

In mathematics, a symmetric polynomial is a polynomial P(X1, X2, ..., Xn) in n variables, such that if any of the variables are interchanged, one obtains the same polynomial. Formally, P is a symmetric polynomial if for any permutation σ of the subscripts 1, 2, ..., n one has P(Xσ(1), Xσ(2), ..., Xσ(n)) = P(X1, X2, ..., Xn). Symmetric polynomials arise naturally in the study of the relation between the roots of a polynomial in one variable and its coefficients, since the coefficients can be given by polynomial expressions in the roots, and all roots play a similar role in this setting.
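The defining condition can be checked mechanically; here is a sketch using SymPy (is_symmetric is an illustrative helper, not a library function):

    from itertools import permutations
    from sympy import symbols, simplify

    xs = symbols("x1 x2 x3")

    def is_symmetric(P, xs):
        # P is symmetric iff substituting every permutation of the
        # variables leaves the polynomial unchanged.
        return all(
            simplify(P.subs(list(zip(xs, perm)), simultaneous=True) - P) == 0
            for perm in permutations(xs)
        )

    x1, x2, x3 = xs
    print(is_symmetric(x1*x2 + x1*x3 + x2*x3, xs))   # True: elementary symmetric e2
    print(is_symmetric(x1**2 + x2, xs))              # False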
Matrix ring

In abstract algebra, a matrix ring is a set of matrices with entries in a ring R that form a ring under matrix addition and matrix multiplication. The set of all n × n matrices with entries in R is a matrix ring denoted Mn(R) (alternative notations: Matn(R) and Rn×n). Some sets of infinite matrices form infinite matrix rings. Any subring of a matrix ring is a matrix ring. Over a rng, one can form matrix rngs. When R is a commutative ring, the matrix ring Mn(R) is an associative algebra over R, and may be called a matrix algebra.
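A tiny illustration, taking the coefficient ring to be Z (integer NumPy arrays stand in for elements of M2(Z)):

    import numpy as np

    # Two elements of the matrix ring M2(Z): 2 x 2 matrices with integer entries.
    A = np.array([[1, 2], [3, 4]])
    B = np.array([[0, 1], [1, 0]])

    # The ring operations are matrix addition and matrix multiplication;
    # both results again have integer entries, so they lie in M2(Z).
    print(A + B)
    print(A @ B)

    # Multiplication is generally not commutative, so Mn(R) is a
    # noncommutative ring for n >= 2 (when R is nontrivial).
    print(np.array_equal(A @ B, B @ A))   # False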
Skew-Hermitian matrix

In linear algebra, a square matrix with complex entries is said to be skew-Hermitian or anti-Hermitian if its conjugate transpose is the negative of the original matrix. That is, the matrix A is skew-Hermitian if it satisfies the relation

    A^H = −A,

where A^H denotes the conjugate transpose of the matrix A. In component form, this means that a_ji = −(a_ij)* for all indices i and j, where a_ij is the element in the i-th row and j-th column of A, and the asterisk denotes complex conjugation.
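A quick numerical check of the definition, as a sketch in NumPy (is_skew_hermitian is an illustrative name):

    import numpy as np

    def is_skew_hermitian(A, tol=1e-12):
        # A is skew-Hermitian iff its conjugate transpose equals -A.
        return np.allclose(A.conj().T, -A, atol=tol)

    # Diagonal entries are purely imaginary and a_ji = -conj(a_ij) off the diagonal.
    A = np.array([[1j, 2 + 1j],
                  [-2 + 1j, 3j]])
    print(is_skew_hermitian(A))   # True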
Symmetric function

In mathematics, a function of n variables is symmetric if its value is the same no matter the order of its arguments. For example, a function f of two arguments is a symmetric function if and only if f(x, y) = f(y, x) for all x and y such that both (x, y) and (y, x) are in the domain of f. The most commonly encountered symmetric functions are polynomial functions, which are given by the symmetric polynomials. A related notion is alternating polynomials, which change sign under an interchange of variables.
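For instance (a deliberately tiny illustration):

    def f(x, y):
        return x*y + x + y   # symmetric: swapping the arguments leaves the value unchanged

    def g(x, y):
        return x - y         # not symmetric

    print(f(2, 5) == f(5, 2))   # True
    print(g(2, 5) == g(5, 2))   # False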
Tridiagonal matrix

In linear algebra, a tridiagonal matrix is a band matrix that has nonzero elements only on the main diagonal, the subdiagonal (the first diagonal below the main diagonal), and the supradiagonal (the first diagonal above it). For example, the following matrix is tridiagonal:

    [ 1 4 0 0 ]
    [ 3 4 1 0 ]
    [ 0 2 3 4 ]
    [ 0 0 1 3 ]

The determinant of a tridiagonal matrix is given by the continuant of its elements. An orthogonal transformation of a symmetric (or Hermitian) matrix to tridiagonal form can be done with the Lanczos algorithm.
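The continuant satisfies a three-term recurrence, so the determinant can be computed in O(n) time; a sketch (tridiag_det is an illustrative helper):

    import numpy as np

    def tridiag_det(a, b, c):
        # Continuant recurrence: f_k = a_k * f_{k-1} - c_{k-1} * b_{k-1} * f_{k-2},
        # with f_0 = 1 and f_1 = a_1.
        # a: main diagonal (length n); b: supradiagonal, c: subdiagonal (length n-1).
        f_prev2, f_prev1 = 1.0, a[0]
        for k in range(1, len(a)):
            f_prev2, f_prev1 = f_prev1, a[k] * f_prev1 - c[k - 1] * b[k - 1] * f_prev2
        return f_prev1

    a = [2.0, 2.0, 2.0]   # main diagonal
    b = [1.0, 1.0]        # supradiagonal
    c = [1.0, 1.0]        # subdiagonal
    A = np.diag(a) + np.diag(b, 1) + np.diag(c, -1)
    print(tridiag_det(a, b, c), np.linalg.det(A))   # both 4.0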