Block matrix
In mathematics, a block matrix or a partitioned matrix is a matrix that is interpreted as having been broken into sections called blocks or submatrices. Intuitively, a matrix interpreted as a block matrix can be visualized as the original matrix with a collection of horizontal and vertical lines, which break it up, or partition it, into a collection of smaller matrices. Any matrix may be interpreted as a block matrix in one or more ways, with each interpretation defined by how its rows and columns are partitioned.
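As a small illustration (a NumPy sketch with arbitrary example blocks, not part of the article's formal treatment), the following assembles a 4 × 4 matrix from four 2 × 2 blocks:

```python
import numpy as np

# Four 2x2 blocks; the names and values are arbitrary examples.
A = np.array([[1, 2], [3, 4]])
B = np.zeros((2, 2))
C = np.eye(2)
D = np.array([[5, 6], [7, 8]])

# np.block stitches the blocks together into one 4x4 matrix,
# mirroring the horizontal and vertical partitioning lines.
M = np.block([[A, B],
              [C, D]])
print(M)
```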
Definite matrix
In mathematics, a symmetric matrix $M$ with real entries is positive-definite if the real number $\mathbf{x}^\mathsf{T} M \mathbf{x}$ is positive for every nonzero real column vector $\mathbf{x}$, where $\mathbf{x}^\mathsf{T}$ is the transpose of $\mathbf{x}$. More generally, a Hermitian matrix (that is, a complex matrix equal to its conjugate transpose) is positive-definite if the real number $\mathbf{z}^* M \mathbf{z}$ is positive for every nonzero complex column vector $\mathbf{z}$, where $\mathbf{z}^*$ denotes the conjugate transpose of $\mathbf{z}$. Positive semi-definite matrices are defined similarly, except that the scalars $\mathbf{x}^\mathsf{T} M \mathbf{x}$ and $\mathbf{z}^* M \mathbf{z}$ are required to be positive or zero (that is, nonnegative).
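A minimal numerical sketch of the definition, assuming NumPy and an arbitrary randomly generated matrix: it builds a symmetric matrix that is positive-definite by construction, then checks both the quadratic-form condition and the equivalent all-eigenvalues-positive criterion.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
M = A.T @ A + np.eye(3)   # A^T A is positive semi-definite; adding I makes M positive-definite

# Quadratic-form test: x^T M x > 0 for a few nonzero vectors.
for _ in range(5):
    x = rng.standard_normal(3)
    assert x @ M @ x > 0

# Equivalent criterion for symmetric matrices: all eigenvalues are positive.
print(np.linalg.eigvalsh(M))   # every entry should be > 0
```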
Nilpotent matrix
In linear algebra, a nilpotent matrix is a square matrix $N$ such that $N^k = 0$ for some positive integer $k$. The smallest such $k$ is called the index of $N$, sometimes the degree of $N$. More generally, a nilpotent transformation is a linear transformation $L$ of a vector space such that $L^k = 0$ for some positive integer $k$ (and thus, $L^j = 0$ for all $j \geq k$). Both of these concepts are special cases of a more general concept of nilpotence that applies to elements of rings. The matrix $A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ is nilpotent with index 2, since $A^2 = 0$.
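A short NumPy check of the 2 × 2 example above:

```python
import numpy as np

# The 2x2 example from the text: nilpotent with index 2.
A = np.array([[0, 1],
              [0, 0]])

print(A @ A)                 # the zero matrix, so A^2 = 0
print(np.all(A @ A == 0))    # True
```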
Circulant matrix
In linear algebra, a circulant matrix is a square matrix in which all row vectors are composed of the same elements and each row vector is rotated one element to the right relative to the preceding row vector. It is a particular kind of Toeplitz matrix. In numerical analysis, circulant matrices are important because they are diagonalized by a discrete Fourier transform, and hence linear equations that contain them may be quickly solved using a fast Fourier transform.
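A minimal sketch of the FFT-based solve, assuming SciPy's `circulant` helper and an arbitrary first column `c`: because the DFT diagonalizes the circulant matrix, the linear system reduces to a pointwise division in Fourier space.

```python
import numpy as np
from scipy.linalg import circulant

c = np.array([4.0, 1.0, 0.0, 1.0])   # first column of the circulant matrix
C = circulant(c)
b = np.array([1.0, 2.0, 3.0, 4.0])

# C x is the circular convolution of c with x, so in Fourier space
# C x = b becomes fft(c) * fft(x) = fft(b), i.e. x = ifft(fft(b) / fft(c)).
x_fft = np.real(np.fft.ifft(np.fft.fft(b) / np.fft.fft(c)))

# Compare with a direct dense solve.
x_direct = np.linalg.solve(C, b)
print(np.allclose(x_fft, x_direct))   # True
```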
Companion matrix
In linear algebra, the Frobenius companion matrix of the monic polynomial $p(x) = c_0 + c_1 x + \cdots + c_{n-1} x^{n-1} + x^n$ is the square matrix defined as
$$C(p) = \begin{pmatrix} 0 & 0 & \cdots & 0 & -c_0 \\ 1 & 0 & \cdots & 0 & -c_1 \\ 0 & 1 & \cdots & 0 & -c_2 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & 1 & -c_{n-1} \end{pmatrix}.$$
Some authors use the transpose of this matrix, $C(p)^\mathsf{T}$, which is more convenient for some purposes such as linear recurrence relations (see below). $C(p)$ is defined from the coefficients of $p(x)$, while the characteristic polynomial as well as the minimal polynomial of $C(p)$ are equal to $p(x)$. In this sense, the matrix $C(p)$ and the polynomial $p(x)$ are "companions". Any matrix $A$ with entries in a field $F$ has characteristic polynomial $p(x) = \det(xI - A)$, which in turn has companion matrix $C(p)$.
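A small sketch, assuming SciPy and an arbitrary monic cubic: it builds the companion matrix of $p(x) = x^3 - 6x^2 + 11x - 6$ (note that SciPy uses the transposed convention mentioned above) and checks that its eigenvalues agree with the roots of $p$.

```python
import numpy as np
from scipy.linalg import companion

# Coefficients of the monic polynomial p(x) = x^3 - 6x^2 + 11x - 6,
# listed from highest to lowest degree as scipy expects.
coeffs = [1, -6, 11, -6]

C = companion(coeffs)
print(C)

# The eigenvalues of the companion matrix are exactly the roots of p.
print(np.sort(np.linalg.eigvals(C)))   # approximately [1, 2, 3]
print(np.sort(np.roots(coeffs)))       # approximately [1, 2, 3]
```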
Commuting matrices
In linear algebra, two matrices $A$ and $B$ are said to commute if $AB = BA$, or equivalently if their commutator $AB - BA$ is zero. A set of matrices is said to commute if they commute pairwise, meaning that every pair of matrices in the set commute with each other. Commuting matrices preserve each other's eigenspaces. As a consequence, commuting matrices over an algebraically closed field are simultaneously triangularizable; that is, there are bases over which they are both upper triangular.
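A quick NumPy sketch of the definition, using the fact that any polynomial in a matrix commutes with that matrix (the matrix and coefficients here are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))

# Any polynomial in A commutes with A; here B = A^2 + 3A + 2I.
B = A @ A + 3 * A + 2 * np.eye(3)

commutator = A @ B - B @ A
print(np.allclose(commutator, 0))   # True (up to floating-point error)
```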
Defective matrix
In linear algebra, a defective matrix is a square matrix that does not have a complete basis of eigenvectors, and is therefore not diagonalizable. In particular, an n × n matrix is defective if and only if it does not have n linearly independent eigenvectors. A complete basis is formed by augmenting the eigenvectors with generalized eigenvectors, which are necessary for solving defective systems of ordinary differential equations and other problems.
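A sketch of the simplest defective example, a 2 × 2 Jordan block, assuming NumPy; numerically the two eigenvectors returned are parallel, so they do not span the plane (the exact rank reported can depend on floating-point tolerances).

```python
import numpy as np

# A 2x2 Jordan block: eigenvalue 1 with algebraic multiplicity 2
# but geometric multiplicity 1, so the matrix is defective.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                           # [1., 1.]
# The eigenvector columns are (numerically) parallel, so they do not
# form a basis of R^2; the rank is typically reported as 1, not 2.
print(np.linalg.matrix_rank(eigvecs))
```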
Transformation matrix
In linear algebra, linear transformations can be represented by matrices. If $T$ is a linear transformation mapping $\mathbb{R}^n$ to $\mathbb{R}^m$ and $\mathbf{x}$ is a column vector with $n$ entries, then $T(\mathbf{x}) = A\mathbf{x}$ for some $m \times n$ matrix $A$, called the transformation matrix of $T$. Note that $A$ has $m$ rows and $n$ columns, whereas the transformation $T$ is from $\mathbb{R}^n$ to $\mathbb{R}^m$. There are alternative expressions of transformation matrices involving row vectors that are preferred by some authors. Matrices allow arbitrary linear transformations to be displayed in a consistent format, suitable for computation.
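For illustration, a NumPy sketch of a standard transformation matrix, the counter-clockwise rotation by 90° in the plane, applied to a column vector:

```python
import numpy as np

# Transformation matrix of a counter-clockwise rotation by 90 degrees in R^2.
theta = np.pi / 2
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

x = np.array([1.0, 0.0])   # column vector with 2 entries
print(A @ x)               # approximately [0., 1.]: the rotated vector T(x) = A x
```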
Polar decomposition
In mathematics, the polar decomposition of a square real or complex matrix $A$ is a factorization of the form $A = UP$, where $U$ is a unitary matrix and $P$ is a positive semi-definite Hermitian matrix ($U$ is an orthogonal matrix and $P$ is a positive semi-definite symmetric matrix in the real case), both square and of the same size. Intuitively, if a real $n \times n$ matrix $A$ is interpreted as a linear transformation of $n$-dimensional space $\mathbb{R}^n$, the polar decomposition separates it into a rotation or reflection $U$ of $\mathbb{R}^n$, and a scaling of the space along a set of $n$ orthogonal axes.
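A minimal sketch, assuming SciPy's `polar` routine and an arbitrary random real matrix: it computes the right polar decomposition and verifies the properties of the two factors.

```python
import numpy as np
from scipy.linalg import polar

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))

U, P = polar(A)   # right polar decomposition A = U P

print(np.allclose(A, U @ P))                      # the factorization holds
print(np.allclose(U.T @ U, np.eye(3)))            # U is orthogonal (real case)
print(np.allclose(P, P.T))                        # P is symmetric
print(np.all(np.linalg.eigvalsh(P) >= -1e-12))    # P is positive semi-definite
```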
Schur decomposition
In the mathematical discipline of linear algebra, the Schur decomposition or Schur triangulation, named after Issai Schur, is a matrix decomposition. It allows one to write an arbitrary complex square matrix as unitarily equivalent to an upper triangular matrix whose diagonal elements are the eigenvalues of the original matrix. The Schur decomposition reads as follows: if $A$ is an n × n square matrix with complex entries, then $A$ can be expressed as $A = Q U Q^{-1}$, where $Q$ is a unitary matrix (so that its inverse $Q^{-1}$ is also the conjugate transpose $Q^*$ of $Q$), and $U$ is an upper triangular matrix, which is called a Schur form of $A$.
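A short sketch, assuming SciPy's `schur` routine and an arbitrary random matrix: requesting the complex Schur form gives a genuinely upper triangular $U$, whose diagonal carries the eigenvalues of $A$.

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))

# output='complex' forces an upper triangular Schur form,
# even when A has complex eigenvalues.
U, Q = schur(A, output='complex')

print(np.allclose(A, Q @ U @ Q.conj().T))       # A = Q U Q*
print(np.allclose(Q.conj().T @ Q, np.eye(4)))   # Q is unitary
# The diagonal of U contains the eigenvalues of A.
print(np.sort_complex(np.diag(U)))
print(np.sort_complex(np.linalg.eigvals(A)))
```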