Conjugate transpose
In mathematics, the conjugate transpose, also known as the Hermitian transpose, of an $m \times n$ complex matrix $\mathbf{A}$ is an $n \times m$ matrix obtained by transposing $\mathbf{A}$ and applying complex conjugation to each entry (the complex conjugate of $a + ib$ being $a - ib$, for real numbers $a$ and $b$). It is often denoted as $\mathbf{A}^{\mathrm{H}}$ or $\mathbf{A}^{*}$ or $\mathbf{A}'$, and very commonly in physics as $\mathbf{A}^{\dagger}$. For real matrices, the conjugate transpose is just the transpose, $\mathbf{A}^{\mathrm{H}} = \mathbf{A}^{\mathsf{T}}$. The conjugate transpose of an $m \times n$ matrix $\mathbf{A}$ is formally defined by $\left(\mathbf{A}^{\mathrm{H}}\right)_{ij} = \overline{\mathbf{A}_{ji}}$, where the subscript $ij$ denotes the $(i,j)$-th entry, for $1 \le i \le n$ and $1 \le j \le m$, and the overbar denotes a scalar complex conjugate.
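A quick NumPy sketch of the definition (the matrix entries below are arbitrary illustrative values):

```python
import numpy as np

# A 2x3 complex matrix with arbitrary entries.
A = np.array([[1 + 2j, 3 - 1j, 0 + 0j],
              [4 + 0j, 5 + 5j, 6 - 2j]])

# Conjugate transpose: transpose, then conjugate each entry
# (the two operations commute, so the order does not matter).
A_H = A.conj().T

print(A_H.shape)                     # (3, 2): an m x n matrix becomes n x m
print(np.allclose(A_H, A.T.conj()))  # True

# For a real matrix the conjugate transpose is just the transpose.
B = np.array([[1.0, 2.0], [3.0, 4.0]])
print(np.array_equal(B.conj().T, B.T))  # True
```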
Main diagonal
In linear algebra, the main diagonal (sometimes principal diagonal, primary diagonal, leading diagonal, major diagonal, or good diagonal) of a matrix $A$ is the list of entries $a_{i,j}$ where $i = j$. All off-diagonal elements are zero in a diagonal matrix. The antidiagonal (sometimes counter diagonal, secondary diagonal, trailing diagonal, minor diagonal, off diagonal, or bad diagonal) of an order-$N$ square matrix is the collection of entries $a_{i,j}$ such that $i + j = N + 1$ for all $1 \le i, j \le N$.
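A minimal NumPy illustration of both diagonals (note that NumPy indexes from 0, so the 1-based antidiagonal condition $i + j = N + 1$ becomes $i + j = N - 1$):

```python
import numpy as np

N = 4
A = np.arange(1, N * N + 1).reshape(N, N)

main_diag = np.diag(A)             # entries a[i, i]
anti_diag = np.diag(np.fliplr(A))  # entries a[i, N-1-i], i.e. i + j = N - 1

print(main_diag)  # [ 1  6 11 16]
print(anti_diag)  # [ 4  7 10 13]
```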
Nilpotent matrix
In linear algebra, a nilpotent matrix is a square matrix $N$ such that $N^k = 0$ for some positive integer $k$. The smallest such $k$ is called the index of $N$, sometimes the degree of $N$. More generally, a nilpotent transformation is a linear transformation $L$ of a vector space such that $L^k = 0$ for some positive integer $k$ (and thus, $L^j = 0$ for all $j \ge k$). Both of these concepts are special cases of a more general concept of nilpotence that applies to elements of rings. The matrix $A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ is nilpotent with index 2, since $A^2 = 0$.
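A small NumPy sketch of the definition; the helper name nilpotency_index is my own, not a library function. It relies on the standard fact that an $n \times n$ matrix is nilpotent if and only if $M^n = 0$, so only $n$ powers need checking:

```python
import numpy as np

def nilpotency_index(M, tol=1e-12):
    """Return the smallest k with M^k = 0, or None if M is not nilpotent.

    Hypothetical helper for illustration. An n x n matrix is nilpotent
    iff M^n = 0, so checking powers 1..n suffices.
    """
    n = M.shape[0]
    P = np.eye(n)
    for k in range(1, n + 1):
        P = P @ M  # P is now M^k
        if np.all(np.abs(P) < tol):
            return k
    return None

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
print(nilpotency_index(A))  # 2, matching the example above

# A strictly upper triangular matrix is always nilpotent.
B = np.triu(np.ones((3, 3)), k=1)
print(nilpotency_index(B))  # 3
```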
Trace (linear algebra)
In linear algebra, the trace of a square matrix $A$, denoted $\operatorname{tr}(A)$, is defined to be the sum of the elements on the main diagonal (from the upper left to the lower right) of $A$. The trace is only defined for a square matrix ($n \times n$). It can be proven that the trace of a matrix is the sum of its (complex) eigenvalues (counted with multiplicities). It can also be proven that $\operatorname{tr}(AB) = \operatorname{tr}(BA)$ for any $m \times n$ matrix $A$ and $n \times m$ matrix $B$ (so that both products are defined). This implies that similar matrices have the same trace.
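Both claims are easy to spot-check numerically; a minimal NumPy sketch with arbitrary random matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))
B = rng.standard_normal((5, 3))

# tr(AB) = tr(BA) even though AB is 3x3 while BA is 5x5.
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))  # True

# For a square matrix, the trace equals the sum of the eigenvalues
# (the imaginary parts of conjugate pairs cancel for a real matrix).
M = rng.standard_normal((4, 4))
print(np.isclose(np.trace(M), np.linalg.eigvals(M).sum().real))  # True
```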
Weight (representation theory)
In the mathematical field of representation theory, a weight of an algebra $A$ over a field $F$ is an algebra homomorphism from $A$ to $F$, or equivalently, a one-dimensional representation of $A$ over $F$. It is the algebra analogue of a multiplicative character of a group. The importance of the concept, however, stems from its application to representations of Lie algebras and hence also to representations of algebraic and Lie groups. In this context, a weight of a representation is a generalization of the notion of an eigenvalue, and the corresponding eigenspace is called a weight space.
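A standard concrete illustration (not from the original entry): if $A$ is the commutative algebra of $n \times n$ diagonal matrices over $F$, then for each $i$ the coordinate projection $\lambda_i : \operatorname{diag}(a_1, \dots, a_n) \mapsto a_i$ is an algebra homomorphism $A \to F$, since diagonal matrices add and multiply entrywise; each $\lambda_i$ is therefore a weight of $A$.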
Laplace expansion
In linear algebra, the Laplace expansion, named after Pierre-Simon Laplace, also called cofactor expansion, is an expression of the determinant of an $n \times n$ matrix $B$ as a weighted sum of minors, which are the determinants of some $(n-1) \times (n-1)$ submatrices of $B$. Specifically, for every $i$, $\det(B) = \sum_{j=1}^{n} (-1)^{i+j} b_{i,j} m_{i,j}$, where $b_{i,j}$ is the entry of the $i$th row and $j$th column of $B$, and $m_{i,j}$ is the determinant of the submatrix obtained by removing the $i$th row and the $j$th column of $B$. The term $(-1)^{i+j} m_{i,j}$ is called the cofactor of $b_{i,j}$ in $B$.
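A direct transcription of the formula into code, expanding along the first row; the function name det_laplace is my own, and the $O(n!)$ recursion is for illustration only, not a practical determinant algorithm:

```python
import numpy as np

def det_laplace(B):
    """Determinant by cofactor expansion along the first row (i = 1)."""
    n = B.shape[0]
    if n == 1:
        return B[0, 0]
    total = 0.0
    for j in range(n):
        # Minor m_{1,j}: delete row 0 and column j (0-based indices,
        # so (-1)**j plays the role of (-1)^{1+j}).
        minor = np.delete(np.delete(B, 0, axis=0), j, axis=1)
        total += (-1) ** j * B[0, j] * det_laplace(minor)
    return total

B = np.array([[2.0, 1.0, 3.0],
              [0.0, 4.0, 5.0],
              [1.0, 0.0, 6.0]])
print(det_laplace(B))    # 41.0
print(np.linalg.det(B))  # ~41.0, agreeing up to rounding
```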
Flag (linear algebra)
In mathematics, particularly in linear algebra, a flag is an increasing sequence of subspaces of a finite-dimensional vector space $V$. Here "increasing" means each is a proper subspace of the next (see filtration): $\{0\} = V_0 \subset V_1 \subset \cdots \subset V_k = V$. The term flag is motivated by a particular example resembling a flag: the zero point, a line, and a plane correspond to a nail, a staff, and a sheet of fabric. If we write $\dim V_i = d_i$ then we have $0 = d_0 < d_1 < \cdots < d_k = n$, where $n$ is the dimension of $V$ (assumed to be finite). Hence, we must have $k \le n$.
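A standard example (not from the original entry): in $\mathbb{R}^3$ with standard basis $e_1, e_2, e_3$, the chain $\{0\} \subset \operatorname{span}(e_1) \subset \operatorname{span}(e_1, e_2) \subset \mathbb{R}^3$ is a flag with dimensions $0 < 1 < 2 < 3$; a flag attaining the bound $k = n$, as here, is called a complete flag.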
Elementary matrix
In mathematics, an elementary matrix is a matrix which differs from the identity matrix by one single elementary row operation. The elementary matrices generate the general linear group $\mathrm{GL}_n(F)$ when $F$ is a field. Left multiplication (pre-multiplication) by an elementary matrix represents elementary row operations, while right multiplication (post-multiplication) represents elementary column operations. Elementary row operations are used in Gaussian elimination to reduce a matrix to row echelon form.
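A minimal NumPy sketch (matrix values arbitrary) of row operations as left multiplication and column operations as right multiplication:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Elementary matrix for "add -3 * (row 0) to row 1": apply that single
# row operation to the 3x3 identity.
E = np.eye(3)
E[1, 0] = -3.0

# Left multiplication performs the same row operation on A,
# as in one step of Gaussian elimination.
print(E @ A)
# [[ 1.  2.]
#  [ 0. -2.]
#  [ 5.  6.]]

# Right multiplication by a 2x2 elementary matrix acts on columns:
# here, the identity with its two columns swapped.
S = np.eye(2)
S[:, [0, 1]] = S[:, [1, 0]]
print(A @ S)  # the columns of A are swapped
```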
Lie's theorem
In mathematics, specifically the theory of Lie algebras, Lie's theorem states that, over an algebraically closed field of characteristic zero, if $\pi : \mathfrak{g} \to \mathfrak{gl}(V)$ is a finite-dimensional representation of a solvable Lie algebra, then there is a flag $V = V_0 \supset V_1 \supset \cdots \supset V_n = 0$ of invariant subspaces of $V$ with $\operatorname{codim} V_i = i$, meaning that $\pi(X) V_i \subseteq V_i$ for each $X \in \mathfrak{g}$ and each $i$. Put another way, the theorem says there is a basis for $V$ such that all linear transformations in $\pi(\mathfrak{g})$ are represented by upper triangular matrices.
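A quick concrete instance (an illustration, not part of the original entry): the Lie algebra $\mathfrak{b}$ of upper triangular $2 \times 2$ complex matrices is solvable, and in its defining representation on $V = \mathbb{C}^2$ the flag $\mathbb{C}^2 \supset \operatorname{span}(e_1) \supset 0$ is invariant, since every upper triangular matrix sends $e_1$ to a multiple of itself. Lie's theorem asserts that such a flag always exists for a solvable $\mathfrak{g}$, after a suitable choice of basis.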
Invariant subspace
In mathematics, an invariant subspace of a linear mapping $T : V \to V$, i.e. from some vector space $V$ to itself, is a subspace $W$ of $V$ that is preserved by $T$; that is, $T(W) \subseteq W$. An invariant subspace $W$ of $T$ thus has the property that all vectors $v \in W$ are transformed by $T$ into vectors also contained in $W$. This can be stated as $v \in W \implies T(v) \in W$. Consider a one-dimensional invariant subspace $U$. A basis of a 1-dimensional space is simply a non-zero vector $b$. Consequently, any vector $v \in U$ can be represented as $v = \lambda b$ where $\lambda$ is a scalar. Since $T$ maps every vector in $U$ into $U$, in particular $T(b) = \alpha b$ for some scalar $\alpha$; that is, $b$ is an eigenvector of $T$, and the one-dimensional invariant subspaces are exactly the lines spanned by eigenvectors.
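A small NumPy illustration with an arbitrary upper triangular $T$:

```python
import numpy as np

T = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# W = span(e1) is invariant under T: T maps (x, 0) to (2x, 0).
b = np.array([1.0, 0.0])
Tb = T @ b
print(Tb)                        # [2. 0.] -- still on the line span(e1)
print(np.allclose(Tb, 2.0 * b))  # True: b is an eigenvector, alpha = 2

# span(e2) is NOT invariant: T maps (0, 1) to (1, 3), which leaves the line.
e2 = np.array([0.0, 1.0])
print(T @ e2)                    # [1. 3.]
```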