Cartan subalgebra
In mathematics, a Cartan subalgebra, often abbreviated as CSA, is a nilpotent subalgebra 𝔥 of a Lie algebra 𝔤 that is self-normalising (if [x, y] ∈ 𝔥 for all x ∈ 𝔥, then y ∈ 𝔥). They were introduced by Élie Cartan in his doctoral thesis. Cartan subalgebras control the representation theory of a semisimple Lie algebra over a field of characteristic 0. In a finite-dimensional semisimple Lie algebra over an algebraically closed field of characteristic zero (e.g., ℂ), a Cartan subalgebra is the same thing as a maximal abelian subalgebra consisting of elements x such that the adjoint endomorphism ad(x) : 𝔤 → 𝔤 is semisimple (i.e., diagonalizable).
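For illustration (a standard example, not taken from the text above): in 𝔰𝔩ₙ(ℂ) the traceless diagonal matrices form a Cartan subalgebra, and the adjoint action of this subalgebra is already diagonal on the elementary matrices. A LaTeX sketch:

    % Standard example: a Cartan subalgebra of sl_n(C).
    % The traceless diagonal matrices form an abelian subalgebra h, and each
    % ad(H), H in h, acts diagonally on the elementary matrices E_{ij} (i != j),
    % so h consists of semisimple elements and is maximal abelian, i.e. a Cartan subalgebra.
    \[
      \mathfrak{h} = \{ \operatorname{diag}(h_1, \dots, h_n) : h_1 + \dots + h_n = 0 \}
      \subset \mathfrak{sl}_n(\mathbb{C}),
      \qquad
      [H, E_{ij}] = (h_i - h_j)\, E_{ij} \quad (i \neq j).
    \]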
Semisimple Lie algebra
In mathematics, a Lie algebra is semisimple if it is a direct sum of simple Lie algebras. (A simple Lie algebra is a non-abelian Lie algebra without any non-zero proper ideals.) Throughout the article, unless otherwise stated, a Lie algebra is a finite-dimensional Lie algebra over a field of characteristic 0. For such a Lie algebra 𝔤, if nonzero, the following conditions are equivalent: 𝔤 is semisimple; the Killing form κ(x, y) = tr(ad(x)ad(y)) is non-degenerate; 𝔤 has no non-zero abelian ideals; 𝔤 has no non-zero solvable ideals; the radical (maximal solvable ideal) of 𝔤 is zero.
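As a concrete check of the Killing-form criterion, the following Python/NumPy sketch (an illustrative example, not part of the original text) computes κ on the standard basis e, h, f of 𝔰𝔩₂ and verifies that the resulting 3 × 3 matrix is non-degenerate:

    import numpy as np

    # Standard basis e, h, f of sl(2), realized as traceless 2x2 matrices.
    e = np.array([[0., 1.], [0., 0.]])
    h = np.array([[1., 0.], [0., -1.]])
    f = np.array([[0., 0.], [1., 0.]])
    basis = [e, h, f]

    def bracket(x, y):
        return x @ y - y @ x

    def coords(m):
        # A traceless 2x2 matrix [[a, b], [c, -a]] has coordinates (b, a, c)
        # with respect to the basis (e, h, f).
        return np.array([m[0, 1], m[0, 0], m[1, 0]])

    def ad(x):
        # Matrix of ad(x) = [x, -] with respect to the basis (e, h, f).
        return np.column_stack([coords(bracket(x, b)) for b in basis])

    # Killing form kappa(x, y) = tr(ad(x) ad(y)) evaluated on the basis.
    kappa = np.array([[np.trace(ad(x) @ ad(y)) for y in basis] for x in basis])
    print(kappa)                    # [[0, 0, 4], [0, 8, 0], [4, 0, 0]]
    print(np.linalg.det(kappa))     # -128.0: non-degenerate, so sl(2) is semisimple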
Cholesky decomposition
In linear algebra, the Cholesky decomposition or Cholesky factorization (pronounced /ʃəˈlɛski/) is a decomposition of a Hermitian, positive-definite matrix into the product of a lower triangular matrix and its conjugate transpose, which is useful for efficient numerical solutions, e.g., Monte Carlo simulations. It was discovered by André-Louis Cholesky for real matrices, and posthumously published in 1924. When it is applicable, the Cholesky decomposition is roughly twice as efficient as the LU decomposition for solving systems of linear equations.
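A minimal NumPy/SciPy sketch (illustrative, with made-up data) showing the factorization and the two triangular solves behind the efficiency claim:

    import numpy as np
    from scipy.linalg import solve_triangular

    # Build a symmetric positive-definite matrix, factor it, and solve A x = b.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((4, 4))
    A = M @ M.T + 4 * np.eye(4)        # symmetric positive-definite by construction
    L = np.linalg.cholesky(A)          # lower triangular, with A = L @ L.T
    assert np.allclose(A, L @ L.T)

    # Solving A x = b reduces to two triangular solves with the single factor L,
    # which is roughly half the work of an LU-based solve.
    b = rng.standard_normal(4)
    y = solve_triangular(L, b, lower=True)      # forward solve  L y = b
    x = solve_triangular(L.T, y, lower=False)   # backward solve L^T x = y
    assert np.allclose(A @ x, b)

    # Monte Carlo use-case: L maps i.i.d. standard normals to samples with covariance A.
    sample = L @ rng.standard_normal(4)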
Homomorphism
In algebra, a homomorphism is a structure-preserving map between two algebraic structures of the same type (such as two groups, two rings, or two vector spaces). The word homomorphism comes from the Ancient Greek language: ὁμός (homós) meaning "same" and μορφή (morphḗ) meaning "form" or "shape". However, the word was apparently introduced to mathematics due to a (mis)translation of the German ähnlich, meaning "similar", to ὁμός, meaning "same". The term "homomorphism" appeared as early as 1892, when it was attributed to the German mathematician Felix Klein (1849–1925).
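A tiny illustration of the definition (an assumed example, not from the text above): the determinant is a group homomorphism from invertible matrices under multiplication to the nonzero reals under multiplication, since det(AB) = det(A)det(B).

    import numpy as np

    # det : GL_n(R) -> (R \ {0}, *) preserves the group operation.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((3, 3))
    B = rng.standard_normal((3, 3))
    assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))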
Eigendecomposition of a matrix
In linear algebra, eigendecomposition is the factorization of a matrix into a canonical form, whereby the matrix is represented in terms of its eigenvalues and eigenvectors. Only diagonalizable matrices can be factorized in this way. When the matrix being factorized is a normal or real symmetric matrix, the decomposition is called "spectral decomposition", derived from the spectral theorem.
Eigenvalue, eigenvector and eigenspace
A (nonzero) vector v of dimension N is an eigenvector of a square N × N matrix A if it satisfies a linear equation of the form Av = λv for some scalar λ.
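A short NumPy sketch (illustrative numbers, not from the text above) of the factorization A = QΛQ⁻¹ and of the defining equation Av = λv:

    import numpy as np

    A = np.array([[2., 1.],
                  [1., 3.]])
    eigenvalues, Q = np.linalg.eig(A)            # columns of Q are eigenvectors
    Lam = np.diag(eigenvalues)
    assert np.allclose(A, Q @ Lam @ np.linalg.inv(Q))   # A = Q Lambda Q^{-1}

    # Each eigenpair satisfies the defining equation A v = lambda v.
    for lam, v in zip(eigenvalues, Q.T):
        assert np.allclose(A @ v, lam * v)

    # For a real symmetric (or Hermitian) matrix, the spectral decomposition
    # uses an orthonormal eigenbasis; numpy.linalg.eigh returns one.
    w, V = np.linalg.eigh(A)
    assert np.allclose(A, V @ np.diag(w) @ V.T)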
Polar decomposition
In mathematics, the polar decomposition of a square real or complex matrix A is a factorization of the form A = UP, where U is a unitary matrix and P is a positive semi-definite Hermitian matrix (U is an orthogonal matrix and P is a positive semi-definite symmetric matrix in the real case), both square and of the same size. Intuitively, if a real n × n matrix A is interpreted as a linear transformation of n-dimensional space ℝⁿ, the polar decomposition separates it into a rotation or reflection U of ℝⁿ, and a scaling of the space along a set of n orthogonal axes.
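A brief SciPy sketch of the real case (illustrative data, not from the text above), checking that A = UP with U orthogonal and P symmetric positive semi-definite:

    import numpy as np
    from scipy.linalg import polar

    A = np.array([[1., 2.],
                  [0., 3.]])
    U, P = polar(A, side='right')                    # A = U @ P
    assert np.allclose(A, U @ P)
    assert np.allclose(U.T @ U, np.eye(2))           # U is orthogonal
    assert np.allclose(P, P.T)                       # P is symmetric
    assert np.all(np.linalg.eigvalsh(P) >= -1e-12)   # P is positive semi-definite
    # P is the positive square root of A^T A, i.e. the "scaling" part;
    # U is the rotation/reflection part.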
Covering group
In mathematics, a covering group of a topological group H is a covering space G of H such that G is a topological group and the covering map p : G → H is a continuous group homomorphism. The map p is called the covering homomorphism. A frequently occurring case is a double covering group, a topological double cover in which the covering map p is two-to-one (its kernel has order 2); examples include the spin groups, pin groups, and metaplectic groups.
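Two standard examples (not specific to the text above), written as a LaTeX sketch: the real line covering the circle group, and the double cover SU(2) → SO(3) realizing the spin group Spin(3).

    % 1. The additive reals cover the circle group; the covering homomorphism
    %    wraps the line around the circle, with kernel Z:
    \[
      p : (\mathbb{R}, +) \to U(1), \qquad p(t) = e^{2\pi i t}, \qquad \ker p \cong \mathbb{Z}.
    \]
    % 2. A double covering group: the two-to-one covering homomorphism
    %    SU(2) -> SO(3) has kernel {I, -I}, exhibiting SU(2) as Spin(3).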
Orthogonal transformation
In linear algebra, an orthogonal transformation is a linear transformation T : V → V on a real inner product space V that preserves the inner product. That is, for each pair u, v of elements of V, we have ⟨u, v⟩ = ⟨Tu, Tv⟩. Since the lengths of vectors and the angles between them are defined through the inner product, orthogonal transformations preserve lengths of vectors and angles between them. In particular, orthogonal transformations map orthonormal bases to orthonormal bases. Orthogonal transformations are injective: if Tv = 0 then 0 = ⟨Tv, Tv⟩ = ⟨v, v⟩, hence v = 0, so the kernel of T is trivial.
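A small NumPy sketch (illustrative data, not from the text above): an orthogonal matrix obtained from a QR factorization preserves the standard inner product, hence lengths and angles.

    import numpy as np

    rng = np.random.default_rng(2)
    Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))    # Q is orthogonal: Q.T @ Q = I
    u = rng.standard_normal(3)
    v = rng.standard_normal(3)
    assert np.allclose(u @ v, (Q @ u) @ (Q @ v))                    # inner product preserved
    assert np.allclose(np.linalg.norm(u), np.linalg.norm(Q @ u))    # lengths preserved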
Commutative diagram
In mathematics, and especially in category theory, a commutative diagram is a diagram such that all directed paths in the diagram with the same start and endpoints lead to the same result. It is said that commutative diagrams play the role in category theory that equations play in algebra. A commutative diagram often consists of three parts: objects (also known as vertices), morphisms (also known as arrows or edges), and paths or composites. In algebra texts, the type of morphism can be denoted with different arrow usages: a monomorphism may be labeled with a ↪ or a ↣.
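As an illustration (an assumed example, not from the text above), a commuting square of linear maps: with B = PAP⁻¹ (a change of basis), the two composites around the square agree.

    import numpy as np

    # The square
    #     R^2 --A--> R^2
    #      |P         |P
    #      v          v
    #     R^2 --B--> R^2
    # commutes precisely when P o A = B o P.
    A = np.array([[2., 1.],
                  [0., 3.]])
    P = np.array([[1., 1.],
                  [0., 1.]])                 # invertible change of basis
    B = P @ A @ np.linalg.inv(P)
    assert np.allclose(P @ A, B @ P)         # the two paths give the same composite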
Table of Lie groups
This article gives a table of some common Lie groups and their associated Lie algebras. The following are noted: the topological properties of the group (dimension; connectedness; compactness; the nature of the fundamental group; and whether or not they are simply connected) as well as their algebraic properties (abelian; simple; semisimple). For more examples of Lie groups and other related topics, see the list of simple Lie groups; the Bianchi classification of groups of up to three dimensions; the classification of low-dimensional real Lie algebras for up to four dimensions; and the list of Lie group topics.