Hurwitz's theorem (composition algebras)

In mathematics, Hurwitz's theorem is a theorem of Adolf Hurwitz (1859–1919), published posthumously in 1923, solving the Hurwitz problem for finite-dimensional unital real non-associative algebras endowed with a positive-definite quadratic form. The theorem states that if the quadratic form defines a homomorphism into the positive real numbers on the non-zero part of the algebra, then the algebra must be isomorphic to the real numbers, the complex numbers, the quaternions, or the octonions.
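Stated in symbols (a sketch, writing N for the positive-definite quadratic form, as in the composition-algebra entry below):

```latex
N(xy) = N(x)\,N(y) \quad \text{for all } x, y \in A
\;\Longrightarrow\;
A \cong \mathbb{R},\ \mathbb{C},\ \mathbb{H},\ \text{or}\ \mathbb{O},
\qquad \dim_{\mathbb{R}} A \in \{1, 2, 4, 8\}.
```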
Composition algebra

In mathematics, a composition algebra A over a field K is a not necessarily associative algebra over K together with a nondegenerate quadratic form N that satisfies N(xy) = N(x)N(y) for all x and y in A. A composition algebra includes an involution called a conjugation: x ↦ x*. The quadratic form N is called the norm of the algebra. A composition algebra (A, ∗, N) is either a division algebra or a split algebra, depending on the existence of a non-zero v in A such that N(v) = 0, called a null vector. When x is not a null vector, the multiplicative inverse of x is x*/N(x).
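As a concrete illustration of these identities, here is a minimal Python sketch for the quaternions, a four-dimensional composition algebra over the reals; the Quaternion class, its conj method, and the function N are illustrative names, not from the source:

```python
from dataclasses import dataclass

@dataclass
class Quaternion:
    # q = a + b*i + c*j + d*k with real coefficients
    a: float
    b: float
    c: float
    d: float

    def __mul__(self, other):
        # Hamilton's product rules: i^2 = j^2 = k^2 = ijk = -1
        p, q = self, other
        return Quaternion(
            p.a*q.a - p.b*q.b - p.c*q.c - p.d*q.d,
            p.a*q.b + p.b*q.a + p.c*q.d - p.d*q.c,
            p.a*q.c - p.b*q.d + p.c*q.a + p.d*q.b,
            p.a*q.d + p.b*q.c - p.c*q.b + p.d*q.a,
        )

    def conj(self):
        # conjugation x -> x*: negate the imaginary part
        return Quaternion(self.a, -self.b, -self.c, -self.d)

def N(q):
    # the norm (quadratic form) of the algebra
    return q.a**2 + q.b**2 + q.c**2 + q.d**2

x = Quaternion(1.0, 2.0, -1.0, 0.5)
y = Quaternion(0.0, 1.0, 3.0, -2.0)

# the composition identity N(xy) = N(x) N(y)
assert abs(N(x * y) - N(x) * N(y)) < 1e-9

# the multiplicative inverse x*/N(x): multiplying x by it gives 1
xc = x.conj()
inv = Quaternion(xc.a / N(x), xc.b / N(x), xc.c / N(x), xc.d / N(x))
one = x * inv
assert abs(one.a - 1.0) < 1e-9 and abs(one.b) < 1e-9
print("composition identity and inverse formula verified")
```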
Biquaternion

In abstract algebra, the biquaternions are the numbers w + x i + y j + z k, where w, x, y, and z are complex numbers, or variants thereof, and the elements of {1, i, j, k} multiply as in the quaternion group and commute with their coefficients. There are three types of biquaternions corresponding to complex numbers and the variations thereof:

- Biquaternions, when the coefficients are complex numbers.
- Split-biquaternions, when the coefficients are split-complex numbers.
- Dual quaternions, when the coefficients are dual numbers.
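A minimal Python sketch of the first case, ordinary biquaternions with complex coefficients; the Biquaternion class is a hypothetical helper, and the checks confirm that the basis elements {1, i, j, k} still satisfy the quaternion relations while the coefficients are now complex:

```python
from dataclasses import dataclass

@dataclass
class Biquaternion:
    # w + x*i + y*j + z*k with complex coefficients w, x, y, z
    w: complex
    x: complex
    y: complex
    z: complex

    def __mul__(self, other):
        # {1, i, j, k} multiply as in the quaternions; the coefficients
        # commute with i, j, k, so the Hamilton formula applies verbatim.
        p, q = self, other
        return Biquaternion(
            p.w*q.w - p.x*q.x - p.y*q.y - p.z*q.z,
            p.w*q.x + p.x*q.w + p.y*q.z - p.z*q.y,
            p.w*q.y - p.x*q.z + p.y*q.w + p.z*q.x,
            p.w*q.z + p.x*q.y - p.y*q.x + p.z*q.w,
        )

i = Biquaternion(0, 1, 0, 0)
j = Biquaternion(0, 0, 1, 0)
k = Biquaternion(0, 0, 0, 1)

# the quaternion relations i^2 = -1 and ij = k still hold
assert i * i == Biquaternion(-1, 0, 0, 0)
assert i * j == k

# a biquaternion with genuinely complex coefficients
b = Biquaternion(1 + 2j, 0, 3 - 1j, 0)
print(b * b)
```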
Linear complex structure

In mathematics, a complex structure on a real vector space V is an automorphism of V that squares to the minus identity, −I. Such a structure on V allows one to define multiplication by complex scalars in a canonical fashion so as to regard V as a complex vector space. Every complex vector space can be equipped with a compatible complex structure, however, there is in general no canonical such structure. Complex structures have applications in representation theory as well as in complex geometry where they play an essential role in the definition of almost complex manifolds, by contrast to complex manifolds.
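A minimal numerical sketch of this definition (NumPy; the choice of J as the 90-degree rotation of R² and the helper complex_scale are illustrative):

```python
import numpy as np

# A complex structure on R^2: the standard rotation by 90 degrees.
J = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# J is an automorphism squaring to minus the identity: J @ J == -I.
assert np.allclose(J @ J, -np.eye(2))

def complex_scale(z: complex, v: np.ndarray) -> np.ndarray:
    # Multiplication by the complex scalar z = a + bi, defined via J:
    # (a + bi) . v := a*v + b*(J v).  This regards R^2 as a
    # one-dimensional complex vector space.
    return z.real * v + z.imag * (J @ v)

v = np.array([1.0, 0.0])
# Multiplying by i twice is the same as multiplying by -1.
assert np.allclose(complex_scale(1j, complex_scale(1j, v)), -v)
print("J^2 = -I and i.(i.v) = -v verified")
```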
Symplectic vector space

In mathematics, a symplectic vector space is a vector space V over a field F (for example the real numbers R) equipped with a symplectic bilinear form. A symplectic bilinear form is a mapping ω : V × V → F that is

- Bilinear: linear in each argument separately;
- Alternating: ω(v, v) = 0 holds for all v ∈ V; and
- Non-degenerate: ω(u, v) = 0 for all v ∈ V implies that u = 0.

If the underlying field has characteristic not 2, alternation is equivalent to skew-symmetry. If the characteristic is 2, skew-symmetry is implied by, but does not imply, alternation.
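A minimal NumPy sketch of the standard symplectic form on R⁴, checking the properties above (the matrix Omega and the helper omega are illustrative names, not from the source):

```python
import numpy as np

n = 2  # work in R^(2n) = R^4

# Matrix of the standard symplectic form: omega(u, v) = u^T Omega v.
Omega = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-np.eye(n),       np.zeros((n, n))]])

def omega(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ Omega @ v)

rng = np.random.default_rng(0)
u, v = rng.normal(size=2 * n), rng.normal(size=2 * n)

# Bilinearity is built into the matrix form; check alternation and skew-symmetry.
assert abs(omega(u, u)) < 1e-12
assert abs(omega(u, v) + omega(v, u)) < 1e-12

# Non-degeneracy: Omega is invertible, so omega(u, .) = 0 forces u = 0.
assert abs(np.linalg.det(Omega)) > 0.5
print("standard symplectic form on R^4 verified")
```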
Gamma matrices

In mathematical physics, the gamma matrices, also called the Dirac matrices, are a set of conventional matrices with specific anticommutation relations that ensure they generate a matrix representation of the Clifford algebra Cl₁,₃(R). It is also possible to define higher-dimensional gamma matrices. When interpreted as the matrices of the action of a set of orthogonal basis vectors for contravariant vectors in Minkowski space, the column vectors on which the matrices act become a space of spinors, on which the Clifford algebra of spacetime acts.
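As an illustration, the following NumPy sketch builds one conventional choice of gamma matrices (the Dirac, or standard, representation, an assumption here rather than something fixed by the text) and checks the defining anticommutation relation {γ^μ, γ^ν} = 2η^{μν} I₄ for the signature (+, −, −, −):

```python
import numpy as np

I2 = np.eye(2)
# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def gamma_spatial(s):
    # spatial gamma matrix with off-diagonal Pauli blocks
    return np.block([[np.zeros((2, 2)), s], [-s, np.zeros((2, 2))]])

# Gamma matrices in the Dirac (standard) representation
gamma = [
    np.block([[I2, np.zeros((2, 2))], [np.zeros((2, 2)), -I2]]),  # gamma^0
    gamma_spatial(sx),                                            # gamma^1
    gamma_spatial(sy),                                            # gamma^2
    gamma_spatial(sz),                                            # gamma^3
]

# Minkowski metric with signature (+, -, -, -)
eta = np.diag([1.0, -1.0, -1.0, -1.0])

# Defining anticommutation relation: {gamma^mu, gamma^nu} = 2 eta^{mu nu} I_4
for mu in range(4):
    for nu in range(4):
        anticomm = gamma[mu] @ gamma[nu] + gamma[nu] @ gamma[mu]
        assert np.allclose(anticomm, 2 * eta[mu, nu] * np.eye(4))
print("Clifford algebra relations verified")
```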
Matrix (mathematics)

In mathematics, a matrix (plural matrices) is a rectangular array or table of numbers, symbols, or expressions, arranged in rows and columns, which is used to represent a mathematical object or a property of such an object. For example,

[ 1   9  -13 ]
[ 20  5   -6 ]

is a matrix with two rows and three columns. This is often referred to as a "two by three matrix", a "2 × 3 matrix", or a matrix of dimension 2 × 3. Without further specifications, matrices represent linear maps, and allow explicit computations in linear algebra.
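A minimal NumPy sketch of the example above, treating the 2 × 3 matrix as a linear map from R³ to R² (the particular test vector is only an illustration):

```python
import numpy as np

# The 2 x 3 example matrix: two rows, three columns.
A = np.array([[1,  9, -13],
              [20, 5,  -6]])

print(A.shape)   # (2, 3): dimension "two by three"

# Viewed as a linear map, a 2 x 3 matrix sends vectors in R^3 to vectors
# in R^2 via the matrix-vector product.
x = np.array([1, 0, 2])
print(A @ x)     # [-25, 8]
```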