Exterior algebra: In mathematics, the exterior algebra, or Grassmann algebra, named after Hermann Grassmann, is an algebra that uses the exterior product or wedge product as its multiplication. The exterior product or wedge product of vectors is an algebraic construction used in geometry to study areas, volumes, and their higher-dimensional analogues. The exterior product of two vectors u and v, denoted by u ∧ v, is called a bivector and lives in a space called the exterior square, a vector space that is distinct from the original space of vectors.
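A minimal numpy sketch of the wedge product of two vectors (illustrative only; the function name wedge and the representation of the bivector as the antisymmetric matrix of components u[i]v[j] − u[j]v[i] are choices made here, not taken from the source):

    import numpy as np

    def wedge(u, v):
        # Bivector components of u ∧ v as the antisymmetric matrix
        # B[i, j] = u[i]*v[j] - u[j]*v[i].
        u, v = np.asarray(u, float), np.asarray(v, float)
        return np.outer(u, v) - np.outer(v, u)

    u, v = np.array([1.0, 2.0, 0.0]), np.array([0.0, 1.0, 3.0])
    B = wedge(u, v)
    print(np.allclose(B, -B.T))   # True: the product is antisymmetric
    print(wedge(u, u))            # zero matrix: u ∧ u = 0

For vectors in R^3, the three independent components (B[0,1], B[0,2], B[1,2]) are the coordinates of u ∧ v in the exterior square, a space distinct from R^3 itself.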
Minimal polynomial (linear algebra): In linear algebra, the minimal polynomial μA of an n × n matrix A over a field F is the monic polynomial P over F of least degree such that P(A) = 0. Any other polynomial Q with Q(A) = 0 is a (polynomial) multiple of μA. The following three statements are equivalent: λ is a root of μA; λ is a root of the characteristic polynomial χA of A; λ is an eigenvalue of the matrix A. The multiplicity of a root λ of μA is the largest power m such that ker((A − λIn)^m) strictly contains ker((A − λIn)^(m−1)).
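A hedged numerical sketch of computing μA: its degree is the least d such that I, A, …, A^d are linearly dependent, so one can search for the first power of A that is a combination of lower powers. The helper below is an illustrative construction written for this note, not a standard library routine, and it relies on a floating-point tolerance:

    import numpy as np

    def minimal_polynomial(A, tol=1e-10):
        # Coefficients (highest degree first) of the monic minimal polynomial of A,
        # found as the least d with I, A, ..., A^d linearly dependent.
        A = np.asarray(A, float)
        n = A.shape[0]
        powers = [np.eye(n)]
        for d in range(1, n + 1):
            powers.append(powers[-1] @ A)
            # Try to write A^d as a combination of lower powers: M c = vec(A^d).
            M = np.column_stack([P.ravel() for P in powers[:-1]])
            c, *_ = np.linalg.lstsq(M, powers[-1].ravel(), rcond=None)
            if np.linalg.norm(M @ c - powers[-1].ravel()) < tol:
                return np.concatenate(([1.0], -c[::-1]))  # monic, highest first
        raise RuntimeError("unreachable: the degree is at most n by Cayley-Hamilton")

    A = np.diag([2.0, 2.0, 3.0])
    print(minimal_polynomial(A))   # [ 1. -5.  6.]: (x - 2)(x - 3)
    print(np.poly(A))              # [ 1. -7. 16. -12.]: (x - 2)^2 (x - 3)

The example shows the minimal polynomial dividing the characteristic polynomial: the repeated eigenvalue 2 appears only once in μA.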
Diagonal matrix: In linear algebra, a diagonal matrix is a matrix in which the entries outside the main diagonal are all zero; the term usually refers to square matrices. Elements of the main diagonal can either be zero or nonzero. An example of a 2×2 diagonal matrix is [3 0; 0 2], while an example of a 3×3 diagonal matrix is [6 0 0; 0 5 0; 0 0 4]. An identity matrix of any size, or any multiple of it (a scalar matrix), is a diagonal matrix. A diagonal matrix is sometimes called a scaling matrix, since matrix multiplication with it results in changing scale (size).
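A short numpy illustration of a diagonal matrix acting as a scaling matrix (a sketch; the particular entries are arbitrary choices):

    import numpy as np

    D = np.diag([3.0, 2.0])               # 2x2 diagonal matrix with diagonal (3, 2)
    print(D @ np.array([1.0, 1.0]))       # [3. 2.]: each coordinate is rescaled
    print(np.diag([6.0, 5.0, 4.0]))       # a 3x3 diagonal matrix
    print(np.allclose(np.diag([2.0, 2.0]), 2.0 * np.eye(2)))  # True: a scalar matrix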
Spectrum of a matrix: In mathematics, the spectrum of a matrix is the set of its eigenvalues. More generally, if T is a linear operator on any finite-dimensional vector space, its spectrum is the set of scalars λ such that T − λI is not invertible. The determinant of the matrix equals the product of its eigenvalues. Similarly, the trace of the matrix equals the sum of its eigenvalues. From this point of view, we can define the pseudo-determinant of a singular matrix to be the product of its nonzero eigenvalues (the density of a multivariate normal distribution needs this quantity).
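A quick numpy check of these identities (a sketch; the example matrix and the 1e-12 cutoff used to decide which eigenvalues count as nonzero are arbitrary choices):

    import numpy as np

    A = np.array([[2.0, 1.0], [0.0, 0.0]])   # singular: one eigenvalue is 0
    eig = np.linalg.eigvals(A)               # the spectrum of A
    print(np.isclose(np.linalg.det(A), np.prod(eig)))  # True: det = product of eigenvalues
    print(np.isclose(np.trace(A), np.sum(eig)))        # True: trace = sum of eigenvalues
    print(np.prod(eig[np.abs(eig) > 1e-12]))  # 2.0: pseudo-determinant, product of nonzero eigenvalues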
Arthur Cayley: Arthur Cayley (/ˈkeɪli/; 16 August 1821 – 26 January 1895) was a prolific British mathematician who worked mostly on algebra. He helped found the modern British school of pure mathematics. As a child, Cayley enjoyed solving complex maths problems for amusement. He entered Trinity College, Cambridge, where he excelled in Greek, French, German, and Italian, as well as mathematics. He worked as a lawyer for 14 years. He postulated what is now known as the Cayley–Hamilton theorem, that every square matrix is a root of its own characteristic polynomial, and verified it for matrices of order 2 and 3.
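The theorem is easy to check numerically; a minimal sketch (the 2×2 matrix is an arbitrary example) that substitutes a matrix into its own characteristic polynomial:

    import numpy as np

    A = np.array([[1.0, 2.0], [3.0, 4.0]])
    coeffs = np.poly(A)        # characteristic polynomial coefficients, highest degree first
    n = A.shape[0]
    # Evaluate the polynomial at the matrix: c0*A^n + c1*A^(n-1) + ... + cn*I
    P = sum(c * np.linalg.matrix_power(A, n - i) for i, c in enumerate(coeffs))
    print(np.allclose(P, 0))   # True: A is a root of its characteristic polynomial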
Invertible matrix: In linear algebra, an n-by-n square matrix A is called invertible (also nonsingular, nondegenerate or, rarely, regular) if there exists an n-by-n square matrix B such that AB = BA = In, where In denotes the n-by-n identity matrix and the multiplication used is ordinary matrix multiplication. If this is the case, then the matrix B is uniquely determined by A and is called the (multiplicative) inverse of A, denoted by A−1. Matrix inversion is the process of finding the matrix B that satisfies the prior equation for a given invertible matrix A.
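A minimal numpy sketch of inversion and the defining equation (the matrix is an illustrative example):

    import numpy as np

    A = np.array([[2.0, 1.0], [1.0, 1.0]])   # det = 1, so A is invertible
    B = np.linalg.inv(A)                     # the (multiplicative) inverse A−1
    print(np.allclose(A @ B, np.eye(2)))     # True: AB = In
    print(np.allclose(B @ A, np.eye(2)))     # True: BA = In

For a singular matrix no such B exists, and np.linalg.inv raises LinAlgError.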
Companion matrix: In linear algebra, the Frobenius companion matrix of the monic polynomial p(x) = c0 + c1x + ⋯ + cn−1x^(n−1) + x^n is the n × n square matrix C(p) whose first-subdiagonal entries are all 1, whose last column is (−c0, −c1, …, −cn−1), and whose remaining entries are 0. Some authors use the transpose of this matrix, which is more convenient for some purposes such as linear recurrence relations. C(p) is defined from the coefficients of p(x), while the characteristic polynomial as well as the minimal polynomial of C(p) are equal to p(x). In this sense, the matrix C(p) and the polynomial p(x) are "companions". Any matrix A with entries in a field F has characteristic polynomial p(x), which in turn has companion matrix C(p).
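A hedged numpy sketch that builds C(p) from the coefficients and confirms its characteristic polynomial is p (the helper companion below is written here for illustration; SciPy ships a similar routine, scipy.linalg.companion):

    import numpy as np

    def companion(c):
        # Companion matrix of p(x) = x^n + c[n-1] x^(n-1) + ... + c[1] x + c[0],
        # with c = [c0, c1, ..., cn-1] listed lowest degree first.
        n = len(c)
        C = np.zeros((n, n))
        C[1:, :-1] = np.eye(n - 1)      # ones on the first subdiagonal
        C[:, -1] = -np.asarray(c)       # last column holds -c0, ..., -cn-1
        return C

    # p(x) = x^3 - 6x^2 + 11x - 6 = (x - 1)(x - 2)(x - 3)
    C = companion([-6.0, 11.0, -6.0])
    print(np.poly(C))                     # [ 1. -6. 11. -6.]: characteristic polynomial is p
    print(np.sort(np.linalg.eigvals(C)))  # [1. 2. 3.]: the roots of p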
Nilpotent matrix: In linear algebra, a nilpotent matrix is a square matrix N such that N^k = 0 for some positive integer k. The smallest such k is called the index of N, sometimes the degree of N. More generally, a nilpotent transformation is a linear transformation L of a vector space such that L^k = 0 for some positive integer k (and thus, L^j = 0 for all j ≥ k). Both of these concepts are special cases of a more general concept of nilpotence that applies to elements of rings. The matrix N = [0 1; 0 0] is nilpotent with index 2, since N^2 = 0.
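A minimal numpy check of this example (index 2 means N itself is nonzero while N^2 = 0):

    import numpy as np

    N = np.array([[0.0, 1.0], [0.0, 0.0]])
    print(N @ N)                  # the zero matrix, so N^2 = 0 and the index is 2
    print(np.linalg.eigvals(N))   # [0. 0.]: a nilpotent matrix has only 0 as an eigenvalue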