Rotation matrix
In linear algebra, a rotation matrix is a transformation matrix that is used to perform a rotation in Euclidean space. For example, using the convention below, the matrix

    R = [ cos θ   −sin θ ]
        [ sin θ    cos θ ]

rotates points in the xy-plane counterclockwise through an angle θ about the origin of a two-dimensional Cartesian coordinate system. To perform the rotation on a plane point with standard coordinates v = (x, y), it is written as a column vector and multiplied by the matrix R:

    R v = ( x cos θ − y sin θ,  x sin θ + y cos θ ).

If x and y are the coordinates of the endpoint of a unit vector, so that x = cos φ and y = sin φ, then the above equations become the trigonometric angle-sum formulae:

    R v = ( cos θ cos φ − sin θ sin φ,  sin θ cos φ + cos θ sin φ ) = ( cos(θ + φ),  sin(θ + φ) ).
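As a quick numerical illustration (a minimal NumPy sketch; the helper name rotation_matrix is ours, not a library routine), the following builds R for a given θ and rotates the point (1, 0) counterclockwise by 90°:

```python
import numpy as np

def rotation_matrix(theta):
    """2x2 counterclockwise rotation matrix for an angle theta in radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

R = rotation_matrix(np.pi / 2)      # 90 degrees
v = np.array([1.0, 0.0])
print(R @ v)                        # ~[0, 1]: (1, 0) rotated onto the y-axis
```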
Sesquilinear form
In mathematics, a sesquilinear form is a generalization of a bilinear form that, in turn, is a generalization of the concept of the dot product of Euclidean space. A bilinear form is linear in each of its arguments, but a sesquilinear form allows one of the arguments to be "twisted" in a semilinear manner; hence the name, which originates from the Latin numerical prefix sesqui-, meaning "one and a half".
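The standard inner product on Cⁿ is the prototypical example. The sketch below (NumPy; it follows the convention in which the first argument is the twisted one, though conventions differ) checks both behaviors numerically:

```python
import numpy as np

def form(x, y):
    """Standard sesquilinear form on C^n: conjugate-linear in x, linear in y."""
    return np.vdot(x, y)            # np.vdot conjugates its first argument

x = np.array([1 + 2j, 3j])
y = np.array([2 - 1j, 1 + 1j])
a = 2 + 3j

print(np.isclose(form(x, a * y), a * form(x, y)))            # True: linear in y
print(np.isclose(form(a * x, y), np.conj(a) * form(x, y)))   # True: conjugate-linear in x
```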
Coordinate vector
In linear algebra, a coordinate vector is a representation of a vector as an ordered list of numbers (a tuple) that describes the vector in terms of a particular ordered basis. A simple example is a position such as (5, 2, 1) in a 3-dimensional Cartesian coordinate system, with the basis being the axes of this system. Coordinates are always specified relative to an ordered basis. Bases and their associated coordinate representations let one realize vector spaces and linear transformations concretely as column vectors, row vectors, and matrices; hence, they are useful in calculations.
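Concretely, the coordinates c of a vector v relative to a basis stored as the columns of a matrix B are the solution of B c = v. A minimal NumPy sketch (the basis here is chosen by us purely for illustration):

```python
import numpy as np

# An ordered basis (b1, b2, b3) of R^3, stored as the columns of B.
b1 = np.array([1.0, 0.0, 0.0])
b2 = np.array([1.0, 1.0, 0.0])
b3 = np.array([1.0, 1.0, 1.0])
B = np.column_stack([b1, b2, b3])

v = np.array([5.0, 2.0, 1.0])
c = np.linalg.solve(B, v)           # coordinates of v relative to (b1, b2, b3)

print(c)                                     # [3. 1. 1.]
print(np.allclose(3*b1 + 1*b2 + 1*b3, v))    # True: v = 3 b1 + 1 b2 + 1 b3
```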
Minimal polynomial (linear algebra)
In linear algebra, the minimal polynomial μ_A of an n × n matrix A over a field F is the monic polynomial P over F of least degree such that P(A) = 0. Any other polynomial Q with Q(A) = 0 is a (polynomial) multiple of μ_A. The following three statements are equivalent: λ is a root of μ_A; λ is a root of the characteristic polynomial χ_A of A; λ is an eigenvalue of the matrix A. The multiplicity of a root λ of μ_A is the largest power m such that ker((A − λI_n)^m) strictly contains ker((A − λI_n)^(m−1)).
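The distinction between the minimal and characteristic polynomials can be seen numerically. In the sketch below (plain NumPy), a scalar matrix and a Jordan block share the characteristic polynomial (x − 2)^2, but only the Jordan block needs the square in its minimal polynomial:

```python
import numpy as np

I = np.eye(2)
A = 2 * I                           # scalar matrix: minimal polynomial (x - 2)
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])          # Jordan block: minimal polynomial (x - 2)^2

print(np.allclose(A - 2*I, 0))                  # True:  (A - 2I)   = 0
print(np.allclose(J - 2*I, 0))                  # False: (J - 2I)  != 0
print(np.allclose((J - 2*I) @ (J - 2*I), 0))    # True:  (J - 2I)^2 = 0
```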
Pivot element
The pivot or pivot element is the element of a matrix, or an array, which is selected first by an algorithm (e.g., Gaussian elimination or the simplex algorithm) to do certain calculations. In the case of matrix algorithms, a pivot entry is usually required to be at least distinct from zero, and often distant from it; in this case, finding this element is called pivoting. Pivoting may be followed by an interchange of rows or columns to bring the pivot to a fixed position and allow the algorithm to proceed successfully, and possibly to reduce round-off error.
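As an illustration, here is a minimal Gaussian-elimination sketch with partial pivoting (the function name is ours, not a library routine). The tiny leading entry 1e-12 would make a poor pivot, so the rows are interchanged first:

```python
import numpy as np

def solve_with_partial_pivoting(A, b):
    """Solve Ax = b by Gaussian elimination, choosing as pivot the entry of
    largest absolute value in the current column (partial pivoting)."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n):
        p = k + np.argmax(np.abs(A[k:, k]))    # pivot row: largest |entry|
        if p != k:                             # row interchange
            A[[k, p]] = A[[p, k]]
            b[[k, p]] = b[[p, k]]
        for i in range(k + 1, n):              # eliminate below the pivot
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    x = np.zeros(n)                            # back substitution
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
    return x

A = np.array([[1e-12, 1.0],
              [1.0,   1.0]])
b = np.array([1.0, 2.0])
print(solve_with_partial_pivoting(A, b))       # ~[1, 1]
```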
Algebra over a field
In mathematics, an algebra over a field (often simply called an algebra) is a vector space equipped with a bilinear product. Thus, an algebra is an algebraic structure consisting of a set together with operations of multiplication, addition, and scalar multiplication by elements of a field, satisfying the axioms implied by "vector space" and "bilinear". The multiplication operation in an algebra may or may not be associative, leading to the notions of associative algebras and non-associative algebras.
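For example, the n × n real matrices form an associative algebra under matrix multiplication, while R^3 with the cross product is a non-associative algebra. A quick NumPy check of bilinearity and of both associativity behaviors:

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((2, 2)) for _ in range(3))
a, b = 2.0, -3.0

# The matrix product is bilinear: linear in each factor separately ...
print(np.allclose((a*A + b*B) @ C, a*(A @ C) + b*(B @ C)))   # True
print(np.allclose(A @ (a*B + b*C), a*(A @ B) + b*(A @ C)))   # True
# ... and, in this particular algebra, also associative.
print(np.allclose((A @ B) @ C, A @ (B @ C)))                 # True

# The cross product on R^3 is bilinear but NOT associative:
x, y, _ = np.eye(3)
print(np.allclose(np.cross(np.cross(x, y), y),
                  np.cross(x, np.cross(y, y))))              # False
```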
Orthogonal matrix
In linear algebra, an orthogonal matrix, or orthonormal matrix, is a real square matrix whose columns and rows are orthonormal vectors. One way to express this is

    Q^T Q = Q Q^T = I,

where Q^T is the transpose of Q and I is the identity matrix. This leads to the equivalent characterization: a matrix Q is orthogonal if its transpose is equal to its inverse,

    Q^T = Q^−1,

where Q^−1 is the inverse of Q. An orthogonal matrix Q is necessarily invertible (with inverse Q^−1 = Q^T), unitary (Q^−1 = Q^∗), where Q^∗ is the Hermitian adjoint (conjugate transpose) of Q, and therefore normal (Q^∗Q = QQ^∗) over the real numbers.
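A short NumPy check of these equivalent characterizations, using the Q factor of a QR decomposition (which is orthogonal by construction):

```python
import numpy as np

M = np.random.default_rng(1).standard_normal((3, 3))
Q, _ = np.linalg.qr(M)              # Q has orthonormal columns

print(np.allclose(Q.T @ Q, np.eye(3)))       # columns orthonormal
print(np.allclose(Q @ Q.T, np.eye(3)))       # rows orthonormal
print(np.allclose(np.linalg.inv(Q), Q.T))    # inverse equals transpose
```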
Hermitian adjoint
In mathematics, specifically in operator theory, each linear operator A on an inner product space defines a Hermitian adjoint (or adjoint) operator A^∗ on that space according to the rule

    ⟨Ax, y⟩ = ⟨x, A^∗y⟩,

where ⟨·, ·⟩ is the inner product on the vector space. The adjoint may also be called the Hermitian conjugate or simply the Hermitian, after Charles Hermite. It is often denoted by A† in fields like physics, especially when used in conjunction with bra–ket notation in quantum mechanics.
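For finite-dimensional matrices the adjoint is the conjugate transpose, and the defining rule can be verified directly. A minimal NumPy sketch (np.vdot conjugates its first argument, matching an inner product that is conjugate-linear in the first slot):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
y = rng.standard_normal(3) + 1j * rng.standard_normal(3)

A_adj = A.conj().T                  # conjugate transpose = adjoint of A

# Defining property <Ax, y> = <x, A* y>:
print(np.isclose(np.vdot(A @ x, y), np.vdot(x, A_adj @ y)))  # True
```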
Distributive property
In mathematics, the distributive property of binary operations is a generalization of the distributive law, which asserts that the equality

    x · (y + z) = (x · y) + (x · z)

is always true in elementary algebra. For example, in elementary arithmetic, one has

    2 · (1 + 3) = (2 · 1) + (2 · 3).

Therefore, one would say that multiplication distributes over addition. This basic property of numbers is part of the definition of most algebraic structures that have two operations called addition and multiplication, such as complex numbers, polynomials, matrices, rings, and fields.
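A quick check of the law in arithmetic and, less trivially, for matrices, where the left and right distributive laws are stated separately because matrix multiplication is not commutative:

```python
import numpy as np

print(2 * (1 + 3) == (2 * 1) + (2 * 3))          # True

rng = np.random.default_rng(3)
A, B, C = (rng.standard_normal((2, 2)) for _ in range(3))

print(np.allclose(A @ (B + C), A @ B + A @ C))   # left distributivity
print(np.allclose((B + C) @ A, B @ A + C @ A))   # right distributivity
```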
Spectral theorem
In mathematics, particularly linear algebra and functional analysis, a spectral theorem is a result about when a linear operator or matrix can be diagonalized (that is, represented as a diagonal matrix in some basis). This is extremely useful because computations involving a diagonalizable matrix can often be reduced to much simpler computations involving the corresponding diagonal matrix. The concept of diagonalization is relatively straightforward for operators on finite-dimensional vector spaces but requires some modification for operators on infinite-dimensional spaces.
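For a real symmetric matrix the theorem is available directly through an eigendecomposition. A minimal NumPy sketch: np.linalg.eigh returns eigenvalues w and an orthonormal eigenbasis Q with S = Q diag(w) Q^T, after which, for example, matrix powers reduce to powers of the eigenvalues:

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # real symmetric

w, Q = np.linalg.eigh(S)            # eigenvalues and orthonormal eigenvectors
print(np.allclose(Q @ np.diag(w) @ Q.T, S))   # S = Q Λ Q^T
print(np.allclose(Q.T @ Q, np.eye(2)))        # Q is orthogonal

# Diagonal form makes computations simple, e.g. the fifth power of S:
print(np.allclose(np.linalg.matrix_power(S, 5),
                  Q @ np.diag(w**5) @ Q.T))   # True
```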