Strassen algorithm
In linear algebra, the Strassen algorithm, named after Volker Strassen, is an algorithm for matrix multiplication. It is faster than the standard matrix multiplication algorithm for large matrices, with a better asymptotic complexity, although the naive algorithm is often better for smaller matrices. The Strassen algorithm is slower than the fastest known algorithms for extremely large matrices, but such galactic algorithms are not useful in practice, as they are much slower for matrices of practical size.
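As an illustration of the idea, here is a minimal Python sketch of Strassen's recursion, assuming square matrices whose size is a power of two and a hypothetical cutoff parameter below which the naive product is used; the seven recursive block products M1..M7 replace the eight block products of the standard algorithm.

```python
# Minimal sketch of Strassen multiplication for 2^k x 2^k matrices.
# The cutoff is an illustrative tuning parameter, not a recommended value.
import numpy as np

def strassen(A, B, cutoff=64):
    n = A.shape[0]
    if n <= cutoff:
        return A @ B  # naive multiplication wins for small blocks
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    # Strassen's seven block products (instead of the naive eight)
    M1 = strassen(A11 + A22, B11 + B22, cutoff)
    M2 = strassen(A21 + A22, B11, cutoff)
    M3 = strassen(A11, B12 - B22, cutoff)
    M4 = strassen(A22, B21 - B11, cutoff)
    M5 = strassen(A11 + A12, B22, cutoff)
    M6 = strassen(A21 - A11, B11 + B12, cutoff)
    M7 = strassen(A12 - A22, B21 + B22, cutoff)
    # Reassemble the four blocks of the product
    C = np.empty_like(A)
    C[:h, :h] = M1 + M4 - M5 + M7
    C[:h, h:] = M3 + M5
    C[h:, :h] = M2 + M4
    C[h:, h:] = M1 - M2 + M3 + M6
    return C

if __name__ == "__main__":
    A = np.random.rand(128, 128)
    B = np.random.rand(128, 128)
    assert np.allclose(strassen(A, B), A @ B)
```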
Pure mathematics
Pure mathematics is the study of mathematical concepts independently of any application outside mathematics. These concepts may originate in real-world concerns, and the results obtained may later turn out to be useful for practical applications, but pure mathematicians are not primarily motivated by such applications. Instead, the appeal is attributed to the intellectual challenge and aesthetic beauty of working out the logical consequences of basic principles.
Coefficient
In mathematics, a coefficient is a multiplicative factor involved in some term of a polynomial, a series, or an expression. It may be a number (dimensionless), in which case it is known as a numerical factor. It may also be a constant with units of measurement, in which case it is known as a constant multiplier. In general, coefficients may be any expression (including variables such as a, b and c). When the combination of variables and constants is not necessarily involved in a product, it may be called a parameter.
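As a small worked example (illustrative only), the coefficients of an explicit quadratic and of the general quadratic form:

```latex
% In the first polynomial the coefficients are 7, -3 and 1.5 (the constant
% coefficient); in the second, the letters a, b, c act as parameters.
\[
  7x^{2} - 3x + 1.5,
  \qquad
  a x^{2} + b x + c .
\]
```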
Indefinite orthogonal group
In mathematics, the indefinite orthogonal group, O(p, q), is the Lie group of all linear transformations of an n-dimensional real vector space that leave invariant a nondegenerate, symmetric bilinear form of signature (p, q), where n = p + q. It is also called the pseudo-orthogonal group or generalized orthogonal group. The dimension of the group is n(n − 1)/2. The indefinite special orthogonal group, SO(p, q), is the subgroup of O(p, q) consisting of all elements with determinant 1.
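A brief sketch of the defining condition, assuming the standard diagonal matrix for the signature-(p, q) form (one common choice among the equivalent ones):

```latex
% Defining condition of O(p, q) with the standard diagonal form eta.
\[
  \eta = \operatorname{diag}(\underbrace{1,\dots,1}_{p},\ \underbrace{-1,\dots,-1}_{q}),
  \qquad
  \mathrm{O}(p,q) = \{\, M \in \mathrm{GL}(n,\mathbb{R}) : M^{\mathsf{T}} \eta M = \eta \,\},
  \quad n = p + q .
\]
```

For example, the Lorentz group O(3, 1) has dimension 4(4 − 1)/2 = 6, corresponding to three rotations and three boosts.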
QR decomposition
In linear algebra, a QR decomposition, also known as a QR factorization or QU factorization, is a decomposition of a matrix A into a product A = QR of an orthogonal matrix Q and an upper triangular matrix R. QR decomposition is often used to solve the linear least squares problem and is the basis for a particular eigenvalue algorithm, the QR algorithm. Any real square matrix A may be decomposed as A = QR, where Q is an orthogonal matrix (its columns are orthogonal unit vectors, meaning Q^T Q = Q Q^T = I) and R is an upper triangular matrix (also called right triangular matrix).
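For illustration, here is a minimal classical Gram–Schmidt sketch in Python, one of several ways to compute a QR factorization; library routines such as numpy.linalg.qr use Householder reflections, which are numerically more stable.

```python
# Classical Gram-Schmidt QR factorization of a matrix with full column rank.
import numpy as np

def gram_schmidt_qr(A):
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # projection coefficient
            v -= R[i, j] * Q[:, i]        # remove the component along q_i
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]             # normalize to a unit column
    return Q, R

A = np.random.rand(5, 3)
Q, R = gram_schmidt_qr(A)
assert np.allclose(Q @ R, A)               # A = QR
assert np.allclose(Q.T @ Q, np.eye(3))     # orthonormal columns
```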
Jordan normal form
In linear algebra, a Jordan normal form, also known as a Jordan canonical form (JCF), is an upper triangular matrix of a particular form called a Jordan matrix representing a linear operator on a finite-dimensional vector space with respect to some basis. Such a matrix has each non-zero off-diagonal entry equal to 1, immediately above the main diagonal (on the superdiagonal), and with identical diagonal entries to the left and below them. Let V be a finite-dimensional vector space over a field K; a linear operator on V has a Jordan normal form over K exactly when its characteristic polynomial factors into linear factors over K, which always happens when K is algebraically closed.
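As a concrete illustration, here is a Jordan matrix built from the blocks J2(5), J1(5) and J2(0); the example is arbitrary and not tied to any particular operator.

```latex
% A 5 x 5 Jordan matrix: the only non-zero off-diagonal entries are the
% 1s on the superdiagonal inside each Jordan block.
\[
  J =
  \begin{pmatrix}
    5 & 1 & 0 & 0 & 0 \\
    0 & 5 & 0 & 0 & 0 \\
    0 & 0 & 5 & 0 & 0 \\
    0 & 0 & 0 & 0 & 1 \\
    0 & 0 & 0 & 0 & 0
  \end{pmatrix}.
\]
```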
Additive inverse
In mathematics, the additive inverse of a number a (sometimes called the opposite of a) is the number that, when added to a, yields zero. The operation taking a number to its additive inverse is known as sign change or negation. For a real number, it reverses its sign: the additive inverse (opposite number) of a positive number is negative, and the additive inverse of a negative number is positive. Zero is the additive inverse of itself. The additive inverse of a is denoted by unary minus: −a.
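A short worked illustration of the defining relation and a few concrete sign changes:

```latex
\[
  a + (-a) = 0, \qquad -(7) = -7, \qquad -(-0.3) = 0.3, \qquad -0 = 0 .
\]
```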
Second derivative
In calculus, the second derivative, or the second-order derivative, of a function f is the derivative of the derivative of f. Informally, the second derivative can be phrased as "the rate of change of the rate of change"; for example, the second derivative of the position of an object with respect to time is the instantaneous acceleration of the object, or the rate at which the velocity of the object is changing with respect to time. In Leibniz notation: a = dv/dt = d²x/dt², where a is acceleration, v is velocity, t is time, x is position, and d is the instantaneous "delta" or change.
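For a numerical illustration, here is a minimal central-difference sketch in Python; the test function and step size are illustrative choices, not recommendations.

```python
# Central-difference estimate of the second derivative:
# f''(x) ~ (f(x + h) - 2 f(x) + f(x - h)) / h^2
def second_derivative(f, x, h=1e-4):
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / (h * h)

# Example: for position x(t) = 5 t^2, the acceleration (second derivative) is 10.
position = lambda t: 5.0 * t**2
print(second_derivative(position, 3.0))  # approximately 10.0
```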
Identity element
In mathematics, an identity element or neutral element of a binary operation is an element that leaves unchanged every element when the operation is applied. For example, 0 is an identity element of the addition of real numbers. This concept is used in algebraic structures such as groups and rings. The term identity element is often shortened to identity (as in the case of additive identity and multiplicative identity) when there is no possibility of confusion, but the identity implicitly depends on the binary operation it is associated with.
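A small illustrative check in Python; the helper is_identity is hypothetical, not a standard library function.

```python
# Check that a candidate e is a two-sided identity for a binary operation
# on a few sample elements (a spot check, not a proof).
def is_identity(op, e, samples):
    return all(op(e, x) == x and op(x, e) == x for x in samples)

print(is_identity(lambda a, b: a + b, 0, [1, -3, 2.5]))    # True: 0 for addition
print(is_identity(lambda a, b: a * b, 1, [1, -3, 2.5]))    # True: 1 for multiplication
print(is_identity(lambda a, b: a + b, "", ["ab", "c"]))    # True: "" for concatenation
```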
Change of basis
In mathematics, an ordered basis of a vector space of finite dimension n allows representing uniquely any element of the vector space by a coordinate vector, which is a sequence of n scalars called coordinates. If two different bases are considered, the coordinate vector that represents a vector v on one basis is, in general, different from the coordinate vector that represents v on the other basis. A change of basis consists of converting every assertion expressed in terms of coordinates relative to one basis into an assertion expressed in terms of coordinates relative to the other basis.
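A minimal numerical sketch in R², assuming the columns of a (hypothetical) matrix P hold the new basis vectors written in the old basis, so that old coordinates equal P times new coordinates:

```python
# Change of basis in R^2: columns of P are the new basis vectors expressed
# in the old basis; old coordinates = P @ new coordinates, and conversely
# new coordinates are obtained by solving P @ v_new = v_old.
import numpy as np

P = np.array([[1.0, 1.0],
              [0.0, 2.0]])              # columns: b1 = (1, 0), b2 = (1, 2)
v_old = np.array([3.0, 4.0])            # coordinates of v in the old basis
v_new = np.linalg.solve(P, v_old)       # coordinates of the same v in the new basis
assert np.allclose(P @ v_new, v_old)    # converting back recovers the old coordinates
print(v_new)                            # [1. 2.]
```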