Related concepts (31)
Block matrix
In mathematics, a block matrix or a partitioned matrix is a matrix that is interpreted as having been broken into sections called blocks or submatrices. Intuitively, a matrix interpreted as a block matrix can be visualized as the original matrix with a collection of horizontal and vertical lines, which break it up, or partition it, into a collection of smaller matrices. Any matrix may be interpreted as a block matrix in one or more ways, with each interpretation defined by how its rows and columns are partitioned.
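For concreteness, here is a minimal sketch of partitioning and reassembling a block matrix, assuming numpy (the page itself gives no code):

```python
import numpy as np

# A 4x4 matrix partitioned into four 2x2 blocks.
A = np.arange(16).reshape(4, 4)
A11, A12 = A[:2, :2], A[:2, 2:]
A21, A22 = A[2:, :2], A[2:, 2:]

# Reassembling the blocks recovers the original matrix.
B = np.block([[A11, A12], [A21, A22]])
assert np.array_equal(A, B)
```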
Minor (linear algebra)
In linear algebra, a minor of a matrix A is the determinant of some smaller square matrix, cut down from A by removing one or more of its rows and columns. Minors obtained by removing just one row and one column from square matrices (first minors) are required for calculating matrix cofactors, which in turn are useful for computing both the determinant and inverse of square matrices. The requirement that the square matrix be smaller than the original matrix is often omitted in the definition.
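As a hedged illustration of first minors and cofactors, the sketch below (assuming numpy; the helper name first_minor is hypothetical) expands a determinant along the first row:

```python
import numpy as np

def first_minor(A, i, j):
    # Determinant of A with row i and column j removed.
    sub = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return np.linalg.det(sub)

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 4.0, 5.0],
              [1.0, 0.0, 6.0]])

# Laplace expansion along row 0: det(A) = sum_j a_0j * (-1)^j * M_0j,
# where (-1)^j * M_0j is the cofactor of entry (0, j).
det = sum(A[0, j] * (-1) ** j * first_minor(A, 0, j) for j in range(3))
assert np.isclose(det, np.linalg.det(A))
```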
Coordinate vector
In linear algebra, a coordinate vector is a representation of a vector as an ordered list of numbers (a tuple) that describes the vector in terms of a particular ordered basis. A simple example is a position such as (5, 2, 1) in a 3-dimensional Cartesian coordinate system, with the coordinate axes of that system as the basis. Coordinates are always specified relative to an ordered basis. Bases and their associated coordinate representations let one realize vector spaces and linear transformations concretely as column vectors, row vectors, and matrices; hence, they are useful in calculations.
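A minimal sketch of computing coordinates relative to a non-standard basis, assuming numpy (the basis vectors chosen here are illustrative):

```python
import numpy as np

# Basis vectors of R^3 as the columns of B.
B = np.column_stack([[1.0, 0.0, 0.0],
                     [1.0, 1.0, 0.0],
                     [0.0, 1.0, 1.0]])
v = np.array([5.0, 2.0, 1.0])

# The coordinate vector c satisfies B @ c = v,
# i.e. v = c[0]*b1 + c[1]*b2 + c[2]*b3.
c = np.linalg.solve(B, v)
assert np.allclose(B @ c, v)
```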
Orthonormality
In linear algebra, two vectors in an inner product space are orthonormal if they are orthogonal (i.e., perpendicular) unit vectors. A set of vectors forms an orthonormal set if all vectors in the set are mutually orthogonal and all of unit length. An orthonormal set that forms a basis is called an orthonormal basis. The notion of orthogonality is motivated by the desire to extend the intuitive idea of perpendicular vectors to higher-dimensional spaces.
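A quick sketch checking orthonormality, assuming numpy: the columns of the Q factor of a QR decomposition form an orthonormal set, so their pairwise inner products give the identity matrix.

```python
import numpy as np

# Columns of Q from a QR factorization are orthonormal.
A = np.random.default_rng(0).standard_normal((4, 3))
Q, _ = np.linalg.qr(A)

# Orthonormal set: Q^T Q has 1 on the diagonal (unit length)
# and 0 elsewhere (mutual orthogonality).
assert np.allclose(Q.T @ Q, np.eye(3))
```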
Definite quadratic form
In mathematics, a definite quadratic form is a quadratic form over some real vector space V that has the same sign (always positive or always negative) for every non-zero vector of V. According to that sign, the quadratic form is called positive-definite or negative-definite. A semidefinite (or semi-definite) quadratic form is defined in much the same way, except that "always positive" and "always negative" are replaced by "never negative" and "never positive", respectively.
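For a symmetric matrix M, the form q(x) = x^T M x is positive-definite exactly when all eigenvalues of M are positive. A minimal sketch under that standard criterion, assuming numpy:

```python
import numpy as np

# q(x) = x^T M x with M symmetric; definiteness is read off the eigenvalues.
M = np.array([[2.0, -1.0],
              [-1.0, 2.0]])
eigs = np.linalg.eigvalsh(M)
assert np.all(eigs > 0)  # all eigenvalues positive -> positive-definite

# Spot check: q is positive on a few random non-zero vectors.
rng = np.random.default_rng(1)
for _ in range(5):
    x = rng.standard_normal(2)
    assert x @ M @ x > 0
```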
Degenerate bilinear form
In mathematics, specifically linear algebra, a degenerate bilinear form f(x, y) on a vector space V is a bilinear form such that the map from V to V∗ (the dual space of V) given by v ↦ (x ↦ f(x, v)) is not an isomorphism. An equivalent definition when V is finite-dimensional is that the form has a non-trivial kernel: there exists some non-zero x in V such that f(x, y) = 0 for all y in V. A nondegenerate or nonsingular form is a bilinear form that is not degenerate, meaning that v ↦ (x ↦ f(x, v)) is an isomorphism, or equivalently in finite dimensions, that f(x, y) = 0 for all y in V implies x = 0.
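In finite dimensions, a bilinear form can be written f(x, y) = x^T G y for a Gram matrix G, and it is degenerate exactly when G is singular. A minimal sketch, assuming numpy and an illustrative rank-1 matrix:

```python
import numpy as np

# f(x, y) = x^T G y; the form is degenerate exactly when det(G) = 0.
G = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # rank 1, hence singular and degenerate
assert np.isclose(np.linalg.det(G), 0.0)

# A non-zero kernel vector v: f(x, v) = 0 for all x, i.e. G v = 0.
v = np.array([2.0, -1.0])
assert np.allclose(G @ v, 0.0)
```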
Adjugate matrix
In linear algebra, the adjugate or classical adjoint of a square matrix A is the transpose of its cofactor matrix and is denoted by adj(A). It is also occasionally known as adjunct matrix, or "adjoint", though the latter term today normally refers to a different concept, the adjoint operator which for a matrix is the conjugate transpose. The product of a matrix with its adjugate gives a diagonal matrix (entries not on the main diagonal are zero) whose diagonal entries are the determinant of the original matrix: A adj(A) = adj(A) A = det(A) I, where I is the identity matrix of the same size as A.
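A minimal sketch computing the adjugate from cofactors and verifying the identity above, assuming numpy (the helper name adjugate is hypothetical):

```python
import numpy as np

def adjugate(A):
    n = A.shape[0]
    C = np.empty_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            sub = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(sub)  # cofactor
    return C.T  # adjugate = transpose of the cofactor matrix

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
# A adj(A) = det(A) I
assert np.allclose(A @ adjugate(A), np.linalg.det(A) * np.eye(2))
```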
Dual basis
In linear algebra, given a vector space V with a basis B of vectors indexed by an index set I (the cardinality of I is the dimension of V), the dual set of B is a set B∗ of vectors in the dual space V∗ with the same index set I such that B and B∗ form a biorthogonal system. The dual set is always linearly independent but does not necessarily span V∗. If it does span V∗, then B∗ is called the dual basis or reciprocal basis for the basis B. Denoting the indexed vector sets as B = {v_i : i ∈ I} and B∗ = {v^i : i ∈ I}, being biorthogonal means that the elements pair to 1 if the indexes are equal, and to 0 otherwise: v^i(v_j) = 1 if i = j, and 0 if i ≠ j.
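In finite dimensions, if the basis vectors are the columns of a matrix B, the dual-basis functionals are the rows of B⁻¹, since B⁻¹B = I is exactly the biorthogonality condition. A minimal sketch, assuming numpy:

```python
import numpy as np

# Basis vectors of R^2 as the columns of B.
B = np.column_stack([[1.0, 0.0], [1.0, 1.0]])

# Dual basis functionals as the rows of the inverse.
dual = np.linalg.inv(B)

# Biorthogonality: row i of `dual` paired with column j of B
# gives 1 when i == j and 0 otherwise.
assert np.allclose(dual @ B, np.eye(2))
```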
Transpose of a linear map
In linear algebra, the transpose of a linear map between two vector spaces, defined over the same field, is an induced map between the dual spaces of the two vector spaces. The transpose or algebraic adjoint of a linear map is often used to study the original linear map. This concept is generalised by adjoint functors.
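In coordinates, the transpose map sends a functional u to u ∘ f, and its matrix is the transpose of the matrix of f. A minimal sketch of that correspondence, assuming numpy and representing the functional u as a row vector:

```python
import numpy as np

# f: R^3 -> R^2 with matrix A; a functional u on R^2 as a row vector.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])
u = np.array([4.0, -1.0])

# The transpose map sends u to u o f, whose row vector is u @ A
# (equivalently, the coordinate vector A.T @ u in the dual of R^3).
x = np.array([1.0, 2.0, 3.0])
assert np.isclose((u @ A) @ x, u @ (A @ x))
```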
Antihomomorphism
In mathematics, an antihomomorphism is a type of function defined on sets with multiplication that reverses the order of multiplication. An antiautomorphism is a bijective antihomomorphism, i.e. an antiisomorphism, from a set to itself. From bijectivity it follows that antiautomorphisms have inverses, and that the inverse of an antiautomorphism is also an antiautomorphism.
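A standard example is matrix transposition, which reverses products: (AB)^T = B^T A^T. A minimal sketch, assuming numpy:

```python
import numpy as np

# Transposition is an antiautomorphism of the ring of square matrices:
# it is bijective and reverses the order of multiplication.
rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
assert np.allclose((A @ B).T, B.T @ A.T)
```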