Summary
In the theory of vector spaces, a set of vectors is said to be linearly independent if there exists no nontrivial linear combination of the vectors that equals the zero vector. If such a linear combination exists, then the vectors are said to be linearly dependent. These concepts are central to the definition of dimension. A vector space can be of finite dimension or infinite dimension depending on the maximum number of linearly independent vectors. The definition of linear dependence and the ability to determine whether a subset of vectors in a vector space is linearly dependent are central to determining the dimension of a vector space.

A sequence of vectors $v_1, v_2, \dots, v_k$ from a vector space $V$ is said to be linearly dependent if there exist scalars $a_1, a_2, \dots, a_k$, not all zero, such that

$$a_1 v_1 + a_2 v_2 + \cdots + a_k v_k = \mathbf{0},$$

where $\mathbf{0}$ denotes the zero vector. This implies that at least one of the scalars is nonzero, say $a_1 \neq 0$, and the above equation can be written as

$$v_1 = -\frac{a_2}{a_1} v_2 - \cdots - \frac{a_k}{a_1} v_k$$

if $k > 1$, and $v_1 = \mathbf{0}$ if $k = 1$. Thus, a set of vectors is linearly dependent if and only if one of them is zero or a linear combination of the others.

A sequence of vectors $v_1, v_2, \dots, v_n$ is said to be linearly independent if it is not linearly dependent, that is, if the equation

$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = \mathbf{0}$$

can only be satisfied by $a_i = 0$ for $i = 1, \dots, n$. This implies that no vector in the sequence can be represented as a linear combination of the remaining vectors in the sequence. In other words, a sequence of vectors is linearly independent if the only representation of $\mathbf{0}$ as a linear combination of its vectors is the trivial representation in which all the scalars are zero. Even more concisely, a sequence of vectors is linearly independent if and only if $\mathbf{0}$ can be represented as a linear combination of its vectors in a unique way.

If a sequence of vectors contains the same vector twice, it is necessarily dependent. The linear dependency of a sequence of vectors does not depend on the order of the terms in the sequence. This allows defining linear independence for a finite set of vectors: a finite set of vectors is linearly independent if the sequence obtained by ordering them is linearly independent. In other words, one has the following result, which is often useful: a sequence of vectors is linearly independent if and only if it does not contain the same vector twice and the set of its vectors is linearly independent.
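For readers who want to test the definition computationally, here is a minimal NumPy sketch (our own illustration; the helper name is_linearly_independent and the tolerance choice are assumptions, not part of the summary above). It relies on the standard fact that finitely many vectors in R^n are linearly independent exactly when the matrix having them as columns has rank equal to the number of vectors:

```python
import numpy as np

def is_linearly_independent(vectors, tol=1e-10):
    """Return True if the given vectors (iterable of 1-D arrays) are
    linearly independent, judged by the numerical rank of the matrix
    whose columns are the vectors."""
    A = np.column_stack([np.asarray(v, dtype=float) for v in vectors])
    # Independent iff rank(A) equals the number of vectors (columns).
    return np.linalg.matrix_rank(A, tol=tol) == A.shape[1]

# (1, 0, 0), (0, 1, 0), (1, 1, 0) are dependent: the third is the sum
# of the first two, so a nontrivial combination gives the zero vector.
print(is_linearly_independent([[1, 0, 0], [0, 1, 0], [1, 1, 0]]))  # False
print(is_linearly_independent([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))  # True
```

Note that numerical rank depends on a tolerance; for exact results over the rationals, a symbolic computation would be needed instead.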
About this result
This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.
Related concepts (35)
Scalar (mathematics)
A scalar is an element of a field which is used to define a vector space. In linear algebra, real numbers or generally elements of a field are called scalars and relate to vectors in an associated vector space through the operation of scalar multiplication (defined in the vector space), in which a vector can be multiplied by a scalar in the defined way to produce another vector. Generally speaking, a vector space may be defined over any field instead of the real numbers (for example, the complex numbers).
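As a concrete illustration of scalar multiplication (a minimal sketch we add here, assuming real scalars and NumPy arrays as the vectors):

```python
import numpy as np

v = np.array([2.0, -1.0, 3.0])  # a vector in R^3
c = 2.5                         # a scalar from the field of real numbers

# Scalar multiplication acts componentwise and yields another vector.
print(c * v)  # [ 5.  -2.5  7.5]
```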
Matrix (mathematics)
In mathematics, a matrix (plural matrices) is a rectangular array or table of numbers, symbols, or expressions, arranged in rows and columns, which is used to represent a mathematical object or a property of such an object. For example, $\begin{bmatrix} 1 & 9 & -13 \\ 20 & 5 & -6 \end{bmatrix}$ is a matrix with two rows and three columns. This is often referred to as a "two by three matrix", a "$2 \times 3$ matrix", or a matrix of dimension $2 \times 3$. Without further specifications, matrices represent linear maps, and allow explicit computations in linear algebra.
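To illustrate the last point, here is a minimal sketch (our own example, not from the page) of how a 2 x 3 matrix acts as a linear map from R^3 to R^2 via matrix-vector multiplication:

```python
import numpy as np

# A 2x3 matrix defines a linear map from R^3 to R^2.
A = np.array([[1.0, 9.0, -13.0],
              [20.0, 5.0, -6.0]])
x = np.array([1.0, 0.0, 2.0])

# Applying the map is matrix-vector multiplication.
print(A @ x)  # [-25.   8.]
```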
Related courses (193)
MATH-506: Topology IV.b - cohomology rings
Singular cohomology is defined by dualizing the singular chain complex for spaces. We will study its basic properties, see how it acquires a multiplicative structure and becomes a graded commutative algebra.
MATH-111(e): Linear Algebra
The objective of the course is to introduce the basic notions of linear algebra and its applications.
MATH-111(a): Linear Algebra
The objective of the course is to introduce the basic notions of linear algebra and its applications.
Related lectures (1,000)
Matrix Similarity and Diagonalization
Explores matrix similarity, diagonalization, characteristic polynomials, eigenvalues, and eigenvectors in linear algebra.
Linear Algebra Fundamentals
Introduces fundamental linear algebra concepts such as equations, matrices, and determinants.
Diagonalization of Matrices
Explores the diagonalization of matrices through eigenvalues and eigenvectors, emphasizing the importance of bases and subspaces.