
Lecture: Linear Combinations: Vectors and Matrices

Description

This lecture introduces the concept of linear combinations of vectors and matrices, exploring how different vectors can be combined to generate new vectors in R^n. The instructor explains the operations of addition, scalar multiplication, and subtraction on vectors, highlighting the properties and geometric interpretations of these operations. Through examples, the lecture demonstrates how linear combinations of vectors can form lines, planes, or grids in R^n, depending on the number of vectors involved. The lecture also covers the use of matrices to represent linear transformations and shows how a matrix-vector product can be interpreted as a linear combination of the columns of the matrix. The discussion extends to solving systems of linear equations using matrix operations and to understanding the span of a set of vectors in R^n.
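The column interpretation of the matrix-vector product mentioned above can be sketched in NumPy (the matrix and vector here are illustrative choices, not from the lecture):

```python
import numpy as np

# A small example matrix and vector, chosen for illustration.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([5.0, 6.0])

# Standard matrix-vector product.
product = A @ x

# The same result, built as a linear combination of the columns of A:
# x[0] times the first column plus x[1] times the second column.
column_combination = x[0] * A[:, 0] + x[1] * A[:, 1]
```

Both expressions produce the same vector, which is exactly the "combination of columns" reading of matrix-vector multiplication.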


In course

MATH-111(e): Linear Algebra

The aim of the course is to introduce the basic notions of linear algebra and its applications.

Related concepts (144)

Linear independence

In the theory of vector spaces, a set of vectors is said to be linearly independent if there exists no nontrivial linear combination of the vectors that equals the zero vector. If such a linear combination exists, then the vectors are said to be linearly dependent. These concepts are central to the definition of dimension. A vector space can be of finite dimension or infinite dimension depending on the maximum number of linearly independent vectors. The definition of linear dependence and the ability to determine whether a subset of vectors in a vector space is linearly dependent are central to determining the dimension of a vector space.

Conical combination

Given a finite number of vectors x₁, …, xₖ in a real vector space, a conical combination, conical sum, or weighted sum of these vectors is a vector of the form α₁x₁ + α₂x₂ + ⋯ + αₖxₖ, where α₁, …, αₖ are non-negative real numbers. The name derives from the fact that a conical sum of vectors defines a cone (possibly in a lower-dimensional subspace). The set of all conical combinations for a given set S is called the conical hull of S and denoted cone(S) or coni(S). That is, cone(S) = {α₁x₁ + ⋯ + αₖxₖ : xᵢ ∈ S, αᵢ ≥ 0, k ∈ ℕ}. By taking k = 0, it follows that the zero vector (origin) belongs to all conical hulls (since the summation becomes an empty sum).
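A minimal NumPy sketch of the definition (the vectors and weights are illustrative): a conical combination uses only non-negative weights, and taking all weights to be zero recovers the origin.

```python
import numpy as np

# Two example vectors in R^2.
u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

# A conical combination: non-negative weights only.
w = 2.0 * u + 3.0 * v

# All-zero weights (the k = 0 / empty-sum case) give the origin,
# so the origin lies in every conical hull.
origin = 0.0 * u + 0.0 * v
```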

Vector space

In mathematics and physics, a vector space (also called a linear space) is a set whose elements, often called vectors, may be added together and multiplied ("scaled") by numbers called scalars. Scalars are often real numbers, but can be complex numbers or, more generally, elements of any field. The operations of vector addition and scalar multiplication must satisfy certain requirements, called vector axioms. The terms real vector space and complex vector space are often used to specify the nature of the scalars: real coordinate space or complex coordinate space.

Linear span

In mathematics, the linear span (also called the linear hull or just span) of a set S of vectors (from a vector space), denoted span(S), is defined as the set of all linear combinations of the vectors in S. For example, two linearly independent vectors span a plane. The linear span can be characterized either as the intersection of all linear subspaces that contain S, or as the smallest subspace containing S. The linear span of a set of vectors is therefore a vector space itself. Spans can be generalized to matroids and modules.
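The plane-spanning example above can be checked numerically: with two independent vectors as columns, least squares recovers the coefficients of any vector in their span. A NumPy sketch with illustrative vectors:

```python
import numpy as np

# Two linearly independent vectors in R^3; their span is a plane.
a = np.array([1.0, 0.0, 1.0])
b = np.array([0.0, 1.0, 1.0])
A = np.column_stack([a, b])

# A target built as 2a - b, so it lies in span{a, b} by construction.
target = 2.0 * a - 1.0 * b

# Least squares finds coefficients c with A @ c as close to target as possible;
# if the target is in the span, the fit is exact.
coeffs, _residuals, _rank, _sv = np.linalg.lstsq(A, target, rcond=None)
in_span = np.allclose(A @ coeffs, target)
```

The recovered coefficients are (2, -1), and `in_span` is `True`; a vector off the plane would leave a nonzero residual instead.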

Rank (linear algebra)

In linear algebra, the rank of a matrix A is the dimension of the vector space generated (or spanned) by its columns. This corresponds to the maximal number of linearly independent columns of A. This, in turn, is identical to the dimension of the vector space spanned by its rows. Rank is thus a measure of the "nondegenerateness" of the system of linear equations and linear transformation encoded by A. There are multiple equivalent definitions of rank. A matrix's rank is one of its most fundamental characteristics.
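The equality of column rank and row rank stated above can be illustrated in NumPy (the matrix is an arbitrary example with one dependent column):

```python
import numpy as np

# Third column is the sum of the first two, so only 2 columns are independent.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

# Rank from the columns of A, and rank from the rows (columns of A^T):
col_rank = np.linalg.matrix_rank(A)
row_rank = np.linalg.matrix_rank(A.T)
```

Both calls return 2, consistent with the fact that the dimension of the column space always equals the dimension of the row space.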

Related lectures (1,000)

Matrix Equations: Linear Combinations

Covers matrix equations as linear combinations, vector spaces, and geometric interpretations.

Linear Equations: Vectors and Matrices (MATH-111(e): Linear Algebra)

Covers linear equations, vectors, and matrices, exploring their fundamental concepts and applications.

Linear Algebra: Applications and Matrices (MATH-111(e): Linear Algebra)

Explores linear algebra concepts through examples and theorems, focusing on matrices and their operations.

Vector Spaces: Axioms and Examples (MATH-111(e): Linear Algebra)

Covers the axioms and examples of vector spaces, including matrices and polynomials.

Matrix Similarity and Diagonalization

Explores matrix similarity, diagonalization, characteristic polynomials, eigenvalues, and eigenvectors in linear algebra.