Lecture

Convexity: Functions and Global Minima

Description

This lecture covers the concept of convex functions on linear spaces, including strict and strong convexity, operations preserving convexity, and sets of global minima. The relationship between differentiability and convexity is explored through various definitions and theorems.
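
For orientation, the standard definitions behind these topics can be summarised as follows, for a function f : E → R on a linear space E (a generic restatement, not an excerpt from the lecture notes; the strong-convexity version assumes E carries a norm):

```latex
% convexity: for all x, y in E and t in [0,1]
f\big((1-t)x + t y\big) \le (1-t)\, f(x) + t\, f(y)
% strict convexity: strict inequality whenever x \ne y and t \in (0,1)
f\big((1-t)x + t y\big) < (1-t)\, f(x) + t\, f(y)
% \mu-strong convexity (E normed, \mu > 0)
f\big((1-t)x + t y\big) \le (1-t)\, f(x) + t\, f(y) - \tfrac{\mu}{2}\, t(1-t)\, \|x - y\|^2
% Global minima: if f is convex, its set of global minimizers is a convex set (possibly empty);
% if f is strictly convex, that set contains at most one point.
```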

In course

MATH-329: Continuous optimization

This course introduces students to continuous, nonlinear optimization. We study the theory of optimization with continuous variables (with full proofs), and we analyze and implement important algorithms.

Related concepts (71)

Related lectures (129)

Convex function

In mathematics, a real-valued function is called convex if the line segment between any two distinct points on the graph of the function lies above or on the graph between the two points. Equivalently, a function is convex if its epigraph (the set of points on or above the graph of the function) is a convex set. A twice-differentiable function of a single variable is convex if and only if its second derivative is nonnegative on its entire domain.
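
As a small, hypothetical illustration (not part of the lecture material), the chord definition above can be probed numerically for one-dimensional examples:

```python
import numpy as np

def violates_convexity(f, xs, n_trials=10_000, seed=0):
    """Sample random chords and report whether the convexity inequality
    f((1-t)x + t y) <= (1-t) f(x) + t f(y) is ever violated on them."""
    rng = np.random.default_rng(seed)
    for _ in range(n_trials):
        x, y = rng.choice(xs, size=2)
        t = rng.uniform()
        lhs = f((1 - t) * x + t * y)
        rhs = (1 - t) * f(x) + t * f(y)
        if lhs > rhs + 1e-12:   # small tolerance for floating-point noise
            return True
    return False

xs = np.linspace(-3.0, 3.0, 1001)
print(violates_convexity(lambda x: x**4, xs))        # False: no violation found (x^4 is convex)
print(violates_convexity(lambda x: np.sin(x), xs))   # True: sin is not convex on [-3, 3]
```

Such a random check can only exhibit violations; it cannot certify convexity, for which the second-derivative criterion above is the usual tool in one dimension.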

Linear map

In mathematics, and more specifically in linear algebra, a linear map (also called a linear mapping, linear transformation, vector space homomorphism, or in some contexts linear function) is a mapping between two vector spaces that preserves the operations of vector addition and scalar multiplication. The same names and the same definition are also used for the more general case of modules over a ring; see Module homomorphism. If a linear map is a bijection then it is called a linear isomorphism.
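
In symbols, a map T : V → W between vector spaces over a field F is linear exactly when it satisfies (a generic restatement):

```latex
T(u + v) = T(u) + T(v) \qquad \text{for all } u, v \in V
T(c\, v) = c\, T(v) \qquad \text{for all } c \in F,\ v \in V
```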

Linear span

In mathematics, the linear span (also called the linear hull or just span) of a set S of vectors (from a vector space), denoted span(S), is defined as the set of all linear combinations of the vectors in S. For example, two linearly independent vectors span a plane. The linear span can be characterized either as the intersection of all linear subspaces that contain S, or as the smallest subspace containing S. The linear span of a set of vectors is therefore a vector space itself. Spans can be generalized to matroids and modules.
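
Written out (again a generic restatement), the span collects all finite linear combinations of elements of S:

```latex
\operatorname{span}(S) = \Big\{\, \textstyle\sum_{i=1}^{k} c_i\, v_i \;:\; k \in \mathbb{N},\ v_1, \dots, v_k \in S,\ c_1, \dots, c_k \ \text{scalars} \,\Big\}
```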

Vector space

In mathematics and physics, a vector space (also called a linear space) is a set whose elements, often called vectors, may be added together and multiplied ("scaled") by numbers called scalars. Scalars are often real numbers, but can be complex numbers or, more generally, elements of any field. The operations of vector addition and scalar multiplication must satisfy certain requirements, called vector axioms. The terms real vector space and complex vector space are often used to specify the nature of the scalars: real numbers or complex numbers, respectively.
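
Concretely, a vector space over a field F packages the two operations below, subject to the vector axioms (associativity and commutativity of addition, a zero vector, additive inverses, and compatibility and distributivity rules for scaling, a few representative ones of which are shown):

```latex
+ : V \times V \to V, \qquad \cdot : F \times V \to V
u + v = v + u, \qquad a\,(u + v) = a\,u + a\,v, \qquad (a + b)\,v = a\,v + b\,v, \qquad 1\,v = v
```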

Kernel (linear algebra)

In mathematics, the kernel of a linear map, also known as the null space or nullspace, is the linear subspace of the domain of the map which is mapped to the zero vector. That is, given a linear map L : V → W between two vector spaces V and W, the kernel of L is the vector space of all elements v of V such that L(v) = 0, where 0 denotes the zero vector in W, or more symbolically: ker(L) = { v ∈ V : L(v) = 0 }. The kernel of L is a linear subspace of the domain V.
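
For a matrix A representing a linear map between finite-dimensional spaces, the kernel can be computed numerically; the sketch below is an illustrative NumPy recipe (not taken from the course material) based on the singular value decomposition:

```python
import numpy as np

def null_space_basis(A, tol=1e-12):
    """Return an orthonormal basis (as columns) of the kernel of A,
    i.e. of all x with A @ x = 0, via the singular value decomposition."""
    _, s, vt = np.linalg.svd(A)
    # Right singular vectors whose singular value is numerically zero span the kernel
    rank = int(np.sum(s > tol * max(A.shape) * (s[0] if s.size else 1.0)))
    return vt[rank:].T

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])       # rank 1, so the kernel is 2-dimensional
N = null_space_basis(A)
print(N.shape)                         # (3, 2)
print(np.allclose(A @ N, 0.0))         # True: every basis column is mapped to zero
```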

Convex Functions

Covers the properties and operations of convex functions.
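
The standard convexity-preserving operations in this setting include nonnegative combinations, pointwise suprema, and pre-composition with affine maps (stated here without proof as a generic summary):

```latex
f, g \ \text{convex},\ \alpha, \beta \ge 0 \;\Rightarrow\; \alpha f + \beta g \ \text{convex}
f_i \ \text{convex for all } i \in I \;\Rightarrow\; \sup_{i \in I} f_i \ \text{convex}
f \ \text{convex},\ x \mapsto Ax + b \ \text{affine} \;\Rightarrow\; x \mapsto f(Ax + b) \ \text{convex}
```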

Geodesic Convexity: Theory and Applications

Explores geodesic convexity in metric spaces and its applications, discussing properties and the stability of inequalities.
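
One common formulation (not necessarily the exact definition used in this lecture): a function f on a geodesic metric space is geodesically convex if it is convex along every constant-speed geodesic γ joining two points,

```latex
f(\gamma(t)) \;\le\; (1-t)\, f(\gamma(0)) + t\, f(\gamma(1)) \qquad \text{for all } t \in [0,1].
```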

KKT and Convex Optimization

Covers the KKT conditions and convex optimization, discussing constraint qualifications and tangent cones of convex sets.
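
For reference, the KKT conditions for min f(x) subject to g_i(x) ≤ 0 and h_j(x) = 0, assuming differentiability and a suitable constraint qualification (a standard statement, not a transcription of the lecture):

```latex
\nabla f(x^\star) + \sum_i \lambda_i\, \nabla g_i(x^\star) + \sum_j \mu_j\, \nabla h_j(x^\star) = 0   % stationarity
g_i(x^\star) \le 0, \qquad h_j(x^\star) = 0                                                            % primal feasibility
\lambda_i \ge 0                                                                                         % dual feasibility
\lambda_i\, g_i(x^\star) = 0                                                                            % complementary slackness
```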

Gradient Descent for Linear MSE

Explores Gradient Descent for Linear MSE in machine learning, covering computation, complexity, variants, Stochastic Gradient Descent, penalty functions, implementation issues, and non-convex optimization.
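
A minimal sketch of gradient descent on the linear mean-squared-error objective (1/(2n))·||Xw − y||², with an illustrative fixed step size; this is a generic implementation, not the course's reference code:

```python
import numpy as np

def gradient_descent_mse(X, y, lr=0.1, n_iters=1000):
    """Minimize the objective (1/(2n)) * ||X w - y||^2 by gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y) / n    # gradient of the MSE objective at w
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=200)
print(gradient_descent_mse(X, y))       # close to [1.0, -2.0, 0.5]
```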

Convex Optimization: Gradient Descent

Explores VC dimension, gradient descent, convex sets, and Lipschitz functions in convex optimization.
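
One representative fact in this circle of ideas (a standard result, stated here without proof): for a convex function f with L-Lipschitz gradient, gradient descent with step size 1/L satisfies

```latex
x_{k+1} = x_k - \tfrac{1}{L}\, \nabla f(x_k), \qquad
f(x_k) - f(x^\star) \;\le\; \frac{L\, \|x_0 - x^\star\|^2}{2k}.
```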