Optimization Basics: Linear Algebra, Analysis, Convexity
Related lectures (31)
Page 3 of 4
Equivalent norms: properties and proofs
Explores equivalent norms on a vector space and their continuity properties, including proofs of norm equivalence.
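On R^n all norms are equivalent; for instance, ||x||_2 <= ||x||_1 <= sqrt(n)·||x||_2 for every x. A quick numerical check of these bounds (an illustrative sketch, not material from the lecture itself):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
for _ in range(1000):
    x = rng.standard_normal(n)
    l1 = np.linalg.norm(x, 1)
    l2 = np.linalg.norm(x, 2)
    # Equivalence bounds on R^n: ||x||_2 <= ||x||_1 <= sqrt(n) * ||x||_2
    assert l2 <= l1 + 1e-12
    assert l1 <= np.sqrt(n) * l2 + 1e-12
```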
Optimality of Convergence Rates: Accelerated Gradient Descent
Explores the optimality of convergence rates in convex optimization, focusing on accelerated gradient descent and adaptive methods.
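Nesterov's accelerated gradient method improves the O(1/k) rate of plain gradient descent to O(1/k^2) on smooth convex problems. A minimal sketch comparing the two on a made-up positive-definite quadratic (problem data and iteration counts are assumptions for illustration):

```python
import numpy as np

# Smooth convex quadratic f(x) = 0.5 x^T A x - b^T x with A positive definite
rng = np.random.default_rng(1)
Q = rng.standard_normal((20, 20))
A = Q.T @ Q + np.eye(20)           # positive definite by construction
b = rng.standard_normal(20)
x_star = np.linalg.solve(A, b)     # exact minimizer for reference

L = np.linalg.eigvalsh(A).max()    # Lipschitz constant of the gradient
step = 1.0 / L

def grad(x):
    return A @ x - b

# Plain gradient descent
x_gd = np.zeros(20)
for _ in range(200):
    x_gd = x_gd - step * grad(x_gd)

# Nesterov's accelerated gradient with momentum coefficient (k - 1)/(k + 2)
y = np.zeros(20)
x_acc = np.zeros(20)
for k in range(1, 201):
    x_new = y - step * grad(y)
    y = x_new + (k - 1) / (k + 2) * (x_new - x_acc)
    x_acc = x_new

err_gd = np.linalg.norm(x_gd - x_star)
err_agd = np.linalg.norm(x_acc - x_star)
```

On this instance the accelerated iterate ends up much closer to the minimizer than plain gradient descent after the same number of iterations.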
Gradient Descent Methods: Theory and Computation
Explores gradient descent methods for smooth convex and non-convex problems, covering iterative strategies, convergence rates, and challenges in optimization.
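On non-convex problems, gradient descent is only guaranteed to reach a stationary point, not the global minimum. A tiny one-dimensional sketch (the function, starting point, and step size are chosen for illustration):

```python
# f(x) = x^4 - 3 x^2 is smooth but non-convex; gradient descent converges
# to a stationary point, here the local minimizer sqrt(3/2).
def grad(x):
    return 4 * x**3 - 6 * x

x = 0.5          # assumed starting point
step = 0.05      # fixed step size, small enough for stability here
for _ in range(500):
    x = x - step * grad(x)

# x approaches sqrt(3/2) ≈ 1.2247, where the gradient vanishes
```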
Proximal Gradient Descent: Optimization Techniques in Machine Learning
Discusses proximal gradient descent and its applications in optimizing machine learning algorithms.
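A standard instance of proximal gradient descent is ISTA for the lasso, where the proximal operator of the l1 penalty is soft-thresholding. A minimal sketch with made-up problem data:

```python
import numpy as np

# Proximal gradient (ISTA) for the lasso:
#   min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
def soft_threshold(v, t):
    # prox of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
x_true = np.zeros(10)
x_true[:3] = [2.0, -1.0, 0.5]       # assumed sparse ground truth
b = A @ x_true
lam = 0.1

L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the smooth part
step = 1.0 / L
x = np.zeros(10)
for _ in range(2000):
    # gradient step on the smooth term, then prox step on the l1 term
    x = soft_threshold(x - step * A.T @ (A @ x - b), step * lam)

obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.linalg.norm(x, 1)
```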
Euclidean Spaces: Properties and Concepts
Covers the properties of Euclidean spaces, focusing on R^n and its applications in analysis.
Optimization Methods
Covers optimization methods without constraints, including gradient methods and line search in the quadratic case.
Gradient Descent: Principles and Applications
Covers gradient descent, its principles, applications, and convergence rates in optimization for machine learning.
Vectors and Norms: Introduction to Linear Algebra Concepts
Covers essential concepts of vectors, norms, and their properties in linear algebra.
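The common vector norms and the triangle inequality can be checked directly on a small example (values chosen for illustration):

```python
import numpy as np

x = np.array([3.0, -4.0])
y = np.array([1.0, 2.0])

# Common norms of x = (3, -4)
assert np.linalg.norm(x, 1) == 7.0         # |3| + |-4|
assert np.linalg.norm(x, 2) == 5.0         # sqrt(9 + 16)
assert np.linalg.norm(x, np.inf) == 4.0    # max(|3|, |-4|)

# Triangle inequality: ||x + y|| <= ||x|| + ||y||
assert np.linalg.norm(x + y) <= np.linalg.norm(x) + np.linalg.norm(y)
```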
Singular Values and Norms: Understanding Linear Maps with SVD
Explores singular value decomposition and its role in understanding linear maps.
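The key facts about the SVD of a linear map can be verified numerically: the factorization reconstructs the matrix, the operator 2-norm equals the largest singular value, and the singular values are the square roots of the eigenvalues of A^T A (random test matrix assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Reconstruction: A = U diag(s) V^T
assert np.allclose(A, U @ np.diag(s) @ Vt)

# The operator 2-norm of A is its largest singular value
assert np.isclose(np.linalg.norm(A, 2), s[0])

# Singular values squared are the eigenvalues of A^T A
eigs = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
assert np.allclose(s**2, eigs)
```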
Orthogonality and Least Squares Method
Explores orthogonality, dot product properties, vector norms, and angle definitions in vector spaces.
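The link between orthogonality and least squares is the normal equations: at the least-squares solution, the residual is orthogonal to the column space of A. A minimal sketch on made-up data:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 2))
b = rng.standard_normal(6)

# Least-squares solution of min_x ||A x - b||_2
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
residual = b - A @ x_star

# Normal equations A^T (A x* - b) = 0: the residual is orthogonal
# to every column of A
assert np.allclose(A.T @ residual, 0.0)
```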