Optimal Power Flow and Chebyshev Introduction
Related lectures (24)
Gradient Descent: Principles and Applications
Covers gradient descent, its principles, applications, and convergence rates in optimization for machine learning (a minimal sketch follows this list).
Convex Optimization: Gradient Descent
Explores VC dimension, gradient descent, convex sets, and Lipschitz functions in convex optimization.
Trade-offs in Data and Time
Explores trade-offs between data and time in computational problems, emphasizing diminishing returns and continuous trade-offs.
Conditional Density and Expectation
Explores conditional density, expectations, and independence of random variables with practical examples.
Optimization Techniques: Stochastic Gradient Descent and Beyond
Discusses optimization techniques in machine learning, focusing on stochastic gradient descent and its applications in constrained and non-convex problems (a stochastic variant appears in the sketch after this list).
Elements of Statistics: Probability and Random Variables
Introduces key concepts in probability and random variables, covering statistics, distributions, and covariance.
Proximal Gradient Descent: Optimization Techniques in Machine Learning
Discusses proximal gradient descent and its applications in optimizing machine learning algorithms (see the proximal gradient sketch after this list).
Convex Sets: Mathematical Optimization
Introduces convex optimization, covering convex sets, solution concepts, and efficient numerical methods.
Variational Problems: Convexity and Coercivity
Explores variational problems, emphasizing convexity and coercivity conditions in functionals with integral side constraints.
Convex Optimization: Farkas' Lemma
Covers Farkas' lemma, exploring its relationship to linear programs and the conditions under which it holds (the lemma is stated after this list).
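
Several of the lectures above center on first-order methods. For reference, here is a minimal sketch of plain gradient descent with a fixed step size, together with a stochastic variant; the synthetic least-squares objective, the function names, and the step-size choices are illustrative assumptions, not material taken from the lectures themselves.

```python
import numpy as np

def gradient_descent(grad, x0, step, iters=500):
    """Plain gradient descent with a fixed step size.
    For an L-smooth convex objective, step = 1/L gives the
    classic O(1/k) decrease of the function-value gap."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Illustrative least-squares objective f(x) = 0.5 * ||A x - b||^2,
# whose gradient is A^T (A x - b).  A and b are synthetic.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)

L = np.linalg.eigvalsh(A.T @ A).max()          # smoothness constant
x_gd = gradient_descent(grad_f, np.zeros(5), step=1.0 / L)

# Stochastic variant: use the gradient of one randomly sampled
# summand per step, with a diminishing step size.
x = np.zeros(5)
for k in range(5000):
    i = rng.integers(len(b))
    g = A[i] * (A[i] @ x - b[i])               # grad of 0.5*(a_i @ x - b_i)^2
    x = x - g / (k + 1)
x_sgd = x
```

The fixed step 1/L and the diminishing step 1/(k+1) are the textbook defaults behind the convergence rates these lectures discuss; in practice both are tuned.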
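The proximal gradient entry refers to the standard composite setup: minimize a smooth loss plus a nonsmooth regularizer by alternating a gradient step on the smooth part with a proximal step on the nonsmooth part. A minimal sketch for the l1-regularized least-squares (lasso) case, where the proximal operator reduces to soft-thresholding, follows; the problem instance and names are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam=0.1, iters=500):
    """Proximal gradient descent (ISTA) for the lasso problem
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1:
    a gradient step on the smooth part, then a prox step on the l1 part."""
    step = 1.0 / np.linalg.eigvalsh(A.T @ A).max()   # 1/L for the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = soft_threshold(x - step * A.T @ (A @ x - b), step * lam)
    return x
```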
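For the last entry, one standard statement of Farkas' lemma, which underlies linear-programming duality, is the following theorem of the alternative (notation assumed here, not quoted from the lecture):

```latex
% One standard form of Farkas' lemma: for A in R^{m x n} and b in R^m,
% exactly one of the following two systems has a solution.
\begin{enumerate}
  \item $\exists\, x \in \mathbb{R}^n,\ x \ge 0,\ Ax = b$;
  \item $\exists\, y \in \mathbb{R}^m,\ A^{\top} y \ge 0,\ b^{\top} y < 0$.
\end{enumerate}
```

If both systems held, then $0 \le (A^{\top}y)^{\top}x = y^{\top}Ax = y^{\top}b < 0$, a contradiction; the nontrivial content of the lemma is that at least one system is always feasible.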