Objective function, Gradient and descent
Related lectures (27)
Stochastic Gradient Descent: Non-convex Optimization Techniques
Discusses Stochastic Gradient Descent and its application in non-convex optimization, focusing on convergence rates and challenges in machine learning.
Faster Gradient Descent: Projected Optimization Techniques
Covers faster gradient descent methods and projected gradient descent for constrained optimization in machine learning.
Optimization Basics: Unconstrained Optimization and Gradient Descent
Covers optimization basics, including unconstrained optimization and gradient descent methods for finding optimal solutions.
Gradient Descent: Principles and Applications
Covers gradient descent, its principles, applications, and convergence rates in optimization for machine learning.
Quasi-Newton Optimization
Covers gradient line search methods and optimization techniques with an emphasis on Wolfe conditions and positive definiteness.
Unconstrained Optimization Theory
Explores unconstrained optimization theory, covering global and local minima, convexity, and gradient concepts.
Stochastic Gradient Descent
Explores stochastic gradient descent optimization and the Mean-Field Method in neural networks.
Optimization Techniques: Gradient Descent and Convex Functions
Provides an overview of optimization techniques, focusing on gradient descent and properties of convex functions in machine learning.
Optimization Methods
Covers optimization methods, focusing on gradient methods and line search techniques.
Proximal and Subgradient Descent: Optimization Techniques
Discusses proximal and subgradient descent methods for optimization in machine learning.
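The lectures above all build on the same basic iteration: starting from an initial point, repeatedly step in the direction of the negative gradient of the objective function. A minimal sketch of that idea (an illustrative example only, not drawn from any of the listed lectures; the function f(x) = (x - 3)^2 and the step size are assumptions for demonstration):

```python
# Plain gradient descent on a one-dimensional objective.
# Example objective: f(x) = (x - 3)^2, with gradient f'(x) = 2 * (x - 3).

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Iterate x <- x - lr * grad(x) for a fixed number of steps."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# The minimizer of f is x = 3; descent from x0 = 0 converges toward it.
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

For this quadratic objective each step shrinks the distance to the minimizer by a constant factor (1 - 2*lr), which is the kind of linear convergence rate several of the lectures analyze; non-convex objectives, stochastic gradients, and constraints (projected or proximal steps) modify this basic loop.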