Lecture
Objective function, Convexity
Related lectures (27)
Convergence Criteria: Necessary Conditions
Explains necessary conditions for convergence in optimization problems.
Stochastic Gradient Descent: Non-convex Optimization Techniques
Discusses Stochastic Gradient Descent and its application in non-convex optimization, focusing on convergence rates and challenges in machine learning.
Optimization Problems
Explores optimization problems to find global extrema of functions and shapes.
Extrema of Functions
Discusses local extrema, concavity, convexity, and inflection points of functions.
Optimization Techniques: Gradient Descent and Convex Functions
Provides an overview of optimization techniques, focusing on gradient descent and properties of convex functions in machine learning.
Newton's local method: Geometric interpretation
Explores the geometric interpretation of Newton's method in optimization problems.
Optimization Methods: Convergence and Trade-offs
Covers optimization methods, convergence guarantees, trade-offs, and variance reduction techniques in numerical optimization.
Gradient Descent: Principles and Applications
Covers gradient descent, its principles, applications, and convergence rates in optimization for machine learning.
Energy optimization strategies
Covers brainstorming options for smart operation changes, heat recovery, and PV panel performance.
Faster Gradient Descent: Projected Optimization Techniques
Covers faster gradient descent methods and projected gradient descent for constrained optimization in machine learning.
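The recurring themes above (objective functions, convexity, gradient descent) can be illustrated with a minimal sketch, not taken from any of the listed lectures: gradient descent on a simple convex objective, where convexity guarantees that the iterates approach the unique global minimum for a suitable step size. The function name `grad_descent` and the chosen quadratic are illustrative assumptions.

```python
# Minimal illustrative sketch: gradient descent on the convex quadratic
# f(x) = (x - 3)^2, whose unique global minimum is at x = 3.
# Because f is convex, following the negative gradient with a small
# fixed step size converges to this global minimum.

def grad_descent(grad, x0, lr=0.1, steps=100):
    """Iterate x <- x - lr * grad(x) for a fixed number of steps."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# f(x) = (x - 3)^2  =>  f'(x) = 2 * (x - 3)
x_min = grad_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # converges toward 3.0
```

With this step size the update is the contraction x ← 0.8·x + 0.6, so the distance to the minimizer shrinks geometrically; on a non-convex objective the same iteration would only be guaranteed to reach a stationary point.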