Quasi-Newton Methods
Related lectures (26)
Coordinate Descent: Efficient Optimization Techniques
Covers coordinate descent, a method for optimizing functions by updating one coordinate at a time.
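A minimal sketch of that idea, assuming a smooth objective whose gradient is available; the objective, step size, and iteration count below are illustrative, not taken from the lecture:

```python
import numpy as np

def coordinate_descent(grad, x0, step=0.1, iters=100):
    """Minimize a smooth function by cycling through the coordinates,
    taking a gradient step in one coordinate at a time."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        for i in range(x.size):
            g = grad(x)           # full gradient; only component i is used here
            x[i] -= step * g[i]   # update a single coordinate
    return x

# Illustrative example: minimize f(x) = x1^2 + 2*x2^2, gradient (2*x1, 4*x2)
x_min = coordinate_descent(lambda x: np.array([2 * x[0], 4 * x[1]]),
                           [3.0, -2.0])
```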
Optimization Methods: Theory Discussion
Explores optimization theory, covering unconstrained problems, linear programming, and heuristic approaches.
Optimization Methods in Machine Learning
Explores optimization methods in machine learning, emphasizing gradients, costs, and computational efforts for efficient model training.
Newton's Method: Optimization Techniques
Explores optimization techniques like gradient descent, line search, and Newton's method for efficient problem-solving.
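A minimal sketch of Newton's method for minimization under the usual assumptions (twice-differentiable objective, invertible Hessian); the example function, gradient, and Hessian are illustrative:

```python
import numpy as np

def newton_minimize(grad, hess, x0, iters=20, tol=1e-10):
    """Newton's method for unconstrained minimization:
    solve H(x) d = -grad(x), then step x <- x + d."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)  # Newton direction
        x = x + d                         # unit step; a line search would scale d
    return x

# Illustrative example: f(x) = x1^2 + 3*x2^2 (minimum at the origin)
x_star = newton_minimize(lambda x: np.array([2 * x[0], 6 * x[1]]),
                         lambda x: np.diag([2.0, 6.0]),
                         [1.0, 1.0])
```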
Optimization without Constraints: Gradient Method
Covers optimization without constraints using the gradient method to find the function's minimum.
Convergence Criteria: Necessary Conditions
Explains necessary conditions for convergence in optimization problems.
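For a differentiable unconstrained problem, the necessary conditions in question are presumably the standard ones, stated in the usual notation:

```latex
% First-order necessary condition: at a local minimizer x* of a
% differentiable f, the gradient vanishes.
\nabla f(x^\ast) = 0
% Second-order necessary condition: the Hessian at x* is
% positive semidefinite.
\nabla^2 f(x^\ast) \succeq 0
```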
Gradient Descent
Covers gradient descent in the scalar case, focusing on finding the minimum of a function by iteratively moving in the direction of the negative gradient.
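A minimal scalar sketch of that update, x ← x − α f′(x); the objective, step size, and starting point are illustrative:

```python
def gradient_descent_1d(df, x0, step=0.1, iters=100):
    """Scalar gradient descent: repeatedly step against the derivative."""
    x = float(x0)
    for _ in range(iters):
        x -= step * df(x)  # move in the direction of the negative gradient
    return x

# Illustrative example: minimize f(x) = (x - 3)^2, derivative 2*(x - 3);
# the iterates converge to x = 3
x_min = gradient_descent_1d(lambda x: 2 * (x - 3), x0=0.0)
```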
Root Finding Methods: Secant, Newton, and Fixed Point Iteration
Covers numerical methods for finding roots, including secant, Newton, and fixed point iteration techniques.
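Minimal sketches of two of those techniques, the secant method and fixed-point iteration; the test functions, starting points, and tolerances are illustrative:

```python
import math

def secant(f, x0, x1, tol=1e-10, iters=50):
    """Secant method: Newton's update with the derivative replaced by a
    finite-difference slope through the last two iterates."""
    for _ in range(iters):
        f0, f1 = f(x0), f(x1)
        if abs(f1) < tol:
            break
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)  # secant update
    return x1

def fixed_point(g, x0, tol=1e-10, iters=200):
    """Fixed-point iteration x <- g(x); converges when |g'| < 1 near
    the fixed point."""
    x = x0
    for _ in range(iters):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Illustrative examples: sqrt(2) as a root of x^2 - 2, and the fixed
# point of cos near 0.739
root = secant(lambda x: x**2 - 2, 1.0, 2.0)
fp = fixed_point(math.cos, 1.0)
```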
Mathematics of Data: Computation Role
Explores the role of computation in data mathematics, focusing on iterative methods, optimization, estimators, and descent principles.
Proximal Gradient Descent: Optimization Techniques in Machine Learning
Discusses proximal gradient descent and its applications in optimizing machine learning algorithms.
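A minimal sketch of the proximal gradient step, a gradient step on the smooth term followed by the proximal operator of the nonsmooth term; the soft-thresholding prox, lasso-style example, and step size are illustrative choices, not the lecture's own code:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(grad_smooth, prox, x0, step=0.01, iters=500):
    """Proximal gradient descent: a gradient step on the smooth term,
    followed by the proximal operator of the nonsmooth term."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        x = prox(x - step * grad_smooth(x), step)
    return x

# Illustrative lasso-style example:
# minimize 0.5 * ||A x - b||^2 + lam * ||x||_1
A = np.array([[1.0, 0.0], [0.0, 2.0]])
b = np.array([1.0, -1.0])
lam = 0.1
x_hat = proximal_gradient(lambda x: A.T @ (A @ x - b),
                          lambda v, t: soft_threshold(v, lam * t),
                          np.zeros(2))
```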