Stochastic Gradient Descent: Optimization and Convergence Analysis
Related lectures (28)
Stochastic Gradient Descent: Optimization and Convergence
Explores stochastic gradient descent, covering convergence rates, acceleration, and practical applications in optimization problems.
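The convergence behaviour this lecture covers can be illustrated with a minimal sketch. The objective (1-D least squares), the data, and the decaying step size η_t = 0.1/√(t+1) below are illustrative assumptions, not taken from the lecture; the 1/√t decay is the standard choice that gives the O(1/√T) rate for convex problems.

```python
import random

def sgd_least_squares(xs, ys, steps=10000, seed=0):
    """Minimal SGD sketch for 1-D least squares: minimise E[(w*x - y)^2].

    Uses the decaying step size eta_t = 0.1 / sqrt(t + 1), a standard
    choice yielding O(1/sqrt(T)) convergence for convex objectives.
    """
    rng = random.Random(seed)
    w = 0.0
    for t in range(steps):
        i = rng.randrange(len(xs))               # sample one data point
        grad = 2 * (w * xs[i] - ys[i]) * xs[i]   # stochastic gradient
        w -= 0.1 / (t + 1) ** 0.5 * grad         # decaying step size
    return w

# Illustrative data generated from y = 3x, so SGD should recover w close to 3.
xs = [0.5, 1.0, 1.5, 2.0]
ys = [3 * x for x in xs]
```

Because the data here are exactly interpolated by w = 3, the stochastic gradient vanishes at the optimum and the iterates converge cleanly; on noisy data the decaying step size is what suppresses the gradient-noise floor.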
Iterative Methods for Nonlinear Equations
Explores iterative methods for solving nonlinear equations, discussing convergence properties and implementation details.
Newton's Method: Convergence Analysis
Explores the convergence analysis of Newton's method for solving nonlinear equations, discussing linear and quadratic convergence properties.
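The quadratic convergence mentioned above is easy to observe numerically. The following sketch (the test function f(x) = x² − 2 and starting point are illustrative choices, not from the lecture) applies the standard Newton iteration to compute √2:

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton iteration x_{k+1} = x_k - f(x_k) / f'(x_k).

    Converges quadratically near a simple root where f'(x) != 0:
    the error is roughly squared at each step.
    """
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / df(x)
    return x

# Root of f(x) = x^2 - 2 is sqrt(2); from x0 = 1.5 the iterates are
# 1.4166..., 1.4142156..., 1.41421356..., doubling correct digits each step.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.5)
```

Quadratic convergence holds only locally; far from the root, or where f'(x) ≈ 0, the iteration can stall or diverge, which is why convergence analyses distinguish the linear and quadratic regimes.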
Optimisation in Energy Systems
Explores optimization in energy system modeling, covering decision variables, objective functions, and different strategies with their pros and cons.
Convergence in Law: Theorem and Proof
Explores convergence in law for random variables, including Kolmogorov's theorem and proofs based on probability lemmas.
Generalized Integrals and Convergence Criteria
Covers generalized integrals, convergence criteria, series convergence, and harmonic series in analysis.
Subsequences and Bolzano-Weierstrass Theorem
Covers the proof of the Squeeze Theorem, Quotient Criteria, and the Bolzano-Weierstrass Theorem.
Geometric Series: Convergence and Limit
Explores the convergence and limit of geometric series, demonstrating mathematical properties and applications.
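The limit in question is the classical identity Σ_{k≥0} r^k = 1/(1 − r) for |r| < 1. A small numerical check (the helper name and the sample ratio r = 0.5 are illustrative):

```python
def geometric_partial_sum(r, n):
    """Partial sum S_n = sum_{k=0}^{n-1} r^k, via the closed form
    (1 - r^n) / (1 - r) for r != 1; S_n = n when r = 1."""
    return (1 - r ** n) / (1 - r) if r != 1 else float(n)

# For |r| < 1, r^n -> 0, so S_n -> 1 / (1 - r); e.g. r = 0.5 gives limit 2.
limit = 1 / (1 - 0.5)
```

For |r| ≥ 1 the terms do not tend to zero, so the series diverges; the closed form makes this visible, since r^n no longer vanishes.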