Polynomial Regression and Gradient Descent
Related lectures (29)
Gradient Descent: Linear Regression
Covers the concept of gradient descent for linear regression, explaining the iterative process of updating parameters.
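The iterative parameter update this lecture describes can be sketched as batch gradient descent on the least-squares objective. This is an illustrative sketch, not the lecture's own code; the toy data, learning rate, and iteration count are arbitrary assumptions:

```python
import numpy as np

def gradient_descent_linreg(X, y, lr=0.1, n_iters=1000):
    """Fit weights minimizing mean squared error via batch gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        grad = (2.0 / n) * X.T @ (X @ w - y)  # gradient of (1/n)||Xw - y||^2
        w -= lr * grad
    return w

# Toy data: y = 1 + 2x, with a column of ones as the bias feature
X = np.c_[np.ones(50), np.linspace(0.0, 1.0, 50)]
y = X @ np.array([1.0, 2.0])
w = gradient_descent_linreg(X, y)
```

The learning rate must stay below 2 divided by the largest eigenvalue of the data's (scaled) Gram matrix for the iteration to converge.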
From Stochastic Gradient Descent to Non-Smooth Optimization
Covers stochastic optimization, sparsity, and non-smooth minimization via subgradient descent.
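For the non-smooth case, a subgradient step replaces the gradient step. A minimal sketch on a hypothetical objective f(w) = |w − 3|, whose subgradient is sign(w − 3), using the common diminishing 1/√t step size (both choices are illustrative assumptions):

```python
import numpy as np

def subgradient_descent(subgrad, w0, n_iters=2000):
    """Minimize a convex non-smooth function by subgradient steps
    with a diminishing 1/sqrt(t) step size."""
    w = w0
    for t in range(1, n_iters + 1):
        w = w - (1.0 / np.sqrt(t)) * subgrad(w)
    return w

# f(w) = |w - 3| is non-differentiable at its minimum; sign(w - 3) is a subgradient
w_star = subgradient_descent(lambda w: np.sign(w - 3.0), w0=0.0)
```

Unlike gradient descent on smooth objectives, a constant step size would not converge here; the iterate oscillates around the minimizer with amplitude shrinking with the step size.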
Cross-validation & Regularization
Explores polynomial curve fitting, kernel functions, and regularization techniques, emphasizing the interplay between model complexity and overfitting.
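Cross-validation as covered here can be sketched for choosing a polynomial degree: hold out each fold in turn, fit on the rest, and average the held-out errors. The synthetic sin(πx) data and the degrees compared are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 40)
y = np.sin(np.pi * x) + 0.1 * rng.standard_normal(40)

def kfold_mse(x, y, degree, k=5):
    """Estimate test MSE of a degree-`degree` polynomial fit via k-fold CV."""
    folds = np.array_split(np.arange(len(x)), k)
    errs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        coeffs = np.polyfit(x[train], y[train], degree)
        errs.append(np.mean((np.polyval(coeffs, x[test]) - y[test]) ** 2))
    return float(np.mean(errs))
```

On this data a degree-3 fit attains a lower cross-validated error than a degree-1 fit, since a line badly underfits sin(πx).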
Probabilistic Linear Regression
Explores probabilistic linear regression, covering joint and conditional probability, ridge regression, and overfitting mitigation.
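Ridge regression, mentioned here as an overfitting mitigation, has a closed form: the penalty λ adds λI to the normal equations. A minimal sketch (the data and λ value are assumptions, not the lecture's example):

```python
import numpy as np

def ridge_fit(X, y, lam=0.1):
    """Closed-form ridge regression: w = (X^T X + lam * I)^(-1) X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.standard_normal(100)
w = ridge_fit(X, y)
```

In the probabilistic view, this is the MAP estimate under a Gaussian prior on the weights, with λ set by the ratio of noise variance to prior variance.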
Statistical Learning: Fundamentals
Introduces the fundamentals of statistical learning, covering supervised learning, decision theory, risk minimization, and overfitting.
Supervised Learning: Regression Methods
Explores supervised learning with a focus on regression methods, including model fitting, regularization, model selection, and performance evaluation.
Error Decomposition and Regression Methods
Covers error decomposition, polynomial regression, and k-nearest neighbors for flexible modeling and non-linear predictions.
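The k-nearest-neighbors regression mentioned here needs no training step: a prediction is the average target of the k closest training points. A one-dimensional sketch (the toy quadratic target is an illustrative assumption):

```python
import numpy as np

def knn_regress(x_train, y_train, x_query, k=3):
    """Predict by averaging the targets of the k nearest training points."""
    preds = []
    for xq in np.atleast_1d(x_query):
        nearest = np.argsort(np.abs(x_train - xq))[:k]
        preds.append(y_train[nearest].mean())
    return np.array(preds)

x_train = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y_train = x_train ** 2  # a non-linear target
pred = knn_regress(x_train, y_train, 2.0, k=3)
```

The choice of k controls flexibility: small k tracks the data closely (low bias, high variance), large k smooths toward the global mean.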
Introduction to Machine Learning: Linear Models
Introduces linear models for supervised learning, covering overfitting, regularization, and kernels, with applications in machine learning tasks.
Optimization: Gradient Descent and Subgradients
Explores optimization methods like gradient descent and subgradients for training machine learning models, including advanced techniques like Adam optimization.
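The Adam update mentioned here combines exponentially decayed first- and second-moment estimates of the gradient with bias correction. A sketch of a single step, applied to a hypothetical one-dimensional quadratic (the learning rate and loop length are arbitrary assumptions):

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update with bias-corrected moment estimates."""
    m = b1 * m + (1 - b1) * grad          # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad ** 2     # second-moment (uncentered variance)
    m_hat = m / (1 - b1 ** t)             # bias correction for initialization at 0
    v_hat = v / (1 - b2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Minimize f(w) = (w - 4)^2, whose gradient is 2 * (w - 4)
w, m, v = 0.0, 0.0, 0.0
for t in range(1, 3001):
    w, m, v = adam_step(w, 2.0 * (w - 4.0), m, v, t)
```

Dividing by the root of the second moment makes the effective step size roughly lr regardless of gradient scale, which is why Adam needs little per-problem tuning.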
Polynomial Regression: Overview
Covers polynomial regression, the impact of model flexibility, and the trade-off between underfitting and overfitting.
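The flexibility trade-off can be seen directly: because higher-degree polynomial bases nest the lower-degree ones, training error only decreases with degree, even as test error eventually rises. A sketch on assumed synthetic sin(πx) data (only the training side is demonstrated here):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 30)
y = np.sin(np.pi * x) + 0.1 * rng.standard_normal(30)

def train_mse(degree):
    """Training error of a least-squares polynomial fit of the given degree."""
    coeffs = np.polyfit(x, y, degree)
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))
```

A degree-1 fit underfits the sine wave (large training error), while degree 9 drives training error well below it; held-out data, as in the cross-validation lecture above, is needed to detect when the extra flexibility is merely fitting noise.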