Lecture

Optimization Problems: Gradient Descent Algorithm

Description

This lecture introduces a general formulation for optimization problems that applies to linear regression, logistic regression, and support vector machines. By mapping these machine learning problems to this formulation, one can use the gradient descent algorithm, which iteratively steps in the direction opposite the gradient until it converges to a (local) minimum of the objective function.
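The iterative update described above can be sketched in a few lines of code. This is a minimal illustration, not the lecture's own implementation: it assumes a least-squares linear regression objective L(w) = ||Xw − y||² / (2n), whose gradient is Xᵀ(Xw − y)/n, and a fixed learning rate.

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, n_steps=100):
    """Repeatedly step opposite the gradient of an objective function."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x - lr * grad(x)  # move against the gradient
    return x

# Illustrative example: least-squares linear regression.
# The loss L(w) = ||Xw - y||^2 / (2n) has gradient X.T @ (Xw - y) / n.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([2.0, -1.0])
y = X @ true_w

grad = lambda w: X.T @ (X @ w - y) / len(y)
w = gradient_descent(grad, np.zeros(2), lr=0.5, n_steps=200)
print(w)  # converges toward true_w
```

The same `gradient_descent` routine works for logistic regression or SVMs once the corresponding gradient function is supplied; the learning rate and step count here are illustrative choices, and in practice they must be tuned to the problem.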

In MOOC
IoT Systems and Industrial Applications with Design Thinking
The first MOOC to provide a comprehensive introduction to the Internet of Things (IoT), including the fundamental business aspects needed to define IoT-related products.
Instructors (2)
Related lectures (32)
Logistic Regression: Vegetation Prediction
Explores logistic regression for predicting vegetation proportions in the Amazon region through remote sensing data analysis.
Feature Engineering: Polynomial Regression
Covers fitting linear regression on features of the original predictors for flexible feature representation.
Gradient Descent: Linear Regression
Covers the concept of gradient descent for linear regression, explaining the iterative process of updating parameters.
Optimization in Machine Learning: Gradient Descent
Covers optimization in machine learning, focusing on gradient descent for linear and logistic regression, stochastic gradient descent, and practical considerations.
Supervised Learning: Linear Regression
Covers supervised learning with a focus on linear regression, including topics like digit classification, spam detection, and wind speed prediction.
