Optimizing Loss Functions: Gradient Descent Variants
Explores optimization methods like gradient descent and subgradients for training machine learning models, including advanced techniques like Adam optimization.
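To make the contrast between plain gradient descent and Adam concrete, here is a minimal sketch of the two update rules. The learning rates and beta/epsilon defaults are standard textbook choices, not values taken from the lecture.

```python
import numpy as np

def gradient_descent_step(w, grad, lr=0.1):
    """Plain gradient descent: step against the gradient."""
    return w - lr * grad

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: bias-corrected running averages of the gradient (m)
    and of its elementwise square (v) rescale each coordinate's step."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)   # bias correction (t counts steps from 1)
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```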
Covers optimization in machine learning, focusing on gradient descent for linear and logistic regression, stochastic gradient descent, and practical considerations.
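As an illustration of stochastic gradient descent in this setting, the sketch below trains a logistic regression classifier one example at a time. The function name, learning rate, and epoch count are illustrative assumptions, not part of the course material.

```python
import numpy as np

def sgd_logistic_regression(X, y, lr=0.1, epochs=10, seed=0):
    """SGD on the logistic (cross-entropy) loss.
    X: (n, d) feature matrix, y: (n,) labels in {0, 1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):                # one example per update
            p = 1.0 / (1.0 + np.exp(-X[i] @ w))     # sigmoid prediction
            w -= lr * (p - y[i]) * X[i]             # per-sample gradient step
    return w
```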
Explores adversarial machine learning, covering the generation of adversarial examples, robustness challenges, and techniques like the Fast Gradient Sign Method.
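For intuition, here is a minimal sketch of the Fast Gradient Sign Method applied to a simple logistic model; the choice of model is an assumption made to keep the example self-contained, whereas the lecture considers adversarial examples for general classifiers.

```python
import numpy as np

def fgsm_example(w, x, y, eps=0.1):
    """Fast Gradient Sign Method on a logistic model p = sigmoid(w . x):
    take one signed step along the gradient of the loss w.r.t. the INPUT,
    which maximally increases the loss inside an L-infinity ball of radius eps."""
    p = 1.0 / (1.0 + np.exp(-w @ x))
    grad_x = (p - y) * w               # d(cross-entropy)/dx for this model
    return x + eps * np.sign(grad_x)   # perturbed (adversarial) input
```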