Lecture

Optimization Trade-offs: Variance Reduction and Statistical Dimension

Description

This lecture covers time–data trade-offs in optimization, focusing on variance-reduction techniques and the statistical dimension of convex cones. It discusses conditions for exact recovery in the noiseless case, properties of the statistical dimension, numerical results, and optimization algorithms such as gradient descent (GD) and stochastic gradient descent (SGD). It also examines how sample size affects convergence rates and statistical error, covers variance-reduction methods such as mini-batch SGD and SVRG, and concludes with convergence analysis and complexity considerations for the different methods.
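As a rough illustration of the variance-reduction idea mentioned above, the following is a minimal sketch of SVRG (stochastic variance reduced gradient) on a synthetic least-squares problem. This is not the lecture's own code; the problem setup, step size, and loop counts are illustrative assumptions. SVRG periodically computes a full gradient at a snapshot point and uses it to correct each stochastic gradient, which reduces the variance of the updates and allows a constant step size.

```python
# Illustrative sketch of SVRG on least squares (not from the lecture).
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
A /= np.linalg.norm(A, axis=1, keepdims=True)  # unit-norm rows -> per-sample smoothness L = 1
w_true = rng.standard_normal(d)
b = A @ w_true                                 # noiseless targets

def grad_i(w, i):
    # Gradient of the i-th term 0.5 * (a_i^T w - b_i)^2
    return A[i] * (A[i] @ w - b[i])

def full_grad(w):
    # Gradient of the average loss (1/n) * sum_i 0.5 * (a_i^T w - b_i)^2
    return A.T @ (A @ w - b) / n

def svrg(epochs=30, inner=500, eta=0.1):
    w = np.zeros(d)
    for _ in range(epochs):
        w_snap = w.copy()
        mu = full_grad(w_snap)                 # full gradient at the snapshot
        for _ in range(inner):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient: unbiased, and its
            # variance shrinks as w and w_snap approach the optimum.
            g = grad_i(w, i) - grad_i(w_snap, i) + mu
            w = w - eta * g
    return w

w_hat = svrg()
```

In the noiseless setting above, `w_hat` converges to `w_true` at a linear rate, whereas plain SGD with a constant step size would stall at a noise floor determined by the gradient variance.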

