Given a sequence of functions $f_1,\dots,f_n$ with $f_i:\mathcal{D}\mapsto \mathbb{R}$, finite-sum minimization seeks a point ${x}^\star \in \mathcal{D}$ minimizing $\sum_{j=1}^n f_j(x)/n$. In this work, we propose a key twist on finite-sum minimization, dubbed \textit{continual finite-sum minimization}, that asks for a sequence of points ${x}_1^\star,\dots,{x}_n^\star \in \mathcal{D}$ such that each ${x}_i^\star \in \mathcal{D}$ minimizes the prefix-sum $\sum_{j=1}^i f_j(x)/i$. Assuming that each prefix-sum is strongly convex, we develop a first-order continual stochastic variance reduction gradient method ({\small \sc{CSVRG}}) producing an $\epsilon$-optimal sequence with $\tilde{\mathcal{O}}(n/\epsilon^{1/3} + 1/\sqrt{\epsilon})$ overall \textit{first-order oracles} (FO). An FO corresponds to the computation of a single gradient $\nabla f_j(x)$ at a given $x \in \mathcal{D}$ for some $j \in [n]$. Our approach significantly improves upon the $\mathcal{O}(n/\epsilon)$ FOs that $\mathrm{StochasticGradientDescent}$ requires and the $\mathcal{O}(n^2 \log(1/\epsilon))$ FOs that state-of-the-art variance reduction methods such as $\mathrm{Katyusha}$ require. We also prove that there is no natural first-order method with $\mathcal{O}(n/\epsilon^\alpha)$ gradient complexity for $\alpha < 1/4$, establishing that the first-order complexity of our method is nearly tight.
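To make the continual setting concrete, the following minimal sketch (an illustration only, \textit{not} the {\small \sc{CSVRG}} method) instantiates the problem on toy 1-D quadratics $f_j(x) = (x-a_j)^2/2$, so each prefix-sum is $1$-strongly convex, and counts FO calls for a naive warm-started gradient-descent baseline. The helper \texttt{grad\_fj} and all constants are assumptions of this example.

\begin{verbatim}
import numpy as np

# Toy instance: f_j(x) = (x - a_j)^2 / 2, so the prefix-sum
# F_i(x) = sum_{j<=i} f_j(x)/i is 1-strongly convex with
# minimizer mean(a_1..a_i).  This is a naive baseline, not CSVRG:
# it runs full-gradient descent on each prefix, warm-started at
# the previous iterate, and counts first-order oracle (FO) calls.

rng = np.random.default_rng(0)
n = 50
a = rng.normal(size=n)          # data defining each f_j

def grad_fj(x, j):
    """One FO: the gradient of f_j at x."""
    return x - a[j]

eps = 1e-6
x = 0.0
fo_calls = 0
for i in range(1, n + 1):
    # Gradient descent on F_i until eps-optimal.
    while True:
        g = sum(grad_fj(x, j) for j in range(i)) / i
        fo_calls += i           # one FO per component gradient
        if 0.5 * g * g < eps:   # for 1-strongly convex F_i,
            break               # ||grad||^2 / 2 bounds suboptimality
        x -= g                  # step size 1 = 1/smoothness here
    # x is now an eps-optimal minimizer of the i-th prefix-sum

print(f"FO calls for the naive baseline: {fo_calls}")  # grows ~ n^2
\end{verbatim}

Even on this easy instance the restart-per-prefix baseline spends $\Theta(n)$ FOs on the $i$-th prefix, for $\Theta(n^2)$ FOs overall, which is the quadratic-in-$n$ behavior that the $\tilde{\mathcal{O}}(n/\epsilon^{1/3} + 1/\sqrt{\epsilon})$ guarantee above avoids.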