We consider the problem of finding a saddle point for the convex-concave objective $\min_x \max_y f(x) + \langle Ax, y \rangle - g^*(y)$, where $f$ is a convex function with locally Lipschitz gradient and $g$ is convex and possibly non-smooth. We propose an adaptive version of the Condat-Vũ algorithm, which alternates between primal gradient steps and dual proximal steps. The method achieves stepsize adaptivity through a simple rule involving $\|A\|$ and the norm of recently computed gradients of $f$. Under standard assumptions, we prove an ergodic $\mathcal{O}(1/k)$ convergence rate. Furthermore, when $f$ is also locally strongly convex and $A$ has full row rank, we show that our method converges with a linear rate. Numerical experiments illustrate the practical performance of the algorithm.
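To make the alternating structure concrete, here is a minimal sketch of a Condat-Vũ-style iteration in Python. The adaptive stepsize rule shown (a local Lipschitz estimate built from the two most recent gradients, combined with $\|A\|$ and a mild growth factor) is an illustrative stand-in assumed for this sketch, not the exact rule analyzed in the paper; the function and parameter names are likewise hypothetical.

```python
import numpy as np

def adaptive_condat_vu(grad_f, prox_g_conj, A, x0, y0, iters=5000, tau0=0.1):
    """Condat-Vu-style primal-dual iteration with a heuristic adaptive stepsize.

    Targets saddle points of f(x) + <Ax, y> - g*(y).  The stepsize rule below
    is an illustrative local-smoothness heuristic, not the paper's exact rule.
    """
    norm_A = np.linalg.norm(A, 2)              # spectral norm ||A||
    x, y = x0.astype(float).copy(), y0.astype(float).copy()
    g = grad_f(x)
    tau = tau0
    for _ in range(iters):
        # primal gradient step on f(x) + <Ax, y>
        x_new = x - tau * (g + A.T @ y)
        g_new = grad_f(x_new)
        # local Lipschitz estimate from the two most recent gradients
        dx, dg = x_new - x, g_new - g
        nx = np.linalg.norm(dx)
        L_est = np.linalg.norm(dg) / nx if nx > 0 else 0.0
        # dual proximal step with primal extrapolation (2*x_new - x);
        # sigma chosen so that sigma * tau * ||A||^2 = 1/2
        sigma = 1.0 / (2.0 * tau * norm_A**2)
        y = prox_g_conj(y + sigma * (A @ (2.0 * x_new - x)), sigma)
        # adapt tau: grow slowly, but respect the estimated local curvature
        tau = min(1.05 * tau, 1.0 / (2.0 * (L_est + norm_A) + 1e-12))
        x, g = x_new, g_new
    return x, y

# Example: min_x 0.5*||x - b||^2 + ||x||_1, i.e. f(x) = 0.5*||x - b||^2,
# g = l1-norm, A = I.  Then g* is the indicator of the l-infinity unit
# ball, so prox_{s g*} is clipping to [-1, 1], and the solution is the
# soft-thresholding of b.
b = np.array([3.0, 0.2, -2.0])
A = np.eye(3)
prox = lambda y, s: np.clip(y, -1.0, 1.0)
x, _ = adaptive_condat_vu(lambda x: x - b, prox, A, np.zeros(3), np.zeros(3))
# x approaches [2, 0, -1], the soft-thresholding of b
```

In the example, the primal iterate converges to the soft-thresholded vector $[2, 0, -1]$, matching the closed-form minimizer of the $\ell_1$-regularized least-squares problem.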