We develop rigorous alternating direction optimization methods for a prototype constrained convex optimization template, which has broad applications in the computational sciences. We build upon our earlier work on the model-based gap reduction (MGR) technique, which revolves around a smoothed estimate of the primal-dual gap. MGR allows us to simultaneously update a sequence of primal and dual variables as well as primal and dual smoothness parameters, so that the smoothed gap function converges to the true gap, which in turn converges to zero, both at optimal rates. In contrast, this paper introduces a new split-gap reduction (SGR) technique as a natural counterpart of MGR, designed to take advantage of the additional splitting structures present in the prototype template. We illustrate the SGR technique using the forward-backward and Douglas-Rachford splittings on the smoothed gap function and derive new alternating direction methods. The new methods attain optimal convergence rates without heuristics and eliminate the infamous penalty-parameter tuning issue of existing alternating direction methods. Finally, we verify the performance of our methods against the existing state-of-the-art, and validate the new theoretical performance bounds, via numerical examples.
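For background on the splittings named in the abstract, the following is a minimal sketch of the generic Douglas-Rachford iteration on a hypothetical toy composite problem (a quadratic term plus an l1 term). The objective, the function names prox_f and prox_g, and all parameter values are illustrative assumptions for exposition; this is the textbook splitting scheme, not the paper's SGR-based alternating direction method or its smoothed gap function.

```python
import numpy as np

# Toy composite problem:
#   min_x  f(x) + g(x),  with  f(x) = 0.5*||x - a||^2  and  g(x) = lam*||x||_1.
# Hypothetical example only; not the paper's method.

def prox_f(y, gamma, a):
    # prox of gamma*f at y: argmin_x 0.5*||x - a||^2 + (1/(2*gamma))*||x - y||^2
    return (y + gamma * a) / (1.0 + gamma)

def prox_g(y, gamma, lam):
    # prox of gamma*g at y: soft-thresholding with threshold gamma*lam
    return np.sign(y) * np.maximum(np.abs(y) - gamma * lam, 0.0)

def douglas_rachford(a, lam, gamma=1.0, iters=200):
    y = np.zeros_like(a)
    for _ in range(iters):
        x = prox_f(y, gamma, a)              # resolvent step on f
        z = prox_g(2.0 * x - y, gamma, lam)  # reflected resolvent step on g
        y = y + (z - x)                      # update of the governing sequence
    return x                                 # x converges to a minimizer of f + g

a = np.array([3.0, -0.2, 1.5])
print(douglas_rachford(a, lam=1.0))  # expected: soft-thresholded solution [2., 0., 0.5]
```

For this toy objective the minimizer is the soft-thresholding of a at level lam, which gives a quick sanity check; the standard iteration converges for any gamma > 0, whereas the fixed penalty parameter in classical alternating direction methods is exactly the tuning burden the abstract says SGR removes.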