We consider decentralized stochastic optimization with the objective function (e.g. data samples for machine learning tasks) being distributed over n machines that can only communicate to their neighbors on a fixed communication graph. To reduce the communication bottleneck, the nodes compress (e.g. quantize or sparsify) their model updates. We cover both unbiased and biased compression operators with quality denoted by \omega \leq 1 (\omega = 1 meaning no compression). We (i) propose a novel gossip-based stochastic gradient descent algorithm, CHOCO-SGD, that converges at rate O(1/(nT) + 1/(T \delta^2 \omega)^2) for strongly convex objectives, where T denotes the number of iterations and \delta the eigengap of the connectivity matrix. Despite compression quality and network connectivity affecting the higher-order terms, the first term in the rate, O(1/(nT)), is the same as for the centralized baseline with exact communication. We (ii) present a novel gossip algorithm, CHOCO-GOSSIP, for the average consensus problem that converges in time O(1/(\delta^2 \omega) \log(1/\epsilon)) for accuracy \epsilon > 0. This is, to the best of our knowledge, the first gossip algorithm that supports arbitrary compressed messages for \omega > 0 and still exhibits linear convergence. We (iii) show in experiments that both of our algorithms outperform the respective state-of-the-art baselines, and that CHOCO-SGD can reduce communication by at least two orders of magnitude.
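To make the compressed-gossip idea concrete, below is a minimal NumPy sketch of a CHOCO-GOSSIP-style consensus loop with a top-k sparsification compressor. The ring topology, the step size gamma, and the top_k helper are illustrative assumptions chosen for the demo, not the paper's exact experimental setup; each node keeps a "public" copy x_hat of its value that neighbors hold in sync, and only a compressed correction to that copy is ever transmitted.

import numpy as np

def top_k(v, k):
    # keep the k largest-magnitude entries of v, zero out the rest
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def choco_gossip(x, W, gamma, k, steps):
    # x: (n, d) array, row i is node i's value
    # W: (n, n) symmetric, doubly stochastic mixing matrix of the graph
    x = x.copy()
    x_hat = np.zeros_like(x)  # public copies, kept identical on all neighbors
    for _ in range(steps):
        for i in range(x.shape[0]):
            # each node broadcasts only a compressed correction to its public copy
            x_hat[i] += top_k(x[i] - x_hat[i], k)
        # gossip step on the public copies; (W - I) preserves the average of x
        x += gamma * (W - np.eye(W.shape[0])) @ x_hat
    return x

n = 5  # ring of 5 nodes
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.25

rng = np.random.default_rng(0)
x0 = rng.normal(size=(n, 4))
x = choco_gossip(x0, W, gamma=0.02, k=1, steps=5000)
print(np.abs(x - x0.mean(axis=0)).max())  # small: every node is near the initial mean

Because the rows of W sum to one, the correction term gamma * (W - I) @ x_hat leaves the network average of x unchanged at every step, so the nodes can only converge to the mean of the initial values; the conservative step size here trades speed for stability under aggressive (k = 1) sparsification.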