We address the problem of learning a classifier from data distributed over a number of arbitrarily connected machines, without exchanging the data points themselves. Our goal is to train a neural network at each machine as if the entire dataset were available locally. This is accomplished by exploiting the so-called consensus algorithm for scalar values distributed over a network. We describe an abstract framework for consensus learning and derive a distributed version of the multilayer feed-forward neural network with back-propagation and early stopping. Tests show that, with careful parameter selection, our method performs comparably to the non-distributed one. The trade-off is that the total computational effort across all machines is larger.
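The abstract does not spell out the consensus update itself, but the standard scalar version has each machine repeatedly average its value with those of its neighbours until all machines agree on the global mean. Below is a minimal sketch in Python, assuming Metropolis-Hastings weights (one common choice; the paper may use a different scheme). Names such as `metropolis_weights` and `consensus` are illustrative, not taken from the paper.

```python
import numpy as np

def metropolis_weights(adjacency):
    """Build a symmetric, doubly stochastic weight matrix from a 0/1
    adjacency matrix (Metropolis-Hastings rule)."""
    n = adjacency.shape[0]
    degrees = adjacency.sum(axis=1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if adjacency[i, j]:
                W[i, j] = 1.0 / (1.0 + max(degrees[i], degrees[j]))
        W[i, i] = 1.0 - W[i].sum()  # self-weight keeps rows summing to 1
    return W

def consensus(values, adjacency, num_iters=100):
    """Each machine repeatedly replaces its scalar with a weighted average
    of its neighbours' scalars; for a connected graph all values converge
    to the global mean without any machine seeing all the data."""
    W = metropolis_weights(adjacency)
    x = np.asarray(values, dtype=float)
    for _ in range(num_iters):
        x = W @ x  # one communication round: weighted neighbour averaging
    return x

# Example: 4 machines on a ring, each holding one local scalar
# (e.g. a locally computed gradient component).
ring = np.array([[0, 1, 0, 1],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [1, 0, 1, 0]])
print(consensus([1.0, 2.0, 3.0, 4.0], ring))  # -> approx [2.5, 2.5, 2.5, 2.5]
```

In the learning setting described above, each component of a local gradient or weight update would presumably be passed through such a consensus step, so that every machine effectively trains on statistics of the full dataset.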
The capabilities of deep learning systems have advanced much faster than our ability to understand them. Whilst the gains from deep neural networks (DNNs) are significant, they are accompanied by a risk of bad outcomes that is growing in both likelihood and gravity. This is tr ...
Volkan Cevher, Grigorios Chrysos, Fanghui Liu, Zhenyu Zhu
Wulfram Gerstner, Stanislaw Andrzej Wozniak, Ana Stanojevic, Giovanni Cherubini, Angeliki Pantazi