Ask any question about EPFL courses, lectures, exercises, research, news, etc., or try the example questions below.
DISCLAIMER: The Graph Chatbot is not designed to give explicit or categorical answers to your questions. Instead, it transforms your questions into API requests that are distributed across the various IT services officially administered by EPFL. Its sole purpose is to collect and recommend relevant references to content that you can explore to help you answer your questions.
The present application concerns a computer-implemented method for training a machine learning model in a distributed fashion using Stochastic Gradient Descent (SGD), wherein the method is performed by a first computer in a distributed computing environmen ...
2019
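As a rough illustration of the setting this abstract describes, the sketch below shows one synchronous round of distributed SGD in which each worker computes a gradient on its local data shard and a coordinator averages the gradients before updating the model. The linear least-squares model, the shard layout, and all function names are hypothetical assumptions for illustration and are not taken from the patent itself.

import numpy as np

def local_gradient(w, X, y):
    # Least-squares gradient computed by one worker on its local shard.
    return 2.0 / len(X) * X.T @ (X @ w - y)

def sgd_round(w, shards, lr=0.1):
    # Coordinator step: average the workers' gradients, then update w.
    grads = [local_gradient(w, X, y) for X, y in shards]
    return w - lr * np.mean(grads, axis=0)

# Toy usage: two workers, each holding a small shard of synthetic data.
rng = np.random.default_rng(0)
w = np.zeros(3)
shards = [(rng.normal(size=(8, 3)), rng.normal(size=8)) for _ in range(2)]
for _ in range(50):
    w = sgd_round(w, shards)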
Shared objects are the means by which processes gather and exchange information about the state of a distributed system. Objects that disclose more information about the system are therefore more desirable. In this paper, we propose the schedule reconstruc ...
2017
We report on Krum, the first provably Byzantine-tolerant aggregation rule for distributed Stochastic Gradient Descent (SGD). Krum guarantees the convergence of SGD even in a distributed setting where (asymptotically) up to half of the workers ...
ACM, 2017
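For readers who want the gist of the aggregation rule, here is a minimal NumPy sketch of Krum as it is usually described: each worker's gradient receives a score equal to the sum of squared distances to its n - f - 2 closest peers, and the gradient with the lowest score is selected. This is an illustrative re-implementation under those assumptions, not the authors' reference code; the function name and interface are made up for the example.

import numpy as np

def krum(gradients, f):
    # gradients: list of n worker gradient vectors (np.ndarray)
    # f: assumed upper bound on the number of Byzantine workers
    n = len(gradients)
    assert n > 2 * f + 2, "Krum requires n > 2f + 2 workers"
    # Pairwise squared Euclidean distances between worker gradients.
    dists = np.array([[np.sum((g_i - g_j) ** 2) for g_j in gradients]
                      for g_i in gradients])
    scores = []
    for i in range(n):
        # Distances from worker i to all other workers, sorted ascending.
        others = np.sort(np.delete(dists[i], i))
        # Score: sum of distances to the n - f - 2 closest neighbours.
        scores.append(others[: n - f - 2].sum())
    # Return the gradient with the smallest score.
    return gradients[int(np.argmin(scores))]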
Shared objects are the means by which processes gather and exchange information about the state of a distributed system. Objects that disclose more information about the system—and thus provide a more centralized view—are therefore more desirable. In this ...