A computer-implemented method and a distributed computer system (100) for privacy-preserving distributed training of a global neural network model on distributed datasets (DS1 to DSn). The system has a plurality of communicatively coupled data providers (DP1 to DPn). Each data provider has a respective local training dataset (DS1 to DSn) and a vector of output labels (OL1 to OLn) for training the global model. Each data provider further holds a portion of a cryptographic distributed secret key (SK1 to SKn) and the corresponding collective cryptographic public key (CPK) of a multiparty fully homomorphic encryption scheme, the weights of the global model being encrypted with the collective public key. Each data provider (DP1) computes and aggregates, for each layer of the global model, encrypted local gradients (LG1) using its respective local training dataset (DS1) and output labels (OL1), performing the forward pass and backpropagation with stochastic gradient descent. At least one data provider homomorphically combines at least a subset of the current local gradients of at least a subset of the data providers into combined local gradients, and updates the weights of the current global model (GM) based on the combined local gradients.
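The abstract packs the whole protocol into one paragraph; the sketch below unpacks a single training round in Python. It is a minimal illustration under stated assumptions, not the patented implementation: the `Ciphertext` class, `encrypt`, and the one-layer linear model are hypothetical stand-ins introduced only for this example. A real deployment would use a multiparty (threshold) fully homomorphic scheme such as a CKKS variant, in which each provider holds a key share SKi, decryption requires the providers' collaboration, and the forward and backward passes are themselves evaluated homomorphically on ciphertexts.

```python
import numpy as np

class Ciphertext:
    """Stand-in for a ciphertext under the collective public key (CPK).

    Hypothetical: a real multiparty FHE scheme would store encrypted
    polynomials here, not a plain numpy array.
    """
    def __init__(self, data):
        self.data = np.asarray(data, dtype=float)

    def __add__(self, other):          # homomorphic ciphertext addition
        return Ciphertext(self.data + other.data)

    def scale(self, s):                # homomorphic plaintext multiplication
        return Ciphertext(self.data * s)

def encrypt(x):
    """Hypothetical Enc(CPK, x); no single party can decrypt the result."""
    return Ciphertext(x)

class DataProvider:
    """One of DP1..DPn, holding its local dataset DSi and labels OLi."""
    def __init__(self, X, y):
        self.X, self.y = X, y

    def local_gradient(self, enc_w):
        # In the protocol, the forward pass and backpropagation run
        # homomorphically on the encrypted weights; the transparent
        # stand-in lets us read enc_w.data directly. A single linear
        # layer with squared loss is the simplest instance of the
        # per-layer gradient computation the abstract describes.
        w = enc_w.data
        residual = self.X @ w - self.y
        grad = 2.0 * self.X.T @ residual / len(self.y)
        return encrypt(grad)           # LGi leaves the provider encrypted

def training_round(providers, enc_w, lr=0.1):
    """One data provider homomorphically combines the encrypted local
    gradients LG1..LGn and updates the encrypted global weights."""
    combined = providers[0].local_gradient(enc_w)
    for dp in providers[1:]:
        combined = combined + dp.local_gradient(enc_w)   # homomorphic sum
    combined = combined.scale(1.0 / len(providers))      # average gradients
    return enc_w + combined.scale(-lr)                   # SGD weight update

# Usage: three providers jointly fit a toy model without sharing data.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])
providers = []
for _ in range(3):                     # DP1..DP3
    X = rng.normal(size=(50, 2))
    providers.append(DataProvider(X, X @ w_true))
enc_w = encrypt(np.zeros(2))           # encrypted global model (GM)
for _ in range(100):
    enc_w = training_round(providers, enc_w)
# Decrypting enc_w would require the collaboration of the key-share
# holders (SK1..SKn); here we peek at the stand-in's payload.
print(enc_w.data)                      # converges toward [1.0, -2.0]
```

Having a single provider act as the combiner mirrors the claim that "at least one data provider" performs the homomorphic combination; the "at least a subset" wording likewise permits combining gradients from only some providers in a given round.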