A key aspect of constructing highly scalable deep-learning microelectronic systems is implementing fault tolerance in the learning sequence. We perform error-injection analyses on memory using a custom hardware model that implements parallelized restricted Boltzmann machines (RBMs). The results confirm that the RBMs in Deep Belief Networks (DBNs) provide remarkable robustness against memory errors. Fine-tuning significantly recovers accuracy after static errors, at either cell level or block level, are injected into the structural data of the RBMs during and after learning. The memory-error tolerance is observable using our hardware networks with fine-grained memory distribution.
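As a minimal sketch of the error-injection idea described above: static memory errors can be emulated in software by flipping random bits in the stored representation of RBM weights. The function name, float32 storage format, and error-rate parameter below are illustrative assumptions, not the paper's actual hardware model.

```python
import random
import struct

def inject_bit_flips(weights, error_rate, seed=0):
    """Flip random bits in float32 weight storage to emulate static memory errors.

    Hypothetical helper: the actual hardware model, memory layout, and error
    model used in the study are not specified here.
    """
    rng = random.Random(seed)
    corrupted = []
    for w in weights:
        # Reinterpret the float as its 32-bit stored pattern.
        bits = struct.unpack("<I", struct.pack("<f", w))[0]
        for b in range(32):
            if rng.random() < error_rate:
                bits ^= 1 << b  # flip one bit of the stored weight
        # Reinterpret the (possibly corrupted) bits as a float again.
        corrupted.append(struct.unpack("<f", struct.pack("<I", bits))[0])
    return corrupted

weights = [0.5, -0.25, 1.0, 0.0]
noisy = inject_bit_flips(weights, error_rate=0.01)
```

Robustness could then be assessed by comparing classification accuracy of the network before and after corruption, with and without a fine-tuning pass.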
Josephine Anna Eleanor Hughes, Kai Christian Junge
Barbara Bruno, Jauwairia Nasir