Phase-Change Memory Models for Deep Learning Training and Inference
The way our brain learns to disentangle complex signals into unambiguous concepts is fascinating but remains largely unknown. There is evidence, however, that hierarchical neural representations play a key role in the cortex. This thesis investigates biolo ...
Utilization of edge devices has exploded in the last decade, with use cases such as wearable devices, autonomous driving, and smart homes. As their ubiquity grows, so do expectations of their capabilities. Simultaneously, their form factor and use cases lim ...
The success of deep learning may be attributed in large part to remarkable growth in the size and complexity of deep neural networks. However, present learning systems raise significant efficiency and privacy concerns: (1) currently, training systems are l ...
In this paper, we trace the history of neural networks applied to natural language understanding tasks, and identify key contributions which the nature of language has made to the development of neural network architectures. We focus on the importance of v ...
The relationship between simulated ion cyclotron emission (ICE) signals s and the corresponding 1D velocity distribution function f(υ⊥) of the fast ions triggering the ICE is modeled using a two-layer deep neural network. The network ...
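To make the setup above concrete, here is a minimal sketch, in PyTorch, of a two-layer fully connected network regressing a discretized f(υ⊥) from an ICE signal spectrum. The layer widths, bin counts, and random data are illustrative assumptions, not the configuration used in that work.

```python
# Minimal sketch (assumed shapes and hyperparameters, not the paper's setup):
# a two-layer fully connected network mapping an ICE signal spectrum to a
# discretized 1D fast-ion velocity distribution f(v_perp).
import torch
import torch.nn as nn

N_SPECTRUM = 256   # hypothetical number of ICE spectrum bins
N_VELOCITY = 64    # hypothetical number of v_perp bins in f(v_perp)

model = nn.Sequential(
    nn.Linear(N_SPECTRUM, 128),  # first (hidden) layer
    nn.ReLU(),
    nn.Linear(128, N_VELOCITY),  # second (output) layer predicting f(v_perp)
)

# Dummy training loop on random data, standing in for simulated ICE signals.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
s = torch.randn(32, N_SPECTRUM)        # batch of simulated spectra
f_true = torch.rand(32, N_VELOCITY)    # corresponding target distributions
for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(s), f_true)
    loss.backward()
    optimizer.step()
```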
Highly data-centric AI workloads require that new computing paradigms be adopted, because the performance of traditional CPU- and GPU-based systems is limited by data access and transfer. Training deep neural networks with millions of tunable parameters ta ...
Deep neural networks (DNNs) have revolutionized the field of artificial intelligence and have achieved unprecedented success in cognitive tasks such as image and speech recognition. Training of large DNNs, however, is computationally intensive and this has ...
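Several of the entries above concern training DNNs on analog in-memory hardware such as phase-change memory. As a purely illustrative sketch (simplified assumptions of my own, not a device model taken from these works), the NumPy snippet below trains one linear layer whose weight updates pass through PCM-like nonidealities: quantized conductance steps, programming noise, and saturation.

```python
# Minimal NumPy sketch (simplified assumptions, not a model from the works
# listed here): training a single linear layer whose weights live on analog
# devices with PCM-like nonidealities -- quantized conductance updates,
# additive programming noise, and clipping to a finite conductance range.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 16, 4
W = np.zeros((n_out, n_in))           # analog weights (device conductances)
DELTA = 0.01                          # smallest programmable update step
NOISE = 0.002                         # std of programming noise
W_MAX = 1.0                           # conductance saturation

def pcm_update(W, grad, lr):
    """Apply a gradient step through a crude PCM-like device model."""
    ideal = -lr * grad
    # Updates are applied in discrete steps of size DELTA ...
    quantized = np.round(ideal / DELTA) * DELTA
    # ... corrupted by programming noise ...
    noisy = quantized + NOISE * rng.standard_normal(W.shape)
    # ... and clipped to the available conductance range.
    return np.clip(W + noisy, -W_MAX, W_MAX)

# Toy regression task with random data.
X = rng.standard_normal((128, n_in))
W_true = rng.standard_normal((n_out, n_in))
Y = X @ W_true.T

for _ in range(200):
    pred = X @ W.T
    grad = 2 * (pred - Y).T @ X / len(X)   # gradient of the MSE loss w.r.t. W
    W = pcm_update(W, grad, lr=0.05)
```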
Spiking neural networks (SNNs) are computational models inspired by the brain's ability to naturally encode and process information in the time domain. The added temporal dimension is believed to render them more computationally efficient than the conventio ...
Driven by massive amounts of data and important advances in computational resources, new deep learning systems have achieved outstanding results in a large spectrum of applications. Nevertheless, our current theoretical understanding of the mathematical fo ...
Classically, vision is seen as a cascade of local, feedforward computations. This framework has been tremendously successful, inspiring a wide range of ground-breaking findings in neuroscience and computer vision. Recently, feedforward Convolutional Neural ...