Software Transactional Memory on Relaxed Memory Models
Related publications (32)
Modern data management systems aim to provide both cutting-edge functionality and hardware efficiency. With the advent of AI-driven data processing and the post-Moore's Law era, traditional memory-bound scale-up data management operations face scalability ch ...
We consider the problem of training a neural network to store a set of patterns with maximal noise robustness. A solution, in terms of optimal weights and state update rules, is derived by training each individual neuron to perform either kernel classifica ...
Short-term synaptic plasticity and modulations of the presynaptic vesicle release rate are key components of many working memory models. At the same time, an increasing number of studies suggests a potential role of astrocytes in modulating higher cognitiv ...
The hardware complexity of modern machines makes the design of adequate programming models crucial for jointly ensuring performance, portability, and productivity in high-performance computing (HPC). Sequential task-based programming models paired with adv ...
Recent GPU architectures make available numbers of parallel processing units that exceed by orders of magnitude the ones offered by CPU architectures. Whereas programs written using dataflow programming languages are well suited for programming heterogeneo ...
The formation and storage of memories has been under deep investigation for several decades. Nevertheless, the precise contribution of each brain region involved in this process and the interplay between them across memory consolidation is still largely de ...
Model compression techniques have led to a reduction in the size and number of computations of Deep Learning models. However, techniques such as pruning mostly lack real co-optimization with hardware platforms. For instance, implementing unstructured pru ...
In this thesis, we propose model order reduction techniques for high-dimensional PDEs that preserve structures of the original problems and develop a closure modeling framework leveraging the Mori-Zwanzig formalism and recurrent neural networks. Since high ...
Learning and memory rely on synaptic communication in which intracellular signals are transported to the nucleus to stimulate transcriptional activation. Memory-induced transcriptional increases are accompanied by alterations to the epigenetic landscape an ...
Utilization of edge devices has exploded in the last decade, with such use cases as wearable devices, autonomous driving, and smart homes. As their ubiquity grows, so do expectations of their capabilities. Simultaneously, their form factor and use cases lim ...