Machine learning and data-processing algorithms have thrived at processing and classifying information by exploiting the hidden trends of large datasets. Although these emerging computational methods have become successful in today's world, they rely mathematically on a large number of matrix-vector multiplications (MVMs). The traditional von Neumann architecture struggles to compute these operations efficiently because its computing and storage units are separated. As an alternative, in-memory computing performs arithmetic or logic operations in the physical layer of the memory itself and has shown energy benefits for data-centric applications, since it avoids this communication bottleneck. While many material platforms have been tested for this emerging architecture, none has been able to meet all the requirements for its realization. Two-dimensional (2D) materials, however, have not yet been analyzed in this context. Their extraordinary electrical and mechanical properties motivate the study of this class of materials as a promising platform for realizing in-memory processors. In this regard, this thesis aims to demonstrate memory devices and nanosystems based on monolayer MoS2 targeting in-memory architectures.

The first part of this work consists of fabricating floating-gate memories and analyzing their behavior with respect to the devices' imperfections and advantages. The results of this part highlight their potential as scalable analog memories and memristive devices. Simulations based on the fabricated devices verify the effect of interface traps on device performance. These findings consolidate the use of monolayer MoS2 for scaling memory devices and motivate the exploration of MoS2 memories for building computational systems.

In the second part, small-scale systems are designed using the previously developed memories. Since in-memory systems can be approached from either a logic or an analog-computation perspective, this part explores each system type independently. On the one hand, logic-in-memory systems were developed, enabling nonvolatile, reconfigurable logic with a reduced footprint. On the other hand, for analog computing, the memory elements were studied as computational elements in a one-layer artificial neural network, also called a perceptron. By extrapolating the experimental findings to larger arrays, simulations estimate that MoS2-based in-memory processors could outperform their silicon counterparts in energy efficiency.

These results motivate the final part of this work, which focuses on large-scale integration of 2D-material-based memories. This led to further development of fabrication methods and design methodologies, allowing the production of large-scale systems in a reliable and scalable manner. The first main finding is the co-integration of memory elements with analog, digital, and mixed-signal circuits, a necessary step towards the development of interfaces for in-memory computing. Secondly, a 1024-device MVM processor demonstrates a functional in-memory computing core capable of performing discrete signal processing. Altogether, this thesis concludes that MoS2 is a promising avenue for enhancing the performance of floating-gate memories and for developing the next generation of in-memory processors.
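As a rough illustration of the in-memory MVM principle summarized in the abstract (not the thesis's actual implementation), the sketch below shows how signed perceptron weights could be encoded as differential pairs of programmable conductances so that a memory array computes the matrix-vector product directly via Ohm's law and Kirchhoff's current law. All names and device parameters (conductance window, array size) are hypothetical placeholders chosen only for illustration.

```python
# Conceptual sketch: analog matrix-vector multiplication (MVM) in a memory array.
# Each cell stores a weight as a programmable conductance G[i, j]; applying input
# voltages V[j] to the columns makes each row collect a current
#   I[i] = sum_j G[i, j] * V[j]
# so the MVM happens inside the array itself.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical programmable conductance window of a floating-gate cell (siemens).
G_MIN, G_MAX = 1e-9, 1e-6


def weights_to_conductances(w: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Map signed weights onto a differential pair of positive conductances."""
    w_max = float(np.abs(w).max()) or 1.0
    scale = (G_MAX - G_MIN) / w_max
    g_pos = G_MIN + scale * np.clip(w, 0, None)   # encodes positive part
    g_neg = G_MIN + scale * np.clip(-w, 0, None)  # encodes negative part
    return g_pos, g_neg


def analog_mvm(g_pos: np.ndarray, g_neg: np.ndarray, v_in: np.ndarray) -> np.ndarray:
    """Row-wise current summation: I = (G+ - G-) @ V."""
    return g_pos @ v_in - g_neg @ v_in


# Toy single-layer perceptron: weights stored in the array, inputs applied as
# voltages, output currents read out per row.
W = rng.normal(size=(3, 8))   # 3 outputs, 8 inputs (arbitrary toy shape)
x = rng.normal(size=8)        # input pattern, encoded as voltages

Gp, Gn = weights_to_conductances(W)
i_out = analog_mvm(Gp, Gn, x)

# The array's currents match the digital MVM up to the conductance scaling factor.
scale = (G_MAX - G_MIN) / np.abs(W).max()
print("analog  :", i_out)
print("digital :", W @ x * scale)
```

The differential pair is one common way to represent negative weights with strictly positive conductances; the devices described in the thesis may use a different encoding or read-out scheme.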