Recently, cutting-edge brain-machine interfaces (BMIs) have demonstrated the potential of decoders such as recurrent neural networks (RNNs) to predict attempted handwriting [1] or speech [2], enabling rapid communication recovery after paralysis. However, current BMIs rely on benchtop configurations with resource-intensive computing units, leading to bulkiness and excessive power demands. For clinical translation, BMIs must be realized as miniaturized, implantable systems and achieve high decoding accuracy across a variety of prosthetic tasks. To date, only a handful of systems have reported on-chip decoding for conventional BMI tasks such as finger movement [3–6]. These systems either implement only specific decoder components on chip [3], consume significant power and area [4], rely on power-intensive commercial analog front-ends (AFEs) [5], or lack the bandwidth necessary for more intricate BMI tasks [6]. There remains a gap for a high-channel-count, low-power BMI capable of simultaneous neural recording and motor decoding, especially for rapid restoration of intricate movements such as handwriting. This paper presents a low-power, miniaturized BMI (MiBMI) chipset integrating a 192-ch broadband neural recording AFE and a 512-ch 31-class activity-driven neural decoder utilizing low-dimensional distinctive neural codes (DNCs) for handwritten letter classification.
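To give a rough sense of the decoding idea, the sketch below projects high-channel neural activity onto a low-dimensional code and classifies it among 31 letter classes with a nearest-centroid rule. This is a minimal illustration under assumed dimensions, a random linear projection, and synthetic data; it is not the MiBMI chip's actual DNC extraction or activity-driven decoder, which are described in the paper itself.

```python
import numpy as np

# Illustrative sketch only: linear projection to a low-dimensional code, then
# nearest-centroid classification over 31 letter classes. Channel count, code
# dimension, training rule, and data are assumptions for illustration.

rng = np.random.default_rng(0)
n_channels, n_code, n_classes = 512, 16, 31   # assumed dimensions

# Synthetic "training" data: per-class mean activity patterns plus noise.
class_means = rng.normal(size=(n_classes, n_channels))
X_train = np.vstack([m + 0.5 * rng.normal(size=(20, n_channels)) for m in class_means])
y_train = np.repeat(np.arange(n_classes), 20)

# Assumed low-dimensional projection (random here; a real system would learn it).
W = rng.normal(size=(n_channels, n_code)) / np.sqrt(n_channels)

def encode(x):
    """Project multi-channel activity onto a low-dimensional code."""
    return x @ W

# Class centroids in the low-dimensional code space.
centroids = np.array([encode(X_train[y_train == c]).mean(axis=0)
                      for c in range(n_classes)])

def classify(x):
    """Return the letter class whose centroid is nearest to the encoded sample."""
    d = np.linalg.norm(encode(x) - centroids, axis=1)
    return int(np.argmin(d))

sample = class_means[7] + 0.5 * rng.normal(size=n_channels)
print("predicted class:", classify(sample))
```

The appeal of such low-dimensional codes is that the per-sample classification step reduces to a handful of multiply-accumulate and comparison operations, which is what makes on-chip, low-power decoding plausible at high channel counts.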