Recently, cutting-edge brain-machine interfaces (BMIs) have revealed the potential of decoders such as recurrent neural networks (RNNs) in predicting attempted handwriting [1] or speech [2], enabling rapid communication recovery after paralysis. However, current BMIs rely on benchtop configurations with resource-intensive computing units, leading to bulkiness and excessive power demands. For clinical translation, BMIs must be realized as miniaturized, implantable systems and achieve high decoding accuracy across a variety of prosthetic tasks. To date, only a handful of systems have reported on-chip decoding for conventional BMI tasks such as finger movement [3–6]. These systems either implement only specific decoder components on chip [3], consume significant power and area [4], rely on power-intensive commercial analog front-ends (AFEs) [5], or lack the high bandwidth necessary for more intricate BMI tasks [6]. There remains a gap for a high-channel-count, low-power BMI capable of simultaneous neural recording and motor decoding, especially for rapid restoration of intricate movements like handwriting. This paper presents a low-power, miniaturized BMI (MiBMI) chipset integrating a 192-ch broadband neural recording AFE and a 512-ch 31-class activity-driven neural decoder utilizing low-dimensional distinctive neural codes (DNCs) for handwritten letter classification.
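The abstract does not detail the decoder internals, but the core idea of classifying 31 handwritten-letter classes from many channels via low-dimensional distinctive neural codes can be illustrated with a minimal software sketch. The following Python example is purely illustrative and not the chip's algorithm: it assumes per-trial, per-channel spike-band features, uses a PCA projection as a stand-in for DNC extraction, and a nearest-centroid classifier over 31 classes; all names, dimensions, and data are hypothetical.

```python
import numpy as np

# Illustrative dimensions (assumptions, not taken from the paper):
N_CHANNELS = 512   # decoder input channels
N_CLASSES = 31     # number of handwritten-letter classes
DNC_DIM = 16       # assumed low dimension of the distinctive neural codes

rng = np.random.default_rng(0)

# Hypothetical training data: one feature per channel per trial (synthetic).
X_train = rng.normal(size=(N_CLASSES * 10, N_CHANNELS))  # 10 trials per class
y_train = np.repeat(np.arange(N_CLASSES), 10)

# 1) Learn a low-dimensional projection (PCA via SVD) standing in for the
#    chip's distinctive-neural-code (DNC) extraction.
mean = X_train.mean(axis=0)
Xc = X_train - mean
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
W = Vt[:DNC_DIM].T                          # (channels x DNC_DIM) projection

# 2) Build class templates: mean DNC per class (nearest-centroid classifier).
codes_train = Xc @ W
templates = np.stack([codes_train[y_train == c].mean(axis=0)
                      for c in range(N_CLASSES)])

def classify(features: np.ndarray) -> int:
    """Project one trial's channel features to DNC space and return the
    index of the nearest class template."""
    code = (features - mean) @ W
    return int(np.argmin(np.linalg.norm(templates - code, axis=1)))

# Usage: classify a held-out (synthetic) trial.
print(classify(rng.normal(size=N_CHANNELS)))
```

The point of the sketch is the data-flow, not the specific methods: a fixed projection compresses hundreds of channels into a handful of code dimensions, after which classification among the 31 letter classes is cheap, which is what makes such a decoder plausible within an implantable power budget.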