Myoelectric prostheses allow users to recover lost functionality by controlling a robotic device with their remaining muscle activity. Commercial devices usually use a two-recording-channel system placed on specific muscles to control a single degree of freedom, i.e. one sEMG channel for flexion and one for extension. While intuitive, this approach provides little dexterity. The goal of my thesis is to develop more effective decoding and control algorithms based on wearable systems that record "middle-density" EMG (~60 channels) from the subject's forearm to decode finger movement. By combining deep learning and transfer learning, this approach can also increase performance while reducing training time.

However, even with the best decoding algorithm, amputees are still unable to conform their fingers to the shape of an object. This, in turn, inhibits their ability to secure and adapt their grasp to the requirements of the task. Robotic automation can fill this void by sharing control between the user and the automation, which stabilizes the grasp and makes fine adjustments to the fingers by processing information from tactile sensors placed on the prosthetic hand's fingers.

In the coming months, one amputee subject will receive a fully implanted nerve stimulator in order to obtain sensory feedback from his robotic hand for a period of 6 months. Sensory feedback is an advantage for a number of tasks, but so far it has been tested only with simple EMG decoders. We aim to quantify the clinical value, for an amputee, of proportional single-finger EMG decoding combined with shared control and sensory feedback, while reducing training time and simplifying everyday use.
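As a rough illustration of the decoding idea, the sketch below shows how a small convolutional network in PyTorch could map a window of ~60-channel sEMG to proportional finger activations, and how transfer learning could be applied by reusing a feature extractor trained on previous subjects and fine-tuning only the output head on a short calibration set from a new user. The architecture, layer sizes, window length, and variable names are illustrative assumptions, not the decoder developed in the thesis.

# Minimal sketch (assumptions: PyTorch, 60 sEMG channels, 200-sample windows,
# 5 proportional finger outputs; all sizes are illustrative).
import torch
import torch.nn as nn

class EMGFingerDecoder(nn.Module):
    def __init__(self, n_channels=60, n_fingers=5):
        super().__init__()
        # Temporal convolutions act as a shared feature extractor over the EMG window.
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=11, padding=5),
            nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=11, padding=5),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # collapse the time dimension
        )
        # Subject-specific regression head: one proportional activation per finger.
        self.head = nn.Linear(64, n_fingers)

    def forward(self, x):                     # x: (batch, channels, time)
        z = self.features(x).squeeze(-1)
        return torch.sigmoid(self.head(z))    # values in [0, 1] per finger

# Transfer learning: freeze the shared feature extractor and fine-tune only the
# head on a small amount of calibration data from a new subject.
model = EMGFingerDecoder()
for p in model.features.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(model.head.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Dummy calibration batch: 8 windows of 60-channel EMG, 200 samples long.
emg = torch.randn(8, 60, 200)
target = torch.rand(8, 5)                     # ground-truth proportional finger positions

pred = model(emg)
loss = loss_fn(pred, target)
loss.backward()
optimizer.step()

Freezing the shared layers is one common way to shorten per-subject training time, which is one of the stated goals; other strategies (partial fine-tuning, adapter layers) would follow the same pattern.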