Human hands play a very important role in daily object manipulation. Current prosthetic hands can mimic most functions of the human hand, but how to control prosthetic hands according to human intention remains an open problem. In this article, we propose a wearable ultrasound-based interface to achieve simultaneous and proportional control of wrist rotation (pronation/supination) and hand grasp (open/close). A semisupervised learning framework integrating principal component analysis (PCA) and sparse Gaussian process regression (SPGP) was proposed to simplify the cumbersome model calibration, a key issue that hinders the practical application of existing simultaneous and proportional prosthetic control approaches. The proposed algorithms were verified with both offline and online experiments on 12 able-bodied subjects. The offline analysis showed that the first principal component of the ultrasound features (PC#1) varied linearly with wrist rotation, and that SPGP could establish the mapping between ultrasound features and hand grasp kinematics from fewer training samples. The online target achievement control test showed that the proposed method can achieve accurate control of a virtual prosthesis, with a motion completion rate of 97.61 ± 4.67%, a motion completion time of 4.66 ± 0.91 s, and a stability error of 10.99 ± 1.69 degrees. This is the first study to achieve online simultaneous and proportional control of wrist and hand kinematics using ultrasound and semisupervised learning, paving the way for the era of muscle morphology-driven prosthetic control.
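To make the decoding pipeline in the abstract concrete, here is a minimal sketch of the two control channels: PCA applied to ultrasound features for wrist rotation, and Gaussian process regression for grasp kinematics. All array shapes, variable names, and data here are illustrative assumptions, and scikit-learn's exact GaussianProcessRegressor stands in for the paper's sparse variant (SPGP); this is not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic stand-ins: 500 frames of 64-dimensional ultrasound features
# and corresponding grasp aperture labels (0 = open, 1 = closed).
X = rng.normal(size=(500, 64))
grasp = rng.uniform(0.0, 1.0, size=500)

# Wrist channel: per the abstract, PC#1 of the ultrasound features
# varies linearly with pronation/supination, so it can be read out
# directly after an unsupervised (label-free) PCA fit.
pca = PCA(n_components=1).fit(X)
wrist_proxy = pca.transform(X)[:, 0]  # rescale to degrees per subject

# Grasp channel: regress aperture from the features. The paper uses a
# sparse GP (SPGP) to keep calibration cheap; an exact GP trained on a
# small labeled subset plays that role in this sketch.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X[:100], grasp[:100])                       # few calibration frames
grasp_pred, grasp_std = gp.predict(X[100:], return_std=True)
```

Note the division of labor that makes the framework semisupervised: the wrist channel needs no labels at all (PCA is unsupervised), so labeled calibration data are only required for the grasp regressor, which is what reduces the calibration burden.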
Aude Billard, Kunpeng Yao, Xiao Gao, Farshad Khadivar, Xingchen Yang, Yifan Liu, Ning Zhang