Human hands play a central role in daily object manipulation. Current prosthetic hands can mimic most functions of the human hand, but how to control them according to human intentions remains an open problem. In this article, we propose a wearable ultrasound-based interface for simultaneous and proportional control of wrist rotation (pronation/supination) and hand grasp (open/close). A semi-supervised learning framework integrating principal component analysis (PCA) and sparse Gaussian process regression (SPGP) is proposed to simplify the cumbersome model calibration, a key issue that hinders the practical application of existing simultaneous and proportional prosthetic control approaches. The proposed algorithms were verified in both offline and online experiments on 12 able-bodied subjects. The offline analysis showed that the first principal component of the ultrasound features (PC#1) was linearly related to wrist rotation, and that the SPGP could establish the mapping between ultrasound features and hand grasp kinematics with less training data. The online target-achievement control test showed that the proposed method achieved accurate control of a virtual prosthesis, with a motion completion rate of 97.61 ± 4.67%, a motion completion time of 4.66 ± 0.91 s, and a stability error of 10.99 ± 1.69°. This is the first study to achieve online simultaneous and proportional control of wrist and hand kinematics using ultrasound and semi-supervised learning, paving the way for muscle morphology-driven prosthetic control.
Aude Billard, Kunpeng Yao, Xiao Gao, Farshad Khadivar, Xingchen Yang, Yifan Liu, Ning Zhang
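As an illustration of the framework described in the abstract, the sketch below shows a minimal PCA-plus-Gaussian-process pipeline in Python. It is not the authors' implementation: the data are synthetic, scikit-learn's full GaussianProcessRegressor stands in for the sparse GP regression (SPGP), and all variable names are hypothetical.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Synthetic stand-in data: 1000 frames of 32-dimensional ultrasound features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 32))

    # Unsupervised step: PC#1 of the ultrasound features serves as the wrist
    # rotation estimate (the abstract reports PC#1 is linearly related to
    # pronation/supination), so no wrist labels are needed.
    pca = PCA(n_components=1)
    pca.fit(X)

    # Supervised step: map the features to grasp kinematics from a small
    # labeled subset, mimicking the reduced-calibration setting. A full GP
    # is used here as a stand-in for the paper's sparse GP regression.
    grasp = np.tanh(X[:, 0] + 0.5 * X[:, 1])   # hypothetical grasp labels
    labeled = rng.choice(len(X), size=50, replace=False)
    gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                   normalize_y=True)
    gpr.fit(X[labeled], grasp[labeled])

    # Simultaneous and proportional outputs for a new frame.
    x_new = X[:1]
    print("wrist (PC#1):", pca.transform(x_new).item())
    print("grasp (GPR): ", gpr.predict(x_new).item())

In this arrangement only the grasp mapping needs labeled calibration data; the wrist channel comes for free from the unsupervised PCA, which is what reduces the calibration burden the abstract highlights.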