Although there is increasing knowledge about how visual and tactile cues from the hands are integrated, little is known about how self-generated hand movements affect such multisensory integration. Visuo-tactile integration often occurs under highly dynamic conditions requiring sensorimotor updating. Here, we quantified visuo-tactile integration by measuring cross-modal congruency effects (CCEs) in different bimanual hand movement conditions using a robotic platform. We found that classical CCEs also occurred during bimanual self-generated hand movements, and that such movements lowered the magnitude of visuo-tactile CCEs compared with static conditions. Visuo-tactile integration, body ownership, and the sense of agency were reduced by adding a temporal visuo-motor delay between hand movements and visual feedback. These data show that visual stimuli interfere less with the perception of tactile stimuli during movement than during static conditions, especially when visual feedback is decoupled from predictive motor information. The results suggest that current models of visuo-tactile integration need to be extended to account for multisensory integration in dynamic conditions.
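As a point of reference for the measure used here, the CCE is conventionally computed as the reaction-time cost of incongruent relative to congruent visuo-tactile pairings. The Python sketch below illustrates that computation on made-up trial data; the trial values, field names, and the static/movement split are illustrative assumptions, not the study's data or analysis code.

```python
from statistics import mean

# Hypothetical reaction times (ms) from a cross-modal congruency task.
# Each trial pairs a tactile target with a visual distractor that is
# either congruent (same elevation) or incongruent (different elevation).
trials = [
    {"condition": "static",   "congruency": "congruent",   "rt": 512},
    {"condition": "static",   "congruency": "incongruent", "rt": 575},
    {"condition": "movement", "congruency": "congruent",   "rt": 530},
    {"condition": "movement", "congruency": "incongruent", "rt": 561},
    # ...many trials per cell in a real dataset
]

def cce(trials, condition):
    """CCE = mean RT on incongruent trials minus mean RT on congruent trials."""
    rts = {c: [t["rt"] for t in trials
               if t["condition"] == condition and t["congruency"] == c]
           for c in ("congruent", "incongruent")}
    return mean(rts["incongruent"]) - mean(rts["congruent"])

for condition in ("static", "movement"):
    print(f"{condition}: CCE = {cce(trials, condition):.0f} ms")
```

On these illustrative numbers the CCE is smaller in the movement condition than in the static one, mirroring the direction of the effect reported in the abstract.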