Publication

Multimodal Interaction Management for Tour-Guide Robots Using Bayesian Networks

2003
Conference paper
Abstract

In this paper, we propose a Bayesian network framework for managing interactivity between a tour-guide robot and visitors in mass exhibition conditions, through robust interpretation of multi-modal signals. We report on methods and experiments interpreting speech and laser scanner signals in the spoken dialogue management system of the autonomous tour-guide robot RoboX, successfully deployed at the Swiss National Exhibition (Expo.02). Correctly interpreting a visitor's goal or intention at each dialogue state is key to successful voice-enabled communication between the robot and visitors. We introduce a Bayesian network approach for combining noisy speech recognition results with noise-independent data from a laser scanner, in order to infer the visitor's goal under the uncertainty intrinsic to these two modalities. We demonstrate the effectiveness of the approach by simulation based on real observations gathered during experiments with the tour-guide robot RoboX at Expo.02.
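To illustrate the kind of inference the abstract describes (and not the authors' actual model), the sketch below fuses a noisy speech-recognition observation with a laser-scanner observation in a two-observation Bayesian network with a single hidden "visitor goal" node. The goal values, observation symbols, and all probability tables are hypothetical placeholders chosen for illustration.

```python
# Minimal sketch of goal inference in a two-observation Bayesian network:
# hidden node G (visitor goal) with children S (speech result) and L (laser reading).
# All values below are hypothetical, for illustration only.

# Hypothetical visitor goals at a given dialogue state.
GOALS = ["wants_tour", "wants_to_leave"]

# Prior P(G).
prior = {"wants_tour": 0.7, "wants_to_leave": 0.3}

# P(S | G): noisy speech recognizer output ("yes", "no", "no_input").
p_speech = {
    "wants_tour":     {"yes": 0.6, "no": 0.1, "no_input": 0.3},
    "wants_to_leave": {"yes": 0.1, "no": 0.5, "no_input": 0.4},
}

# P(L | G): laser scanner reports whether visitors remain in front of the robot.
p_laser = {
    "wants_tour":     {"present": 0.9, "absent": 0.1},
    "wants_to_leave": {"present": 0.4, "absent": 0.6},
}

def infer_goal(speech_obs: str, laser_obs: str) -> dict:
    """Posterior P(G | S=speech_obs, L=laser_obs) by direct enumeration."""
    unnorm = {
        g: prior[g] * p_speech[g][speech_obs] * p_laser[g][laser_obs]
        for g in GOALS
    }
    z = sum(unnorm.values())
    return {g: v / z for g, v in unnorm.items()}

if __name__ == "__main__":
    # Speech channel is ambiguous ("no_input"), but the laser still sees visitors:
    print(infer_goal("no_input", "present"))
    # Speech says "no" and the visitors have walked away:
    print(infer_goal("no", "absent"))
```

In this toy network the laser evidence compensates for an unreliable or missing speech result, which is the intuition behind combining the two modalities: the posterior over the visitor's goal stays informative even when one channel is degraded.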
