In vision-based robot navigation, panoramic vision emerges as a very attractive candidate for solving the localization task. Unfortunately, current systems rely on specific feature selection processes that do not cover the requirements of general-purpose robots. In order to fulfill new requirements of robot versatility and robustness to environmental changes, we propose in this paper to perform the feature selection of a panoramic vision system by means of the saliency-based model of visual attention, a model known for its universality. The first part of the paper describes a localization system combining panoramic vision and visual attention. The second part presents a series of indoor localization experiments using panoramic vision and attention-guided feature detection. The results show the feasibility of the approach and illustrate some of its capabilities.
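As a rough illustration of attention-guided feature selection (not the authors' actual multi-cue saliency model), the Python sketch below computes a simple centre-surround intensity saliency map over a panoramic image and keeps its strongest local maxima as candidate landmark features. The difference-of-Gaussians saliency, the function names, and all parameter values are assumptions made for this sketch only.

```python
import numpy as np
from scipy import ndimage


def saliency_map(gray):
    """Toy centre-surround saliency: difference of Gaussians on the
    intensity channel, normalised to [0, 1]."""
    img = gray.astype(float)
    center = ndimage.gaussian_filter(img, sigma=2)
    surround = ndimage.gaussian_filter(img, sigma=8)
    sal = np.abs(center - surround)
    return (sal - sal.min()) / (np.ptp(sal) + 1e-9)


def select_salient_features(panorama_gray, num_features=20, min_dist=15):
    """Pick the strongest local maxima of the saliency map as landmark
    candidates; returns (row, col, saliency) tuples."""
    sal = saliency_map(panorama_gray)
    # keep only points that are maximal over a (2*min_dist+1)^2 neighbourhood
    maxima = sal == ndimage.maximum_filter(sal, size=2 * min_dist + 1)
    rows, cols = np.nonzero(maxima)
    order = np.argsort(sal[rows, cols])[::-1][:num_features]
    return [(int(rows[i]), int(cols[i]), float(sal[rows[i], cols[i]]))
            for i in order]


if __name__ == "__main__":
    # synthetic panoramic strip, just to exercise the pipeline
    rng = np.random.default_rng(0)
    panorama = rng.integers(0, 256, size=(128, 1024), dtype=np.uint8)
    for r, c, s in select_salient_features(panorama, num_features=5):
        print(f"feature at row={r}, col={c}, saliency={s:.3f}")
```

In a localization system of the kind described in the paper, descriptors extracted at such salient locations would then be matched against features stored for reference panoramas to estimate the robot's position; that matching stage is not shown here.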