For people with limited mobility, navigating cluttered indoor environments is challenging. In this work, we propose a mobile assistive furniture suite designed to ease indoor movement for people with special needs. A key component enabling intelligent coordination of this system is the localization of each piece of mobile furniture. The challenge is to assess the state of an arbitrary living scenario so that the estimate can serve as a real-time feedback signal for autonomous closed-loop control of the mobile furniture. We propose a perception pipeline that addresses these challenges. A machine learning model is designed and trained to jointly perform multi-object semantic keypoint detection and classification in camera images. Synthetic data generation is employed to augment the training set and boost model performance. A robust point cloud registration uses the detected semantic keypoints and depth information to estimate the poses of the furniture, and tracking is applied to smooth the estimates. A high-performance accelerator that optimizes the use of heterogeneous devices is applied to achieve real-time performance. The visual perception pipeline is used in closed-loop control to steer mobile furniture from an initial to a desired location, as demonstrated in experiments on real hardware.
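The abstract does not spell out the registration step, so the following is only a minimal sketch of one common formulation: assuming each detected semantic keypoint has been back-projected to 3D using the depth image, giving correspondences between known furniture-model keypoints and observed points, a rigid pose can be recovered in closed form with the Kabsch/Umeyama alignment. The function name and setup are illustrative, not the paper's actual implementation.

```python
import numpy as np

def estimate_pose(model_pts, observed_pts):
    """Sketch: closed-form rigid alignment (Kabsch/Umeyama).

    model_pts, observed_pts: (N, 3) arrays of corresponding 3D keypoints.
    Returns R (3x3 rotation) and t (3-vector) minimizing
    sum_i || R @ model_pts[i] + t - observed_pts[i] ||^2.
    """
    mu_m = model_pts.mean(axis=0)
    mu_o = observed_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (model_pts - mu_m).T @ (observed_pts - mu_o)
    U, _, Vt = np.linalg.svd(H)
    # Correct a possible reflection so R is a proper rotation (det = +1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_o - R @ mu_m
    return R, t
```

In practice, a "robust" registration of this kind is typically wrapped in an outlier-rejection scheme such as RANSAC over keypoint correspondences, since misdetected or misclassified keypoints would otherwise corrupt the least-squares fit; whether the paper's pipeline does exactly this is not stated here.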