Rough-terrain robotics is a fast-evolving field of research, and considerable effort is devoted to enabling a greater level of autonomy for outdoor vehicles. Such robots find application in the scientific exploration of hostile environments such as deserts, volcanoes, the Antarctic, or other planets. They are also of high interest for search and rescue operations after natural or man-made disasters. The challenges involved in bringing autonomy to all-terrain rovers are broad. In particular, autonomy requires systems capable of navigating reliably with only partial information about the environment and with limited perception and locomotion capabilities. Among the required functionalities, locomotion and position tracking are the most critical: the robot cannot fulfill its task if an inappropriate locomotion concept or control strategy is used, and global path planning fails if the rover loses track of its position. This thesis addresses both aspects: a) efficient locomotion and b) position tracking in rough terrain.

The Autonomous System Lab developed an off-road rover (Shrimp) with excellent climbing capabilities, surpassing most existing similar designs. Such exceptional climbing performance extends the range of areas a robot can explore. To further improve the climbing capabilities and locomotion efficiency, a control method minimizing wheel slip has been developed in this thesis. Unlike other control strategies, the proposed method does not require soil models. This independence is significant because the ability to operate on different types of soil is a key requirement for exploration missions. Moreover, the approach can be adapted to any kind of wheeled rover, and the required processing power remains relatively low, which makes online computation feasible.
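The thesis's actual controller is not reproduced in this abstract. As a minimal sketch of the underlying idea (reducing slip without a soil model), the toy function below distributes a required traction force among the wheels in proportion to their measured normal loads, so that the friction demand on each wheel is equalized. The function name and the load-proportional allocation rule are illustrative assumptions, not the thesis's optimization method.

```python
import numpy as np

def distribute_torques(normal_forces, wheel_radius, required_traction):
    """Toy slip-aware torque distribution.

    Allocates the total traction force in proportion to each wheel's
    normal load, so the friction demand F_t / F_n (and hence the
    tendency to slip) is the same at every wheel. Note that no soil
    parameters appear: only the measured normal forces are needed.
    """
    normal_forces = np.asarray(normal_forces, dtype=float)
    shares = normal_forces / normal_forces.sum()   # load-proportional split
    wheel_forces = shares * required_traction      # traction force per wheel
    return wheel_forces * wheel_radius             # torque = force * radius

# Example: three wheels with unequal ground contact loads (N),
# a 7 cm wheel radius, and 50 N of total required traction.
torques = distribute_torques([120.0, 80.0, 100.0], 0.07, 50.0)
```

Because the allocation depends only on directly measurable quantities, a rule of this shape can run online at low computational cost, which is the property the abstract emphasizes.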
In rough terrain, tracking the robot's position is challenging because of the large variations of the ground; furthermore, the field of view can change significantly between two data acquisition cycles. This thesis presents a method for probabilistically combining different types of sensors to produce a robust motion estimate for an all-terrain rover. The proposed sensor fusion scheme is flexible in that it can easily accommodate any number of sensors of any kind. To test the algorithm, the following sensory inputs were chosen for the experiments: 3D-Odometry, an inertial measurement unit (accelerometers, gyroscopes), and visual odometry. 3D-Odometry was developed specifically within the framework of this research. Because it accounts for ground slope discontinuities and the rover kinematics, this technique yields a reasonably precise 3D motion estimate in rough terrain. The experiments provided excellent results and showed that the use of complementary sensors increases the robustness and accuracy of the pose estimate. In particular, this work distinguishes itself from similar research projects in the following ways: the sensor fusion combines more than two sensor types, and it is applied a) in rough terrain and b) to track the full 3D pose of the rover.

Another result of this work is the design of a high-performance platform for conducting further research. The rover is equipped with two computers, a stereovision module, an omnidirectional vision system, an inertial measurement unit, numerous sensors and actuators, and electronics for power management. Furthermore, a set of powerful tools has been developed to speed up the debugging of algorithms and the analysis of data recorded during the experiments. Finally, the modularity and portability of the system allow easy integration of new actuators and sensors. All these characteristics accelerate research in this field.
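The abstract does not detail the fusion scheme itself. As a minimal illustration of the core principle behind probabilistic sensor fusion, the sketch below combines independent Gaussian estimates of the same quantity by inverse-variance weighting, the building block of Kalman-style fusion. The function name and the scalar, independent-sensor setting are illustrative assumptions; the thesis fuses full 3D pose estimates, not a single scalar.

```python
import numpy as np

def fuse_estimates(means, variances):
    """Fuse independent Gaussian estimates of one quantity.

    Each sensor (e.g. 3D-Odometry, IMU integration, visual odometry)
    contributes a mean and a variance. Inverse-variance weighting gives
    more influence to more certain sensors, and the fused variance is
    always smaller than the best individual one. Adding a sensor is
    just appending one more entry to each list.
    """
    w = 1.0 / np.asarray(variances, dtype=float)   # precision weights
    fused_var = 1.0 / w.sum()                      # combined uncertainty
    fused_mean = fused_var * (w * np.asarray(means, dtype=float)).sum()
    return fused_mean, fused_var

# Example: three sensors agree roughly on a displacement (m),
# with different confidence levels (variances in m^2).
pose, uncertainty = fuse_estimates([1.02, 0.97, 1.10], [0.04, 0.01, 0.09])
```

The flexibility claimed in the abstract shows up here directly: the formula makes no assumption about how many sensors there are or what kind they are, only that each reports an estimate with an uncertainty.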