The field of Unmanned Aerial Vehicles (UAVs), also known as drones, is growing rapidly, both in market size and in the number of applications. Civil applications range from mapping, inspection, search and rescue, and aerial footage to art shows, entertainment, and more. Currently, most applications rely on a human pilot supervising or controlling the vehicles, but UAVs are expected to gain more autonomy over time. To fly in general airspace, which is shared with general and commercial aviation, UAVs require a high level of autonomy. A core capability for flying in general airspace is the ability to detect and avoid collisions with other aircraft or objects, handled by a so-called Sense And Avoid (SAA) system. Among the sensors investigated for SAA, a vision-based sensor is a good candidate because it can detect and identify a large variety of objects and because vision is also the main means by which human pilots detect aircraft and other objects. To remain as general as possible, this work focuses on non-cooperative algorithms that make no assumptions about the motion of other aircraft.

This thesis presents algorithms for a vision-based SAA system. It focuses on the relationship between sensing and avoidance, and on how the limitations of one constrain the other. In particular, it studies the consequences of the limited Field Of View (FOV) of a camera sensor on collision avoidance algorithms. Under the assumptions above, other UAVs are sensed and tracked using cameras with fish-eye lenses whose FOV is large enough for the collision avoidance algorithms to guarantee collision-free flight. The detection of other UAVs is performed with two methods: a marker-based and a marker-less computer vision algorithm. Using the measurements from the computer vision algorithm, the positions and velocities of neighboring UAVs are tracked with a Gaussian mixture probability hypothesis density filter. This tracking algorithm can track multiple UAVs while requiring few computational resources, making it a suitable candidate for on-board deployment.

In this work, it is mathematically proven that the motion of a UAV must be constrained according to the FOV of its sensor. Following that result, several collision avoidance algorithms are adapted to ensure collision-free navigation when used with a sensor with a limited FOV. Sensory limitations such as noise, lag, limited range, and limited FOV, together with their effects on the performance of collision avoidance algorithms, are studied. Experimental work using high-fidelity simulation and real robots shows that algorithms that only use position information from the sensors are overall more reliable, although less efficient (in terms of distance traveled or trajectory smoothness) than algorithms that also use velocity estimates from the sensing system.
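To make the coupling between a limited FOV and collision avoidance more concrete, the following is a minimal, hypothetical Python sketch (not the thesis' actual algorithms). It checks whether a candidate avoidance velocity both stays outside a velocity-obstacle-style collision cone of a tracked neighbor and keeps that neighbor inside the FOV of a camera assumed to point along the flight direction; the half-FOV angle, safety radius, and function names are illustrative assumptions.

```python
import numpy as np

# Illustrative constants (assumptions, not values from the thesis).
HALF_FOV = np.deg2rad(90.0)   # assumed half-angle of a fish-eye camera
SAFETY_RADIUS = 1.0           # assumed combined safety radius [m]


def inside_fov(candidate_velocity, relative_position, half_fov=HALF_FOV):
    """True if the neighbor stays within the FOV of a camera aligned
    with the candidate velocity direction."""
    v_hat = candidate_velocity / (np.linalg.norm(candidate_velocity) + 1e-9)
    r_hat = relative_position / (np.linalg.norm(relative_position) + 1e-9)
    angle = np.arccos(np.clip(np.dot(v_hat, r_hat), -1.0, 1.0))
    return angle <= half_fov


def outside_collision_cone(candidate_velocity, relative_position,
                           neighbor_velocity, safety_radius=SAFETY_RADIUS):
    """Velocity-obstacle-style check: the relative velocity must not point
    into the cone of directions that would bring the separation below the
    safety radius."""
    rel_v = candidate_velocity - neighbor_velocity
    dist = np.linalg.norm(relative_position)
    if dist <= safety_radius:
        return False  # already inside the safety radius
    cone_half_angle = np.arcsin(safety_radius / dist)
    r_hat = relative_position / dist
    v_hat = rel_v / (np.linalg.norm(rel_v) + 1e-9)
    angle = np.arccos(np.clip(np.dot(v_hat, r_hat), -1.0, 1.0))
    return angle > cone_half_angle


def admissible(candidate_velocity, relative_position, neighbor_velocity):
    """A candidate velocity is admissible only if it is collision-free
    AND keeps the neighbor observable by the limited-FOV sensor."""
    return (inside_fov(candidate_velocity, relative_position) and
            outside_collision_cone(candidate_velocity, relative_position,
                                   neighbor_velocity))
```

The key point the sketch tries to convey is that the FOV constraint removes otherwise collision-free candidate velocities: an avoidance maneuver that turns the camera away from a neighbor would make that neighbor unobservable, which is why the thesis argues the UAV's motion must be constrained by the sensor's FOV.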