In a team of autonomous drones, individual knowledge of the relative location of teammates is essential. Existing relative positioning solutions for teams of small drones mostly rely on external systems such as motion-tracking cameras or GPS satellites, which might not always be accessible. In this letter, we describe an onboard solution for measuring the 3-D relative direction between drones using sound as the main source of information. First, we describe a method to measure the directions of other robots by perceiving their engine sounds in the absence of self-engine noise. We then extend the method with active acoustic signaling to obtain the relative directions in the presence of self-engine noise, to increase the detection range, and to discriminate the identity of robots. The methods are evaluated in real-world experiments, and a fully autonomous leader-following behavior is demonstrated with two drones using the proposed system.
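The abstract does not give implementation details of the direction-finding step. As an illustration only, the sketch below shows one common way to estimate the direction of a sound source from a pair of microphones: GCC-PHAT time-difference-of-arrival followed by a far-field bearing conversion. This is the general kind of passive acoustic direction finding the first method relies on, not necessarily the algorithm used in the letter; the function names, sample rate, and microphone spacing are hypothetical.

```python
import numpy as np

def gcc_phat(sig, ref, fs, max_tau=None, interp=1):
    """Estimate the time delay (seconds) between two microphone channels via GCC-PHAT."""
    # Zero-pad to avoid circular-correlation artifacts.
    n = sig.shape[0] + ref.shape[0]
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    # Phase transform: keep only phase information of the cross-power spectrum.
    R = SIG * np.conj(REF)
    cc = np.fft.irfft(R / (np.abs(R) + 1e-15), n=interp * n)
    max_shift = interp * n // 2
    if max_tau is not None:
        max_shift = min(int(interp * fs * max_tau), max_shift)
    # Re-center the cross-correlation so index max_shift corresponds to zero lag.
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / float(interp * fs)

def bearing_from_tdoa(tau, mic_distance, c=343.0):
    """Convert a TDOA into a bearing (degrees) from the broadside of the microphone pair."""
    # Far-field assumption: sin(theta) = c * tau / d, clipped to the valid range.
    return np.degrees(np.arcsin(np.clip(c * tau / mic_distance, -1.0, 1.0)))

# Hypothetical usage with two synchronized microphone channels:
# fs, d = 48000, 0.05                                  # sample rate [Hz], mic spacing [m]
# tau = gcc_phat(mic_left, mic_right, fs, max_tau=d / 343.0)
# theta = bearing_from_tdoa(tau, d)                    # bearing of the other drone [deg]
```

Recovering a full 3-D direction, as described in the letter, would require more than one microphone pair (an array) and fusing several such pairwise delay estimates; the two-microphone case above is only the simplest building block.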
Jan Skaloud, Gabriel François Laupré, Amir Mohsen Ahmadi Najafabadi, Abdulkadir Uzun