RoboDoc Returns — Talking Listening Drones and Swarming with Prof. Floreano
Last month we caught up with Prof. Dario Floreano, the head of the Laboratory of Intelligent Systems at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland. Here we continue our discussion, covering acoustic sensing, multi-drone operations and more.
Hello again Professor. Last time we spoke we discussed your work on visual perception, but you also mentioned acoustic research. So this means we’re talking about ‘listening drones’?
Yes, that’s right. This is particularly interesting in the area of drone swarming, whereby multiple robots need to coordinate their flights very carefully.
The first big problem with swarming is perceiving the other drones. If you have multiple drones in flight, each robot is only a very small spot in the other robots’ fields of view, very hard to pick out against the background. So we thought, what about sound detection?
The question is, how can a robot hear other robots when it is in flight? The trick is actually simple: you switch the motors off very briefly, listen using three tiny microphones pointing in different directions, then restart the motors, just as the eBee does when it is capturing aerial photos. Using this trick we developed drones that can acoustically detect the presence and relative direction of other drones.
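To give a feel for how this kind of acoustic direction finding works, here is a minimal, hypothetical sketch, not the lab’s actual code: cross-correlating the recordings from two microphones yields the difference in arrival time of a sound, and with three microphones such pairwise delays resolve the source’s relative direction. The signals and the `estimate_delay` helper below are assumptions for illustration.

```python
import numpy as np

def estimate_delay(sig_a, sig_b):
    """Estimate the delay (in samples) of sig_b relative to sig_a.

    The peak of the cross-correlation gives the lag at which the
    two recordings best align; that lag is the time difference of
    arrival between the two microphones.
    """
    corr = np.correlate(sig_b, sig_a, mode="full")
    return int(np.argmax(corr)) - (len(sig_a) - 1)

# Hypothetical example: a short noise burst reaching microphone B
# 5 samples after microphone A (i.e. the source is closer to A).
rng = np.random.default_rng(0)
burst = rng.standard_normal(256)
mic_a = np.concatenate([burst, np.zeros(32)])
mic_b = np.concatenate([np.zeros(5), burst, np.zeros(27)])

print(estimate_delay(mic_a, mic_b))  # → 5
```

Converting such a delay into a bearing additionally requires the microphone spacing and the speed of sound; a third microphone removes the left/right ambiguity a single pair leaves.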
Outside of the lab, what do you see are some of the potential end applications of your work?
Sticking with the acoustic approach, this could be useful in search and rescue, detecting people’s locations on the ground. If people are lost in snow or in a forest, they often have emergency whistles. We have shown that a drone can locate a person blowing a standard emergency whistle and broadcast that person’s GPS coordinates to a rescue team.
With swarming, we’re also looking at using drones as relay networks for communication between rescue and security teams on the ground.
In the event of an earthquake, a fire or a landslide, the first thing an emergency team usually does is set up its communication infrastructure. This typically consists of a set of trucks with radio emitters and receivers operating on a specific frequency. It can take one or several hours to complete this set-up before rescuers can start working on the ground with their receivers.
However, drones can also be equipped with emitters and receivers. A swarm of drones could therefore cover the area, pick up the signals of the rescue team on the ground and relay them, connecting people via very low-power, short-range emitters. With this technique we can currently connect multiple people, and not only with audio but with video signals too.
We’re also experimenting with multiple people moving around on the ground, with robots following their movement, and we’re considering operations that last longer than a single battery charge, which is around half an hour. We’re developing an algorithm that allows drones to land for a battery change while new, fully charged drones launch to continue the service.
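The handover idea can be sketched in a few lines. This is purely an illustrative simulation, not the lab’s algorithm; the flight time, overlap window and function name below are all assumptions.

```python
FLIGHT_MINUTES = 30   # assumed battery duration per charge
SWAP_AT = 5           # launch a replacement with this many minutes left

def rotation_schedule(total_minutes, n_spares):
    """Return (launch_time, land_time) pairs keeping one relay post covered.

    A replacement launches SWAP_AT minutes before the active drone
    must land, so the two overlap and coverage is never interrupted.
    """
    schedule = []
    launch = 0
    for _ in range(n_spares + 1):  # first drone plus its replacements
        land = launch + FLIGHT_MINUTES
        schedule.append((launch, land))
        if land >= total_minutes:
            break
        launch = land - SWAP_AT  # overlap window for the handover
    return schedule

print(rotation_schedule(60, 2))
# → [(0, 30), (25, 55), (50, 80)]
```

Each drone’s launch time falls before its predecessor’s landing time, so the relay post is always covered by at least one aircraft during the swap.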
So on one hand your work is bio-inspired research, but you’re obviously also thinking about the potential real-world applications of the technology you develop. Which point do you start from—do you look at biology first and then consider how to use the tech that comes out of that research, or do real-world problems drive the direction your biological research takes?
It’s a mix. When it comes to drones, 15 years ago it was more blue-sky research. These days, however, I have more projects on flying robots that could solve problems people might face, not necessarily next year but in five years’ time. For example, if you have lots of drones in the air, you then have to think about safety. That means thinking about AI and sensing.
The specific applications are not always there at the start, though. For example, when Gimball was designed we didn’t have an application in mind, just a problem: how to design a robot that could withstand falls and fly again. We started by adding retractable legs, as insects have, but this wasn’t optimal, so we came up with a rolling cage around the drone. Only then did we realise the application: being able to go where other drones couldn’t go.
Prof. Floreano’s work at the Laboratory of Intelligent Systems (LIS) is bio-inspired; he and his team develop flying robots with animal characteristics, which they also use to learn more about animals.
In swarming there are lots of interesting problems to think about. In the future, as I mentioned, we’ll have a very high number of drones sharing our airspace. We’ll have a lot of crowding in areas such as landing spots and drone airways.
The big question is this: how will these drones avoid each other? Commercial GPS isn’t accurate enough for avoidance when you fly very close to other drones. So the question is: what are the best sensors and the best control algorithms, both in cooperative conditions, meaning the same types of drone flying in the same area, and in non-cooperative conditions, with drones from different manufacturers using different sensors? This is a real problem that some of our research on sensors and algorithms is addressing.
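One classic building block for this kind of avoidance, shown here as a generic textbook illustration rather than anything LIS has published, is a separation rule: each drone computes a velocity correction pushing it away from neighbours inside a safety radius. The radius, positions and function name below are assumed for the example.

```python
import numpy as np

SAFETY_RADIUS = 10.0  # metres, an assumed minimum separation

def separation_velocity(own_pos, neighbour_positions):
    """Velocity correction pushing a drone away from close neighbours.

    Each neighbour inside SAFETY_RADIUS contributes a push along the
    line between the two drones, growing stronger the deeper the
    neighbour has intruded into the safety zone.
    """
    push = np.zeros(2)
    for other in neighbour_positions:
        offset = own_pos - other
        dist = np.linalg.norm(offset)
        if 0 < dist < SAFETY_RADIUS:
            push += offset / dist * (SAFETY_RADIUS - dist)
    return push

me = np.array([0.0, 0.0])
others = [np.array([4.0, 0.0]), np.array([0.0, 50.0])]  # one close, one far
print(separation_velocity(me, others))  # pushes along -x only
```

In cooperative conditions all drones run the same rule and share state; the non-cooperative case is harder precisely because a drone must estimate neighbour positions with its own sensors alone.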
Another real problem comes in transportation: how can we transport parcels very safely? I can’t talk about the solution just yet, but our work there builds on a lot of the topics we’ve been discussing.
Which seems a logical place to wrap up! Thank you so much for all these insights. How can readers who are interested in learning more follow your lab’s work?
The homepage of the Laboratory of Intelligent Systems is at lis.epfl.ch. That page includes our most recent publications. We also have a LIS YouTube channel that people can subscribe to and I am on Twitter at @DFloreano.
You are also involved with the RobotsPodcast, is that correct?
Well, I was involved in its creation, back when it was called Talking Robots, and I supported it from my lab for two years. The podcast grew and grew, and the students who were preparing it then set up what is known today as the RobotsPodcast. Many members of LIS still work on this on a voluntary basis. It became so successful that it also spawned another, wider organisation, called Robohub.
Professor Floreano, thanks for your time.
You’re very welcome.
Watch Prof. Floreano’s ROBOTS14 presentation:
About Prof. Dario Floreano
Prof. Dario Floreano (M.A. 1988; M.S. 1992; Ph.D. 1995) is the director of the Laboratory of Intelligent Systems and director of the Swiss National Center of Competence in Robotics. He is co-founder of the company senseFly S.A., of the International Society for Artificial Life, and founder of the popular robotics podcast series Talking Robots (which later became RobotsPodcast).
He has been on the advisory board of the Future and Emerging Technologies division of the European Commission, vice-chair of the Global Agenda Council on Robotics and Smart Devices of the World Economic Forum, and on the board of governors of the International Society for Neural Networks. Prof. Floreano is a senior member of the IEEE and has received several national and international awards for professional and cultural achievements. He has published four books, more than 300 technical articles, and more than 10 patents in drone technologies.