To fly safely, drones need to know their precise position and orientation in space at all times. Commercial drones solve this problem with GPS, but GPS works only outdoors and is not very reliable, especially in urban environments. Furthermore, the conventional cameras mounted on drones work only when plenty of light is available, and the drone’s speed has to be limited; otherwise, the resulting images are motion-blurred and cannot be used by computer vision algorithms. Professional drones therefore rely on sensors that are elaborate, expensive, and bulky, such as laser scanners.
A group of researchers from the University of Zurich and the Swiss research consortium NCCR Robotics has now developed an alternative approach that enables drones to fly in a wide range of conditions using an eye-inspired camera that easily copes with high-speed motion. The camera can even see in the dark far more effectively than the conventional cameras currently used on all commercial drones. “This research is the first of its kind in the fields of artificial intelligence and robotics, and will soon enable drones to fly autonomously and faster than ever, including in low-light environments,” says Prof. Davide Scaramuzza, Director of the Robotics and Perception Group at UZH. He and his team have already taught drones to use their onboard cameras to infer their position and orientation in space.
Event cameras, which were invented at UZH together with ETH Zurich, have a bio-inspired retina that does not need to capture full frames of light to produce a clear picture. Unlike their conventional counterparts, they report only per-pixel changes in brightness, ensuring perfectly sharp vision even during fast motion or in low-light environments. The UZH researchers have also developed new software that efficiently processes the output of such cameras, enabling autonomous flight at higher speeds and in lower light than is currently possible with commercial drones.
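To get an intuition for what such a sensor delivers, the sketch below shows one common way to turn an event stream into an image-like frame by accumulating brightness changes over a short time window. This is a minimal illustration in Python, not the researchers’ actual software; the event format (timestamp, x, y, polarity) and all names here are assumptions for the example.

import numpy as np

def accumulate_events(events, width, height, t_start, t_end):
    # Start from a neutral (all-zero) frame; each event nudges one pixel.
    frame = np.zeros((height, width), dtype=np.float32)
    for t, x, y, polarity in events:
        if t_start <= t < t_end:
            # polarity True = brightness increased at (x, y), False = decreased
            frame[y, x] += 1.0 if polarity else -1.0
    return frame

# Example: three hypothetical events on a 4x4 sensor within a 10 ms window
events = [(0.001, 1, 2, True), (0.004, 1, 2, True), (0.008, 3, 0, False)]
print(accumulate_events(events, width=4, height=4, t_start=0.0, t_end=0.010))

Because every event carries its own timestamp, the accumulation window can be made very short, which is what allows sharp “frames” to be formed even during fast motion or in dim light.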
Drones equipped with an event camera and the Swiss researchers’ software could assist search and rescue teams in scenarios where conventional drones are of no use, for example on missions at dusk or dawn, or when there is too little light for normal cameras to work. They could also fly faster in disaster areas, where time is critical for saving survivors.
“There is still a lot of work to be done before these drones can be deployed in the real world, since the event camera used for our research is an early prototype. We have yet to prove that our software also works reliably outdoors,” says PhD student Henri Rebecq. Professor Scaramuzza adds: “We think this is achievable, however, and our recent work has already demonstrated that combining a standard camera with an event-based camera improves the accuracy and reliability of the system.”
Antoni Rosinol Vidal, Henri Rebecq, Timo Horstschaefer, Davide Scaramuzza. Hybrid, Frame and Event-based Visual Inertial Odometry for Robust, Autonomous Navigation of Quadrotors. IEEE Robotics and Automation Letters, September 19, 2017.