But development is vibrant, and you’ll see it work first in prosumer drones
QuickTake
THE FACTS: “Sense and avoid” for drones is a popular topic in the press right now, but the phrase can mean different things in different contexts and to different people. To clarify, there is a difference between solving the problem of “sense” and solving the problem of “avoid.” There is also a difference between “airborne collision avoidance” (which is what most concerns the FAA) and “obstacle avoidance” (which is the problem most manufacturers are trying to solve right now). With that in mind, this post looks at what a few manufacturers and software providers are doing to solve obstacle avoidance.
WHAT’S COOL AND WHAT’S NOT: DJI – DJI was one of the first to release a drone that could sense and avoid obstacles. In June 2015, they announced Guidance, a combination of ultrasonic sensors and stereo cameras that allows the drone to detect objects up to 65 feet (20 meters) away and maintain a preconfigured distance from them. The kit was immediately available for the Matrice 100 drone development platform. They subsequently incorporated that technology into their flagship Phantom 4 prosumer drone, but not into their new professional drone, the Matrice 600.
The Phantom 4 has front obstacle sensors combined with advanced computer vision and processing that allow it to react to and avoid obstacles in its path. In the flight control program’s “TapFly Mode,” the Phantom 4’s obstacle-sensing system is supposed to let you fly a path while the drone automatically moves around objects along the way. But you can find several real-world tests, like this one, that show it’s not a perfect system.
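The control logic behind this kind of obstacle response can be sketched in a few lines. Everything below is a hypothetical illustration, not DJI’s actual API: the function name and thresholds are invented, and a real system fuses multiple sensors and plans full detour paths rather than emitting simple action labels.

```python
# Hypothetical sketch of a threshold-based response to a forward range
# reading, loosely modeled on how a front-facing depth sensor might be used.
# All names and numbers here are illustrative, not DJI's.

def obstacle_response(distance_m, hold_distance_m=4.0, max_range_m=20.0):
    """Map a forward range reading (meters) to a simple flight action."""
    if distance_m is None or distance_m > max_range_m:
        return "cruise"            # nothing detected within sensor range
    if distance_m > hold_distance_m:
        return "slow"              # obstacle ahead: reduce speed
    return "brake_and_reroute"     # too close: stop and plan around it
```

The interesting engineering is in the last branch: braking is easy, but “reroute” requires the drone to build some model of the space around the obstacle, which is where the computer-vision work comes in.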
Intel – Intel is all over sense and avoid, and they accomplish it with active sensors. At the Consumer Electronics Show (CES) in 2015, they gave this sneak peek at what they were working on. In January 2016, they acquired German drone manufacturer Ascending Technologies (AscTec) and dazzled CES with an on-stage demo of their Intel® RealSense™ technology integrated into an AscTec drone, showcasing how the drone could avoid obstacles while continuing to follow its subject. They recently announced their Aero Ready-to-Fly Drone, a fully functional quadcopter powered by the Intel® Aero Compute Board, equipped with Intel® RealSense™ depth and vision capabilities, and running an open-source Linux operating system. It is geared toward developers, researchers, and UAV enthusiasts.
It’s clear Intel understands the importance of sense-and-avoid technology for ready-to-fly prosumer and commercial drones, too. In June 2016, Intel announced a factory-installed Intel RealSense R200 camera and Intel Atom processor module for Yuneec’s Typhoon H. The module maps the Typhoon H’s surroundings in 3D, which the drone then uses to navigate its environment autonomously, including rerouting itself around obstacles. Yuneec’s Typhoon H camera drone could already stop itself before colliding with large objects; now it should avoid obstacles and keep moving right around them. We’ll see if that holds up in the real world. Let’s hope it does. Otherwise, Intel’s $60 million investment in Yuneec may not deliver the expected return.
Either way, Intel has hedged its bets. In July 2016, a team from Intel and Airbus demonstrated a visual inspection of an Airbus passenger airliner using a modified AscTec Falcon 8 fitted with RealSense cameras. The demo took place at the Farnborough International Airshow in England.
Parrot – Parrot’s S.L.A.M.dunk integrates advanced software applications built on the robotic-mapping technique called “simultaneous localization and mapping,” or SLAM. The product’s name is a play on the words “slam dunk,” though the problem itself is anything but: SLAM is the computational problem of constructing or updating a map of an unknown environment while simultaneously keeping track of an agent’s location within it. Parrot’s use of SLAM enables a drone to understand and map its surroundings in 3D and to localize itself in environments with multiple barriers and no GPS signal. In other words, it performs obstacle avoidance. Their solution depends on active sensors. You can read more here.
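To make the “simultaneous” part concrete, here is a deliberately tiny one-dimensional sketch of the idea: the drone refines its own position estimate and its map (a single wall landmark) from the same readings. This illustrates the structure of SLAM, not Parrot’s implementation; real systems use Kalman or particle filters, or graph optimization, over thousands of landmarks in 3D.

```python
# Toy 1-D illustration of SLAM: estimate your own position and the position
# of a landmark (a wall) at the same time. Purely illustrative.

def slam_1d(odometry, ranges):
    """odometry[i]: commanded step; ranges[i]: measured distance to the wall."""
    pose = 0.0          # believed position
    wall = None         # believed wall position (unknown at start)
    for step, rng in zip(odometry, ranges):
        pose += step                  # predict: dead-reckon from odometry
        estimate = pose + rng         # where the wall appears to be
        if wall is None:
            wall = estimate           # first sighting initializes the map
        else:
            # update: split any disagreement between the map and the pose,
            # refining localization and mapping together
            error = estimate - wall
            wall += 0.5 * error
            pose -= 0.5 * error
    return pose, wall
```

The key point is the update step: every new measurement improves both the map and the drone’s belief about where it is, which is exactly what lets a SLAM-equipped drone operate where GPS can’t reach.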
Neurala – Neurala is a software solution that analyzes images from off-the-shelf cameras to enhance drone navigation. Unlike Parrot’s solution, Neurala’s technology is passive. It uses GPU-based hardware running artificial-intelligence neural-network software. While commercial-grade GPS can fly a drone close to its objectives, Neurala’s software can help it identify safe areas to travel and land. At InterDrone, Neurala announced the launch of its Bots Software Development Kit, which will allow manufacturers to install artificial-intelligence “neural” software directly into their applications without the need for additional hardware. That said, full collision avoidance is still under development.
LeddarTech – LeddarTech just announced its modular Vu8, whose specs make it well suited to autonomous drone use. The Vu8 is a compact solid-state LiDAR sensor that detects targets at ranges up to 705 feet (215 meters) and weighs just 75 grams. It is an active sensor that “could be” used for collision avoidance, navigation, and as an altimeter for drones. According to LeddarTech, the Vu8 is “immune to ambient light” and was designed to provide “highly accurate multi-target detection over eight independent segments.” There are some cool details in this video, but no real-life drone demo just yet.
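A multi-segment sensor like this lends itself to very simple avoidance logic: check each of the eight horizontal segments against a safety distance and steer toward a clear one. The sketch below is purely illustrative; the data format and function name are invented, and the real Vu8 has its own interface and protocol.

```python
# Hypothetical consumer of an 8-segment LiDAR reading (segments ordered
# left to right). None models "no return within range"; the Vu8's quoted
# maximum detection range is about 215 m.

MAX_RANGE_M = 215.0

def clear_segments(segment_ranges_m, safe_m=5.0):
    """Return indices of segments with no obstacle closer than safe_m."""
    return [i for i, r in enumerate(segment_ranges_m)
            if r is None or r > safe_m]
```

A flight controller could then pick the clear segment closest to its current heading, which is the appeal of per-segment output over a single range value.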
BOTTOM LINE: At this time, the drone industry appears to be rich with R&D and solutions that attempt to tackle the obstacle avoidance problem. But a simple search on YouTube for successful real-world examples reveals we still have a way to go before anyone claims victory. I like what LeddarTech says:
Available drone sensing solutions for position and range measurements, as well as for collision avoidance, are still far from perfect: GPS and barometers aren’t foolproof—even outdoors—and can’t be relied upon when navigating indoors. Ultrasonic altimeters have very limited range. Optical flow sensors require good lighting and textured surfaces, and camera vision is still a work in progress and tends to be processing-intensive.
As with any technology, there are always trade-offs. It’s still not clear to me who has the category-killing solution; I think that’s going to take more R&D investment. One thing is for sure: we’ll see more new sense-and-avoid product and technology announcements this year. As with DJI, I believe the technology will continue to appear first in prosumer drones, because that’s the only place where sales volumes and margins are strong enough to recoup the investment.