“Sense and avoid” for drones is a popular topic in the press right now, but the phrase can mean different things in different contexts and for different people. To clarify, there is a difference between solving the problem of “sense” and solving the problem of “avoid.” Also, there is a difference between “airborne collision avoidance” (which is what most concerns the FAA) and “obstacle avoidance” (which is the problem that most manufacturers are trying to solve right now).
With that in mind, this post looks at what a few manufacturers and software providers are doing to solve obstacle avoidance.
DJI – In June 2015, DJI announced Guidance, a combination of ultrasonic sensors and stereo cameras that allow the drone to detect objects up to 65 feet (20 meters) away. The Phantom 4 has front obstacle sensors combined with advanced computer vision and processing that allow it to react to and avoid obstacles in its path.
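To illustrate the basic idea behind front obstacle sensing, here is a minimal sketch of threshold-based braking from a depth map. This is a hypothetical illustration, not DJI's actual implementation; the sensor range matches the ~20 m figure above, but the safety distance and the function are assumptions.

```python
# Hypothetical sketch of threshold-based obstacle braking from a depth map.
# Not DJI's implementation; the safety distance is an assumed value.

SENSOR_RANGE_M = 20.0    # Guidance detects objects up to ~20 m away
SAFETY_DISTANCE_M = 3.0  # assumed braking threshold

def should_brake(depth_map_m):
    """Return True if any valid depth reading falls inside the safety zone.

    depth_map_m: 2D list of per-pixel distances in meters; zero or
    negative readings are treated as invalid (no obstacle).
    """
    for row in depth_map_m:
        for d in row:
            if 0.0 < d <= SAFETY_DISTANCE_M:
                return True
    return False

# Usage: a clear scene vs. an obstacle 2 m ahead.
clear = [[10.0, 15.0], [20.0, 8.0]]
blocked = [[10.0, 2.0], [20.0, 8.0]]
print(should_brake(clear))    # False
print(should_brake(blocked))  # True
```

In a real system the depth map would come from the stereo cameras or ultrasonic sensors, and the controller would slow down progressively rather than brake on a single threshold, but the decision logic starts from the same comparison.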
Intel – In June 2016, Intel announced the addition of a factory-installed Intel RealSense R200 camera and an Intel Atom processor module for Yuneec’s Typhoon H. The module maps the Typhoon H’s surroundings in 3D, which it then uses to navigate its environment autonomously, including rerouting itself around obstacles. Yuneec’s Typhoon H camera drone could already stop itself before colliding with large objects; with the new module, it should be able to avoid obstacles and keep moving around them.
At IDF, Intel announced its Aero Ready-to-Fly Drone, a fully functional quadcopter powered by the Intel® Aero Compute Board, equipped with Intel® RealSense™ depth and vision capabilities and running an open-source Linux operating system. Then, in September 2016, Intel acquired DJI’s VPU vendor Movidius, which means the company may have the market cornered for sense-and-avoid technology.
Parrot – Parrot’s S.L.A.M.dunk integrates advanced software applications based on the robotic mapping technique called “simultaneous localization and mapping,” or SLAM. SLAM enables a drone to understand and map its surroundings in 3D and to localize itself in environments with multiple barriers and where GPS signals are unavailable. In other words, it performs obstacle avoidance. Their solution depends on active sensors. You can read more here.
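To make the “mapping” half of SLAM concrete, here is a toy occupancy-grid sketch: range-sensor hits are converted into obstacle cells in a 2D grid. This is an illustration only, not Parrot's implementation; a real SLAM system also estimates the drone's own pose, whereas here the pose is assumed known, and the grid and cell sizes are made-up values.

```python
# Toy illustration of the "mapping" half of SLAM: marking obstacle cells
# in a 2D occupancy grid from range-sensor hits. A real SLAM system also
# estimates the drone's pose; here the pose is assumed known.
import math

GRID_SIZE = 10   # 10 x 10 cells (assumed)
CELL_M = 0.5     # each cell covers 0.5 m (assumed)

def mark_hits(grid, pose_xy, hits):
    """Mark grid cells occupied for each (bearing_rad, range_m) hit."""
    px, py = pose_xy
    for bearing, rng in hits:
        # Project the hit from the drone's position into world coordinates.
        x = px + rng * math.cos(bearing)
        y = py + rng * math.sin(bearing)
        i, j = int(x / CELL_M), int(y / CELL_M)
        if 0 <= i < GRID_SIZE and 0 <= j < GRID_SIZE:
            grid[j][i] = 1  # occupied
    return grid

grid = [[0] * GRID_SIZE for _ in range(GRID_SIZE)]
# Drone at (2.5 m, 2.5 m) senses an obstacle 1.0 m away, straight ahead.
mark_hits(grid, (2.5, 2.5), [(0.0, 1.0)])
print(grid[5][7])  # 1 — the cell at x=3.5 m, y=2.5 m is now occupied
```

The “localization” half closes the loop in the other direction: the drone matches new sensor readings against the map it has built so far to correct its own position estimate, which is what lets it navigate when GPS is unavailable.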
Neurala – Neurala is a software solution that analyzes the images from off-the-shelf cameras to enhance drone navigation. It uses GPU-based hardware running artificial intelligence neural network software. Full collision avoidance is still under development.
LeddarTech – Leddar just announced its modular Vu8, a compact solid-state LiDAR sensor that detects targets at a range of up to 705 feet (215 meters). The Vu8 is an active sensor that “could be” used for collision avoidance, navigation, and as an altimeter for drones. There are some cool details in this video, but no demonstration of real-life use on a drone just yet.
At this time, the drone industry appears to be rich with R&D and solutions that attempt to tackle the obstacle avoidance problem. I like what LeddarTech says:
Available drone sensing solutions for position and range measurements as well as for collision avoidance are still far from perfect: GPSs and barometers aren’t foolproof—even outdoors—and can’t be relied upon when navigating indoors. Ultrasonic altimeters have very limited range. Optical flow sensors require good lighting and textured surfaces, and camera vision is still a work in progress and tends to be processing-intensive.