Piloting a Drone With Eye-Tracking Glasses

With the steady improvement of autonomous flight modes and safety features from DJI, and futuristic computer vision from the likes of Skydio, you might think that the way we pilot drones is heading in only one direction: toward true autonomy. In other words, toward not really needing to pilot them at all.

As exciting as that might be for businesses seeking to automate operations, it doesn't sound like much fun. Fortunately, the prospect of wall-to-wall autonomy isn't stopping researchers from developing new and intriguing ways for pilots to get behind the controls.

Read more: New Developer Platform Could Make Skydio R1 The Go-To Commercial Drone

Flying a drone with eye-tracking glasses

A team of researchers from across the United States is working on technology that could allow drone pilots to control their aircraft with their eyes and a pair of gaze-tracking glasses.

The concept is outlined in the paper ‘Human Gaze-Driven Spatial Tasking of an Autonomous MAV’, written by Liangzhe Yuan, Christopher Reardon, Garrett Warnell, and Giuseppe Loianno, from the University of Pennsylvania, U.S. Army Research Laboratory, and New York University.

The navigation system used by the researchers' drone is self-contained and, importantly, takes instructions relative to the user's orientation. Standard drone controls move the aircraft relative to its own orientation: if you tell a drone to move right with the controls, it will move to its right, regardless of which way the pilot happens to be facing.

So this new system is something of a feat: it’s able to understand both the location and orientation of the drone and its pilot. And it does so without needing an external motion-capture system or GPS.
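To make that concrete, here is a minimal sketch (not the paper's actual implementation) of the kind of frame conversion such a system needs: a command given in the pilot's body frame is rotated into a shared world frame using the pilot's estimated heading. The function name and parameters are illustrative assumptions.

```python
import math

def pilot_to_world(cmd_x, cmd_y, pilot_yaw_rad):
    """Rotate a command expressed in the pilot's body frame
    (x forward, y left) into the world frame, using the
    pilot's heading. Illustrative sketch, not the paper's code."""
    wx = cmd_x * math.cos(pilot_yaw_rad) - cmd_y * math.sin(pilot_yaw_rad)
    wy = cmd_x * math.sin(pilot_yaw_rad) + cmd_y * math.cos(pilot_yaw_rad)
    return wx, wy
```

For example, a "forward" command from a pilot facing 90 degrees left of the world x-axis becomes a move along the world y-axis, so the drone responds to the pilot's frame rather than its own.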

This is all achieved with some readily available hardware: Tobii Pro’s Glasses 2 are used to track the eye movements of the pilot. There are tons of applications for eye-tracking tech; these same glasses are commonly used in advertising and market research, assessing the subconscious attention of potential customers.

The glasses are plugged into an NVIDIA Jetson TX2, an embedded computing module that combines a CPU and GPU. A deep neural network takes the incoming images from the glasses, crunches the numbers, and estimates how far away the drone is based on its perceived size in the image.
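The size-based distance estimate rests on a standard pinhole-camera relationship: an object of known physical size appears smaller in the image the farther away it is. A minimal sketch of that relationship (illustrative only; the function name and values are assumptions, and the paper uses a learned network rather than this closed-form rule):

```python
def distance_from_apparent_size(real_size_m, apparent_size_px, focal_length_px):
    """Estimate range using the pinhole camera model:
    apparent_size_px = focal_length_px * real_size_m / distance.
    Illustrative sketch of the geometric principle only."""
    return focal_length_px * real_size_m / apparent_size_px
```

With an assumed focal length of 1000 pixels, a 0.3 m drone spanning 100 pixels would be estimated at about 3 m away.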

From there, it's just a case of gazing at your chosen location: the glasses translate that gaze data into a navigation vector for the drone.
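One common way to turn a gaze point into such a vector is to back-project the gazed pixel at the estimated depth into a 3D point in the camera's frame, again using the pinhole model. This is a hedged sketch of that step, not the authors' implementation; the intrinsics (fx, fy, cx, cy) are hypothetical calibration values.

```python
def gaze_to_goal(u, v, depth, fx, fy, cx, cy):
    """Back-project a gaze pixel (u, v) at an estimated depth
    into a 3D goal point in the camera frame (pinhole model).
    Sketch under assumed camera intrinsics."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

Gazing at the image center, for instance, yields a goal point straight ahead at the estimated depth.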

The research team has ambitions for their system to one day provide inexperienced pilots with an easy way to pick up the controls (glasses) and fly. We’ll be sure to keep an eye out for more eye-tracking developments in the drone space.

Malek Murison is a freelance writer and editor with a passion for tech trends and innovation. He handles product reviews, major releases and keeps an eye on the enthusiast market for DroneLife.


