Teaching teams of drones to work alongside humans and nature

The Robust Adaptive Systems Lab just might have the highest ceilings of any laboratory on the Carnegie Mellon campus. It’s big and drafty, and were it not for the rows of computer workstations dotting the room, it might easily be mistaken for an airplane hangar. The centerpiece is a scaffolding structure surrounded on all four sides by netting that looks more like a batting cage than a serious piece of lab equipment at one of the world’s foremost robotics schools.

But when the demo starts, everyone stops and watches. Fifteen pocket-sized drones are lined up, three by five. All at once, bright lights flash beneath each drone, moments before the craft suddenly lift off in unison. The drones split into smaller groups and explore the boundaries of the cage before converging again and moving around as one. The batting cage is, in fact, a motion capture arena, designed to track the quadcopters’ choreographed movement with pinpoint accuracy.

The whole thing is mesmerizing. Like a real-world, robotic kaleidoscope, set to the soundtrack of dozens of tiny whirring propellers. It’s the final demo on a packed day’s trip to CMU’s Pittsburgh campus, and it may well be the most visually impressive. That’s by design. The team has been putting its research to use creating drone performances. One popped up in a recent music video by Norwegian electro-pop singer Aurora, and the team is working with CMU’s art department to employ the drones as part of an upcoming live theatrical performance.

Grad student Ellen A. Cappo admits that the drone choreography was prepared in advance for the sake of our cameras, but the team is working on something far more complex. “I’m looking at examples where you don’t know ahead of time what you want to do,” she explains. “If you’re using the code base that we’re working on here, Lady Gaga could have gestured her hands and directed all the robots in real time [at the Super Bowl], as part of her performance, in an improvisational manner.”

Cappo’s work centers on the relationship between humans and robots. The team is building complex systems capable of responding to user controls while correcting for human errors and, hopefully, heading off disaster before it occurs. Cappo’s flight planner is designed to do much of the heavy lifting on the backend, determining the best way to execute commands without running afoul of the problems that arise when a single operator attempts to pilot many systems at once.

“We’re taking the intent, but dealing with all the low-level questions that the human user doesn’t want to think about, or that are difficult for the human user to think about,” explains Cappo. “Like making sure the motors don’t saturate, because the human told the robot to go somewhere more quickly than the robot is physically capable of.”
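
To make that concrete, here is a minimal sketch, in Python, of the sort of feasibility check Cappo describes. The limits and function names here are hypothetical illustrations, not the lab’s actual planner; the idea is simply to preserve the operator’s intent while clipping commands to what the airframe can physically do.

```python
import numpy as np

# Hypothetical per-vehicle limits; real values depend on the airframe.
MAX_SPEED = 2.0  # m/s
MAX_ACCEL = 4.0  # m/s^2; keeps commanded thrust below motor saturation

def clamp_norm(vec, limit):
    """Scale a vector down so its magnitude never exceeds `limit`."""
    norm = np.linalg.norm(vec)
    return vec if norm <= limit else vec * (limit / norm)

def feasible_setpoint(position, velocity, target, dt):
    """Turn a raw operator command into one the drone can actually fly.

    The operator's intent (reach `target`) is preserved, but the
    commanded velocity and acceleration are clipped so the motors are
    never asked for more thrust than they can produce.
    """
    desired_vel = clamp_norm((target - position) / dt, MAX_SPEED)
    desired_acc = clamp_norm((desired_vel - velocity) / dt, MAX_ACCEL)
    return velocity + desired_acc * dt  # next velocity setpoint
```

In a real planner, a check like this would run per vehicle, per timestep, with the unmet remainder of the command carried forward rather than discarded, so the drone still arrives where the operator pointed, just no faster than physics allows.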

The system could one day help industries deploy drones in groups to perform inspections or assist with disaster recovery: the sorts of difficult, dirty and dangerous jobs analysts so often cite when discussing the importance of automation in the future of the job market. The system isn’t limited to drones, either. It could potentially be applied to all manner of robots operating in groups. But as it happens, the consumer drone market is playing a major role in making the technology accessible to university research groups.

“The hobbyist market has come really far in recent years,” explains Cappo. “So these devices are fairly affordable, but they have enough computational power on board for something like this, where I want to study the interaction between multiple systems. I can buy something that is computationally advanced enough to execute the controls I need, but cheap enough for me to buy in bulk.”

Labmate Vishnu Desaraju, a CMU PhD student, employs a larger variety of drone for his demo. His, too, takes place in the motion capture arena, but this time the team fires up eight high-powered floor fans, stacked atop one another in various configurations. It’s a makeshift system designed to mimic natural wind.

Like Cappo’s system, the second demo is designed to help drones avoid disaster in real-world settings, only instead of correcting human error, Desaraju’s work focuses on the chaotic effects nature inflicts on a drone in flight.

“There are a number of techniques that we can use to control the vehicles,” explains Desaraju. “But a lot of these assume that vehicles are operating in very nice conditions. So, how do we handle conditions that are not so nice, like really windy conditions, or cases where you have a lot of uncertainty to account for? What I’m looking at is how we actually learn new control strategies and policies as we go, but do so in a rigorous manner, so we don’t crash the vehicle as we’re trying to learn.”
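
One way to picture that caution, sketched below in Python with made-up gains (the lab’s actual method is considerably more rigorous): estimate the disturbance online, but hard-bound the correction fed to the controller, so a bad estimate can never claim more than a sliver of the vehicle’s control authority.

```python
import numpy as np

class CautiousDisturbanceEstimator:
    """Toy online estimate of a wind-like disturbance acting on a drone.

    The estimate is nudged toward the observed model error each step,
    but the correction actually applied is hard-bounded, so a wrong
    estimate cannot saturate the motors while the system is learning.
    """

    def __init__(self, learning_rate=0.1, max_correction=1.0):
        self.wind = np.zeros(3)               # estimated disturbance accel (m/s^2)
        self.learning_rate = learning_rate    # small on purpose: learn slowly, stay safe
        self.max_correction = max_correction  # cap on applied compensation (m/s^2)

    def update(self, predicted_accel, measured_accel):
        """Nudge the estimate toward the acceleration the model missed."""
        self.wind += self.learning_rate * (measured_accel - predicted_accel)

    def correction(self):
        """Bounded feedforward term that pushes against the estimated wind."""
        norm = np.linalg.norm(self.wind)
        capped = self.wind if norm <= self.max_correction else self.wind * (self.max_correction / norm)
        return -capped
```

The deliberately small learning rate and the hard cap are what keep the learning from crashing the vehicle: the estimate improves a little on every flight rather than betting everything on one noisy measurement.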

The demo lacks the impressive visual scale of the one that preceded it, but there’s a fascinating, subtle element visible as the drone slowly adapts to its surroundings. On its first flight, the drone wobbles, unsure of how to respond to its environment. But with each flight it gets better, gradually adjusting to the external forces.

“We were looking at different flight and control strategies,” says Desaraju. “We realized that control was really the limiting factor, because we don’t have a good way to compensate for these types of disturbances. There are existing techniques that do try to address this, but they’re looking at one particular instance. No matter what sort of circumstance we’re working in, we want to be able to learn a custom set of behaviors for the scenario.”

Like Cappo’s demo, Desaraju’s system could offer an important level of support for those looking to employ drones in the workplace, from surveyors to photographers. Or, for that matter, Lady Gaga.