
Researchers developing mind-controlled drones

SAN ANTONIO -- With sensors covering his head, University of Texas at San Antonio graduate student Mauricio Merino concentrated hard. A camouflage-colored drone hovered with a soft hum in the middle of a campus research lab.

For now, though, it was fellow graduate student Prasanna Kolar who stood nearby to operate the unmanned aerial vehicle, also called a UAV, with a cellphone app, gently commanding it left and right.

The ultimate goal: Create a process to control the movements of groups of drones with nothing more than a thought, Daniel Pack, chairman of UTSA's electrical and computer engineering department, told the San Antonio Express-News.

The newly launched research comes at the intersection of two new sources of funding. A team of researchers from the Unmanned Systems Laboratory in the university's electrical and computer engineering department recently scored a $300,000 contract from the Office of the Secretary of Defense to investigate how soldiers could use their brain signals to operate drones for intelligence, surveillance and reconnaissance missions.

A separate $400,000 Defense Department grant allowed the school to buy two high-performance electroencephalogram, or EEG, systems. These provide a noninvasive way to measure brain waves.

Six professors in various departments, including the drone researchers, will use the EEG systems for projects studying brain-machine interaction.


Pack said his research might help the Army lighten an already heavy load for soldiers in the field.

"It becomes more burdensome to ask them to carry more things," Pack said. "You have to have a computer or a mechanism that you use to control the UAVs. But if you can do this without having them actually carry additional equipment and using brainwaves directly, then you are helping our soldiers."

Pack envisions drone operators wearing EEG sensors in their helmets and giving commands far more complicated than a simple "move left" or "move right."

For instance, he wants a soldier in the field to be able to scout for enemies by commanding a group of drones to "go over the hill and see what's up there." Then the soldier might receive information back from the drones through something akin to Google Glass.

"Multiple UAVs will autonomously, amongst themselves, say, 'You do this. I do this.' And they will either take pictures of it or get a situational awareness" of what lies behind the hill, all because of a command from a single thought, Pack said.

People may have different brain waves for the same command, so researchers will have to "minimize the differences and maximize the similarities" between brain waves and develop ways to translate them into machine commands to make the concept work, Pack said.

UTSA electrical and computer engineering Professor Yufei Huang said the sensors that covered Merino's head measure electrical signals generated by brain activity. When measured, the signals take on different shapes, which in many cases are unique to a person's brain state, Huang said. By analyzing the shape of the signal generated by the brain, researchers should be able to associate it with a particular brain activity, he said.

They plan to build computer algorithms to translate the brain signals into commands a drone can recognize, Huang said.
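Huang's description amounts to a standard brain-computer-interface pipeline: extract features from each window of EEG data, then classify those features into a small vocabulary of commands a drone can act on. The sketch below illustrates that general approach in Python with scikit-learn; the band-power features, the linear discriminant classifier, and the command labels are assumptions chosen for demonstration, not the UTSA team's actual algorithm.

```python
# Illustrative sketch of a generic EEG-to-command decoder, not the UTSA method.
# Feature choice (band power), classifier (LDA), and command labels are assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

COMMANDS = ["hover", "move_left", "move_right", "scout_ahead"]  # hypothetical labels

def band_power_features(window, fs=256):
    """Average power per channel in common EEG bands (delta, theta, alpha, beta)."""
    freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / fs)
    power = np.abs(np.fft.rfft(window, axis=1)) ** 2
    bands = [(1, 4), (4, 8), (8, 13), (13, 30)]
    feats = [power[:, (freqs >= lo) & (freqs < hi)].mean(axis=1) for lo, hi in bands]
    return np.concatenate(feats)  # one flat feature vector per window

def train_decoder(X_train, y_train):
    """X_train: (n_windows, n_channels, n_samples) EEG windows recorded while a
    user thinks each command; y_train: indices into COMMANDS."""
    feats = np.array([band_power_features(w) for w in X_train])
    clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
    clf.fit(feats, y_train)
    return clf

def decode(clf, window):
    """Map a new EEG window to a drone command string."""
    return COMMANDS[int(clf.predict([band_power_features(window)])[0])]
```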

In general, drones are used for "dirty, dangerous or dull" activities, Pack said.

But applications in which robots interact with humans could also benefit from the research, for example in smart homes (picture a next-generation Roomba) or in assisting people with disabilities.

"For people who don't have motor skills, for people in wheelchairs, this could be so helpful for them," Kolar said. He imagines a person able to move his or her wheelchair by that person's thoughts.
