UAS already beam thousands of hours of video to intelligence analysts manning multiple screens and send information directly to soldiers in the field. This dramatic growth is leading to information overload.
It won’t stop there: the military wants individual soldiers not only to drive tanks, watch for enemies and listen for orders, but also someday to command roving bands of robot scouts. Multitasking could lead to a mental meltdown on future battlefields where ever-fewer humans control a growing swarm of UAS and robots.
“Throughout the Department of Defense, there’s a trend to invert the existing human-to-robot ratio so that a single operator is managing several vehicles,” said Raja Parasuraman, a psychologist at George Mason University in Fairfax, Va. “That adds exponentially to the load on an operator if each platform has multiple sensors and video feeds on it, all beaming information to the operator.”
Military scientists and outside researchers have begun tackling the problem. The military is testing the human brain’s multitasking limits so that it can create technological aids, or readjust expectations for what its warriors can do.
One answer may come from smarter computer programs. Some could screen surveillance footage for human or vehicle targets. Others might help tank gunners scan the horizon for targets or issue navigation commands to future robot swarms.
“Rather than have a human operator look at everything, a system might look at video or images, figure out which are low-priority, and leave high-priority targets for humans to look at,” Parasuraman told InnovationNewsDaily. “It’s like having a smart junior assistant.” Parasuraman heads a neuroergonomics center funded by the Air Force, where he uses computer simulations to test the limits of human multitasking on the battlefield. A few lucky “cognitive superstars” don’t suffer from any problem in multitasking, Parasuraman said, but most people don’t perform as efficiently when jumping back and forth between different duties.
Soldiers tend to fall into groups with either high or low ability to control their attention, according to research at the U.S. Army Research Laboratory. Researchers ran simulations in which soldiers had to talk with commanding officers or neighboring units, monitor intelligence coming in over a military network, and scan their immediate surroundings for enemies.
If the military can identify people who better handle the frenzied multitasking of modern warfare, it could someday handpick the best performers for certain jobs. For now, better understanding of multitasking can help create systems that work for the average soldier.
“Regardless of individual human factors, you want to make systems usable by any soldier,” said Jessie Chen, a research psychologist at the U.S. Army Research Laboratory. The Army lab also developed its own software, called IMPRINT, to model the information overload soldiers face in their daily duties. That has helped overhaul Army expectations of how much soldiers can do at any given time.
Even defense contractors have begun to take such human limits into account early on when designing new military vehicles, weapons or other systems — a step that could save on the costs of having to make fixes later in development.
“What we’re really trying to do in our work is to make the soldier ready and reliable in the sense that we’re not overloading them,” said Diane Mitchell, an engineering psychologist at the U.S. Army Research Laboratory. “We need to keep them able to do their mission with manageable workloads.”
Future automated assistants could share the workload and allow human soldiers to focus on their main mission. Chen and her colleagues simulated two such cases: first, a “gunner” scenario in which an automated system assisted a human operator in finding targets for a tank’s main gun; second, a “RoboLeader” scenario in which a computer program helped a human operator control a swarm of robots.
The Army researchers added a twist by having the automated assistants make different mistakes. They found that soldiers less skilled at multitasking doggedly relied upon the error-prone automated systems, even when it hurt their performance.
The best multitasking soldiers made a different mistake. Once they realised the automated system made mistakes, they tended to ignore it entirely during the “gunner” scenario — even in cases where they might have done better with the error-prone system’s help.
Still, soldiers more willingly used automated help in the “RoboLeader” scenario, because the user interface let them quickly double-check whether “RoboLeader” was right or wrong. That provided a valuable lesson on how to design future automated systems.
Military analysts who sift through video and images collected by UAS may soon get automated help. A system funded by the Office of Naval Research can automatically detect and track moving targets such as enemy vehicles as well as individual soldiers or insurgents.
The automated technology can use sensor data from either manned or unmanned aircraft, said Dale Linne von Berg, head of the applied optics branch at the U.S. Naval Research Laboratory in Washington, D.C. His lab has successfully tested the system and is looking to deploy it.
“The anticipated scenarios using this technology really boil down to reducing operators’ workloads, reducing the time it takes for them to exploit targets, and allowing our forces to find and track targets that they may not have previously been able to prosecute,” Linne von Berg said in an e-mail.
Automated helpers will need to get better even as UAS do more by themselves. George Mason’s Parasuraman has already put Air Force pilots through simulations where they manage five or six drones while getting information from 30 or 40 different sources in chat windows.
“The Air Force is interested in scaling up that problem in the simulations,” Parasuraman said. “If we have 100 operators and thousands of UAS and a network of many hundreds of computer systems and people sending info, what’s the breaking point?”
Source: MSNBC