A new system collects and interprets neuro-physiological and performance measures to evaluate new warfighter technologies.
The warfighter’s eyes are locked intently on the remotely piloted aircraft display, perhaps too intently, as the eye-tracking software indicates that other screens are being overlooked. At what point did the controller, tasked with managing several remotely piloted aircraft, become overloaded? Was it a surge of simultaneous demands, the design of the display interfaces, or a combination of both?
These are the questions being answered in the Air Force Research Laboratory’s Human Universal Measurement and Assessment Network (HUMAN) Laboratory. The facility, equipped with state-of-the-art sensors to monitor brain, heart, muscle activity, eye movement, respiration, galvanic skin response, and other body signals, can produce a comprehensive picture of the factors that may affect an operator’s mental state. By collecting and analyzing warfighters’ neuro-physiological and performance data during simulation, the HUMAN Laboratory can identify and assess the precise moments and combinations of stressors that can overload an operator or impede mission performance.
Understanding those limits is critical as unmanned and remotely piloted aircraft platforms and missions continue to evolve; missions could range from a single RPA controlled by a team to multiple RPAs controlled by a single operator. Data on the operator’s state, such as cognitive overload, can help illuminate performance bottlenecks and point to augmentations or mitigations that alleviate them in current or future systems, and could lead to improved designs for the next generation of control stations, such as those for the MQ-X RPA.
Aptima, a company that applies expertise in how humans think, learn, and perform, is supporting AFRL with its A-Measure software, which captures human performance in live, virtual, and constructive environments. In the HUMAN Lab, A-Measure collects and combines the neuro-physiological sensor data and operator performance measures from the simulation to create a detailed picture of the operator’s experience throughout the demands of a mission.
“Researchers can ask pilots about their workload limits, yet those (under-reported) answers can differ considerably from what the brain and body tell us,” said Scott Galster, PhD, Chief of the Neuro-Inspired Adaptive Aiding Section, Applied Neuroscience Branch, Air Force Research Laboratory (711 HPW/RHCP). “By integrating these sensing technologies with other system measures, we get a much better picture of how the human is responding.”
Aptima’s three-year contract with the HUMAN Lab will help AFRL test and refine approaches for assessing warfighter states such as workload, tease out performance bottlenecks, and improve the design of information and weapons systems. Fed back into the product development cycle, these assessments will help improve newly emerging and conceptual technologies that optimize warfighter performance in critical settings.
A-Measure in Action
Using the latest ground control station software in the HUMAN Lab, test subjects are asked to operate several RPAs. As mission demands increase, their performance will at some point begin to decline, and they won’t be able to manage an additional aircraft. To prevent potentially catastrophic mistakes, it’s critical to anticipate such performance declines before they happen. “You wouldn’t gain that understanding without the neuro-physiological data. You cannot fake physiological signals,” added AFRL’s Galster.
A-Measure collects mission performance measures and the raw data from brain and body sensors while the subjects are engaged in simulation, including their eye movement, brain waves, heart rate, skin response, and respiration. The data is interpreted and aligned over time to provide a real-time view of the operator’s state as the mission unfolds.
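To illustrate the kind of processing this involves, the sketch below shows one common way multi-rate sensor streams and simulator events might be placed on a shared one-second timeline. It is a hypothetical Python/pandas example, not A-Measure’s actual implementation; the file names, column names, sampling rates, and mission length are assumptions.

    # Illustrative sketch only -- not A-Measure's actual implementation. It shows
    # one common way to place multi-rate physiological streams (e.g., EEG-derived
    # measures, heart rate, respiration) and simulator events on a shared timeline
    # using pandas. File names, column names, and rates are assumptions.

    import pandas as pd

    def load_stream(path, value_col):
        """Load one recorded stream with a 'timestamp' column in seconds since mission start."""
        df = pd.read_csv(path, usecols=["timestamp", value_col])
        df["timestamp"] = pd.to_timedelta(df["timestamp"], unit="s")
        return df.set_index("timestamp").sort_index()[value_col]

    # Hypothetical recordings, each sampled at its own rate.
    eeg = load_stream("eeg_theta_power.csv", "theta_power")      # high rate
    hr = load_stream("heart_rate.csv", "bpm")                    # ~1 Hz
    resp = load_stream("respiration.csv", "breaths_per_min")     # slow
    tasks = load_stream("sim_events.csv", "tasks_active")        # per mission event

    # Resample everything onto a common one-second grid: average fast signals,
    # carry the last known value forward for slowly changing ones.
    grid = pd.timedelta_range(start="0s", end="30min", freq="1s")
    aligned = pd.DataFrame(index=grid)
    aligned["theta_power"] = eeg.resample("1s").mean().reindex(grid).interpolate()
    aligned["bpm"] = hr.reindex(grid, method="ffill")
    aligned["breaths_per_min"] = resp.reindex(grid, method="ffill")
    aligned["tasks_active"] = tasks.reindex(grid, method="ffill").fillna(0)

    # The aligned frame can then be inspected for moments where task demand rises
    # and the physiological indicators shift together.
    print(aligned.describe())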
“EEGs (electroencephalograms) have been used in research for 50-60 years, but they only tell part of the story. By adding in these other sensors and measures, we gain a more robust, context-dependent picture of what the operator is experiencing, and at what points they are being mentally overloaded,” Galster said.
For example, eye-tracking may show a controller focusing on a single display to the exclusion of other sources of information. That insight could lead to displays or interfaces designed with cues that prevent attentional tunneling, or with alerts that activate when an operator’s performance begins to degrade.
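As a rough illustration of how such a cue might be derived, the hypothetical Python sketch below flags sustained fixation on one area of interest from a stream of gaze samples. The region labels, window size, and dwell threshold are assumptions, not the HUMAN Lab’s actual algorithm.

    # Illustrative sketch only -- an assumed approach to flagging "attentional
    # tunneling" from eye-tracking data. Each gaze sample is tagged with the
    # display region (area of interest) the operator is looking at; the region
    # names, window size, and threshold below are assumptions.

    from collections import Counter, deque

    def tunneling_monitor(gaze_samples, window_size=300, dwell_threshold=0.85):
        """
        Yield (sample_index, region) whenever one area of interest accounts for
        more than `dwell_threshold` of the last `window_size` gaze samples.
        `gaze_samples` is an iterable of region labels, e.g. "map", "video", "status".
        """
        window = deque(maxlen=window_size)
        counts = Counter()
        for i, region in enumerate(gaze_samples):
            if len(window) == window.maxlen:
                counts[window[0]] -= 1       # drop the count of the sample about to be evicted
            window.append(region)
            counts[region] += 1
            if len(window) == window.maxlen:
                top_region, top_count = counts.most_common(1)[0]
                if top_count / window_size > dwell_threshold:
                    yield i, top_region      # candidate moment for a cue or alert

    # Example: an operator fixating almost exclusively on the video feed.
    samples = ["video"] * 290 + ["map"] * 5 + ["video"] * 105
    for idx, region in tunneling_monitor(samples):
        print(f"sample {idx}: sustained fixation on '{region}' display")
        break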
By alleviating excessive cognitive workload, the HUMAN Lab’s efforts will ultimately enable warfighters to optimize performance and successfully accomplish their missions.
Source: Press Release