US Air Force Considers Voice Control for UAS

The next decade could see a huge shift in the way armed unmanned aircraft and their human controllers interact, with potentially profound effects on future battlefields.

At the heart of this change: two-way voice control for autonomous systems, much like the iPhone’s Siri app, plus vibrating haptic feedback like an Xbox controller’s rumble. A UAS operator could literally talk to an unmanned aircraft, the aircraft could talk right back, and it could even alert its human operator with a sensation similar to touch.

Today, human UAS operators rely on clunky interfaces composed of computer screens, keyboards and joysticks to steer their robot charges, which might be thousands of miles away from the virtual cockpits. The operator’s input is limited to keystrokes and mouse and joystick movements transmitted via satellite. The aircraft responds solely with streams of data or visual images sent from its onboard cameras. “It’s a desktop-type environment similar to an office,” explains Mike Patzek, a senior engineer working for the Air Force Research Laboratory in Ohio.

In the next decade or so, much more sophisticated controls — what the Air Force calls “man-machine interfaces” — could replace the desktops, Patzek tells Danger Room. In addition to the Siri-style two-way voice exchange, Patzek says the next-gen controls could include smarter, easier-to-interpret computer displays and tactile feedback from the drone to the operator, much in the way an Xbox controller vibrates to alert a player that he’s taking damage in a game.

Imagine an Air Force UAS operator sitting in front of a single, large computer screen elegantly displaying select data from the distant robot in an intuitive graphical format — say, bits of information laid over a hyper-realistic three-dimensional moving picture stitched together from multiple visual and infrared sensors. The operator simply sits and watches until the robot literally asks for advice, perhaps on which suspicious objects — as determined by its sensors and algorithms — to check out more closely.

At that point the human ‘bot-wrangler states his recommendation and the aircraft swoops down to do its master’s bidding. If the robot detects incoming enemy gunfire, it alerts its boss by causing his chair to shake. The operator can call out, “Evasive action!” and the aircraft banks sharply.
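To make the interaction loop described above concrete, here is a minimal, purely illustrative sketch in Python. It is not based on any actual Air Force or vendor software; every class, event type, and command string is a hypothetical stand-in, and real speech recognition, text-to-speech, and haptic hardware would replace the placeholder print and input calls.

```python
# Hypothetical sketch of the operator/aircraft interaction loop described above.
# None of these classes or commands come from a real system; they only illustrate
# the concept of voice in, voice out, plus a tactile alert.

from dataclasses import dataclass


@dataclass
class AircraftEvent:
    kind: str    # e.g. "suspicious_object" or "incoming_fire"
    detail: str  # short description supplied by the aircraft's sensors


class OperatorStation:
    """Stand-in for the operator's console: voice I/O plus a vibrating chair."""

    def speak(self, message: str) -> None:
        print(f"[aircraft says] {message}")               # would be text-to-speech

    def vibrate(self) -> None:
        print("[chair shakes]")                           # would drive a haptic actuator

    def listen(self) -> str:
        return input("[operator says] ").strip().lower()  # would be speech-to-text


def handle_event(station: OperatorStation, event: AircraftEvent) -> str:
    """Map an aircraft-reported event to a spoken operator decision."""
    if event.kind == "incoming_fire":
        station.vibrate()                                  # tactile alert, Xbox-rumble style
        station.speak(f"Taking fire near {event.detail}. Awaiting instruction.")
    elif event.kind == "suspicious_object":
        station.speak(f"Suspicious object detected: {event.detail}. Investigate?")
    return station.listen()                                # e.g. "evasive action"


if __name__ == "__main__":
    station = OperatorStation()
    command = handle_event(station, AircraftEvent("incoming_fire", "the ridge line"))
    if "evasive" in command:
        print("[aircraft] banking sharply")                # aircraft executes the spoken order
```

The point of the sketch is simply that the operator's side of the exchange shrinks to two channels, speech and touch, while the aircraft decides when to interrupt.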

“The fundamental issue is that the [robotic] systems are going to be more capable and have more automation,” says Mark Draper, a research psychologist for the Air Force Research Laboratory. “The trick is, how do you keep the human who is located in a different location understanding what that system is doing, monitoring and intervening when he or she needs to?”

Perhaps the same way people communicate with each other. By using touch … and their voices.

Photo Credit: US Air Force – Predator Ground Control Station

Source: Wired: Danger Room
