Engineers are mimicking the neural networks of birds to create artificially intelligent drones that could bolster our air defences.
A fighter jet approaching enemy territory opens the hold to release a dozen little drones. They’re small and nimble – switching from fixed wing to mini helicopter-style rotors as they swarm around their target, while their host fighter jet and pilot stand a safe distance away.
After disabling the hostile anti-aircraft missile systems by jamming their radars, the drones reassemble like quoits caught on a pole to be collected safely, ready for their next mission, and the fighter jet flies on.
Like little baby birds, the drones have learned to fly by trial and error, trying out their fledgling skills in a safe environment, improving as they go, says Professor Nick Colosimo, one of BAE Systems’ lead technologists. As with many good ideas to do with artificial intelligence and machine learning, he says, inspiration has come from nature – mimicking the neural networks of birds.
Of course, you won’t see these drones in action yet, but the technology – adaptive learning – is almost there, he says. Within the next decade, armed forces could be using adaptable unmanned aerial vehicles (UAVs), which can switch between fixed and rotary wing, and are smart enough to tackle sophisticated air defences. It’s just a question of perfecting the application.
“Around the world, defence companies are exploring the art of the possible with this technology,” says Prof Colosimo. Smart drones might be used to detect the edge of a poison gas cloud and relay this back to a human commander – or counter some other threat in the future we don’t even know about yet.
“This would allow combat aircraft to get as close as possible to a threatening environment. Frankly if you lose all the drones, it wouldn’t matter – the crew would stay safe,” he continues.
Fresh ideas
These two ideas – hybrid drones capable of learning and the neat way of gathering them up again – were the result of collaborating with autonomous systems postgraduates at Cranfield University, who created the relevant algorithms. BAE Systems already has strategic partnerships with five UK universities, partly because the defence company appreciates a fresh view.
“Students bring new ideas that we, as seasoned engineers, might not even think of,” explains Prof Colosimo. “They’ve grown up with digital assistants such as Siri and bring a different way of looking [at things]. Having that diversity of thought in the future will be critical.”
How much we are comfortable leaving in the hands of smart or automated systems is up for debate, says Prof Colosimo: “Our view is that autonomy is there as an aid, but not a means of control.”
There’s no conflict in simply using UAVs to spot and relay information about threats back to a human commander, for instance – this might be tanks amassing on a border, or adversaries building new and sophisticated weapons. “Here we’re talking about teaming manned and unmanned technology – trying to help the human decision makers rather than taking away their decision-making power.”
But the challenge will be that this same technology may also be deployed by adversaries. “They may field highly autonomous systems, including weaponised commercial technology, where every decision is delegated to a machine – and that’s a concern,” he says.
However, the real challenge with autonomous systems is no longer technological: “It comes down to what’s acceptable, legally, ethically. How can we preserve our values while having the defence capabilities to thwart our adversaries?”
Source: The Telegraph