Microsoft is offering a way to teach an autonomous drone to fly itself by putting its control software through millions of simulated trials before the first takeoff. The cloud-based simulation platform, Project AirSim, is being made available in limited preview.
Josh Riedy knew it wasn’t real – that he wasn’t actually hovering near the top of a wind turbine in North Dakota, hundreds of feet off the ground. But it didn’t matter when he looked down. His stomach still dropped as if he were on a rollercoaster.
The CEO of Airtonomy was inside a digital replica of a real wind farm at the time, trailing a simulated drone through virtual reality glasses as it inspected towering turbines. His North Dakota-based company has been using these hyper-realistic simulations to train autonomous aerial vehicles that are now inspecting wind farms, surveying wildlife and detecting leaks in oil tanks across the Midwest.
Every one of those AI-powered flights first happened countless times in simulated 3D worlds. Because if AI is the key to building autonomy in the air, data is the key to building AI – data that is impossible to get in the real world, Riedy said.
“You don’t want to fly drones into wind turbines, powerlines or really anything for that matter,” Riedy said. “Coupled with the fact that winter can literally last 7 months in North Dakota, we realized we needed something other than the physical world to design our solutions for customers.”
The answer was Microsoft’s Project AirSim, announced at the Farnborough International Airshow. Project AirSim is a new platform running on Microsoft Azure to safely build, train and test autonomous aircraft through high-fidelity simulation.
In these realistic environments, AI models can run through millions of flights in seconds, learning how to react to countless variables much like they would in the physical world: How would the vehicle fly in rain, sleet or snow? How would strong winds or high temperatures affect battery life? Can the drone’s camera see a turbine’s blades on an overcast day just as well as a clear one?
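Project AirSim itself is in limited preview and its interfaces are not public, but this kind of environment variation can be illustrated with the earlier open-source AirSim, which exposes weather and wind controls through its Python client. A minimal sketch, assuming the `airsim` package and a simulator already running locally:

```python
# Sketch: varying simulated conditions with the earlier open-source AirSim
# Python client (Project AirSim's preview API may differ). Assumes the
# `airsim` package and a simulator instance running on localhost.
import airsim

client = airsim.MultirotorClient()
client.confirmConnection()

client.simEnableWeather(True)

# Step through rain and snow intensities (0.0 = clear, 1.0 = maximum) and
# grab a forward-camera frame at each setting to compare what the drone sees.
for intensity in (0.0, 0.25, 0.5, 0.75, 1.0):
    client.simSetWeatherParameter(airsim.WeatherParameter.Rain, intensity)
    client.simSetWeatherParameter(airsim.WeatherParameter.Snow, intensity)
    frame = client.simGetImage("0", airsim.ImageType.Scene)  # compressed PNG bytes
    with open(f"scene_intensity_{intensity:.2f}.png", "wb") as f:
        f.write(frame)

# Recent open-source AirSim builds also let scripts impose a wind vector (m/s, NED frame).
client.simSetWind(airsim.Vector3r(8, 0, 0))
```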
Project AirSim uses the power of Azure to generate massive amounts of data for training AI models on exactly which actions to take at each phase of flight, from takeoff to cruising to landing. It will also offer libraries of simulated 3D environments representing diverse urban and rural landscapes as well as a suite of sophisticated pretrained AI models to help accelerate autonomy in aerial infrastructure inspection, last-mile delivery and urban air mobility.
Project AirSim is available today in limited preview. Interested customers can contact the Project AirSim team to learn more.
It arrives as advances in AI, computing and sensor technology are beginning to transform how we move people and goods, said Gurdeep Pall, Microsoft corporate vice president for Business Incubations in Technology & Research. And this isn’t just happening in remote areas home to wind farms; with urban density on the rise, gridlocked roads and highways simply can’t cut it as the quickest way to get from Point A to Point B. Instead, businesses will look to the skies and autonomous aircraft.
“Autonomous systems will transform many industries and enable many aerial scenarios, from the last-mile delivery of goods in congested cities to the inspection of downed power lines from 1,000 miles away,” Pall said. “But first we must safely train these systems in a realistic, virtualized world. Project AirSim is a critical tool that lets us bridge the world of bits and the world of atoms, and it shows the power of the industrial metaverse – the virtual worlds where businesses will build, test and hone solutions and then bring them into the real world.”
Accelerating aerial autonomy
High-fidelity simulation was at the heart of AirSim, an earlier open-source project from Microsoft Research that is being retired but inspired today’s launch. AirSim was a popular research tool, but it required deep expertise in coding and machine learning.
Now, Microsoft has transformed that open-source tool into an end-to-end platform that allows Advanced Aerial Mobility (AAM) customers to more easily test and train AI-powered aircraft in simulated 3D environments.
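For context on what a research tool that "required deep expertise in coding" meant in practice: the original open-source AirSim was driven through client libraries rather than a managed service. A simple scripted flight in its Python API looked roughly like the sketch below; this reflects the earlier open-source project, not the new platform's interface.

```python
# Sketch of a scripted flight against the original open-source AirSim's
# Python API (the research tool Project AirSim builds on).
import airsim

client = airsim.MultirotorClient()   # connects to the simulator over RPC
client.confirmConnection()
client.enableApiControl(True)        # let the script, not an RC, fly the drone
client.armDisarm(True)

client.takeoffAsync().join()
# Fly to a waypoint 40 m north at 30 m altitude (NED frame: negative z is up).
client.moveToPositionAsync(40, 0, -30, velocity=5).join()
client.hoverAsync().join()

client.landAsync().join()
client.armDisarm(False)
client.enableApiControl(False)
```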
“Everyone talks about AI, but very few companies are capable of building it at scale,” said Balinder Malhi, engineering lead for Project AirSim. “We created Project AirSim with the key capabilities we believe will help democratize and accelerate aerial autonomy – namely, the ability to accurately simulate the real world, capture and process massive amounts of data and encode autonomy without the need for deep expertise in AI.”
With Project AirSim, developers will be able to access pretrained AI building blocks, including advanced models for detecting and avoiding obstacles and executing precision landings. These out-of-the-box capabilities eliminate the need for deep machine learning expertise, helping expand the universe of people who can start training autonomous aircraft, Malhi said.
Airtonomy, which participated in an early access program for Project AirSim, used it to help customers launch remote inspections of critical infrastructure quickly and safely, without the time, expense and risk of sending a crew to remote locations – and without requiring deep technical backgrounds.
“We create autonomous capture routines for the frontline worker – people who don’t use drones and robots on a regular basis but need them to act like any other tool within their service vehicle,” Riedy said. With Airtonomy, not only does the drone inspect the asset automatically, but the captured data is also contextualized at the moment of capture. These features can be extended to any asset in any industry, enabling novel and automated end-to-end workflows.
“It’s amazing to see those responsible for our nation’s infrastructure use these tools literally with a push of a button and have a digital representation at their fingertips for things like outages, disaster response or route maintenance. Project AirSim is transforming how robotics and AI can be used in an applied fashion,” he said.
Using data from Bing Maps and other providers, Project AirSim customers will also be able to create millions of detailed 3D environments and access a library of specific locations, like New York City or London, or generic spaces, like an airport.
Microsoft is also working closely with industry partners to extend accurate simulation to weather, physics and – crucially – the sensors an autonomous machine uses to “see” the world. A collaboration with Ansys draws on its high-fidelity, physics-based sensor simulations to provide customers with rich ground-truth information for autonomous vehicles. Meanwhile, Microsoft and MathWorks are working together so customers can bring their own physics models to the AirSim platform using Simulink.
As simulated flights occur, huge volumes of data get generated. Developers capture all that data and use it to train AI models through various machine learning methods.
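As a rough sketch of what that capture step can look like, again using the open-source AirSim's Python client (Project AirSim runs its own, Azure-hosted equivalent at much larger scale): uncompressed camera frames and the vehicle's ground-truth pose can be pulled at each step and written out as training records.

```python
# Sketch: logging camera frames plus ground-truth pose for model training,
# using the open-source AirSim Python client.
import csv
import airsim
import numpy as np

client = airsim.MultirotorClient()
client.confirmConnection()

request = airsim.ImageRequest("0", airsim.ImageType.Scene,
                              pixels_as_float=False, compress=False)

with open("poses.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frame", "x", "y", "z"])
    for i in range(100):
        response = client.simGetImages([request])[0]
        # Raw bytes -> HxWx3 array (recent builds return 3-channel BGR),
        # saved alongside the vehicle's ground-truth position.
        img = np.frombuffer(response.image_data_uint8, dtype=np.uint8)
        img = img.reshape(response.height, response.width, 3)
        np.save(f"frame_{i:04d}.npy", img)

        pos = client.getMultirotorState().kinematics_estimated.position
        writer.writerow([i, pos.x_val, pos.y_val, pos.z_val])
```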
Gathering this data is impossible to do in the real world, where you can’t afford to make millions of mistakes, said Matt Holvey, director of intelligent systems at Bell, which also participated in Project AirSim’s early access program. Often, you can’t afford to make one.
Given that, Bell is turning to Project AirSim to hone the ability of its drones to land autonomously. It’s a tough problem. What if the landing pad is covered in snow, or leaves or standing water? Will the aircraft be able to recognize it? What if the rotor blades kick up dust, obscuring the vehicle’s vision? AirSim let Bell train its AI model on thousands of ‘what if’ scenarios in a matter of minutes, helping it practice and perfect a critical maneuver before attempting it in the real world.
The future of autonomous flight
The emerging world of advanced aerial mobility will launch a diverse cast of vehicles into the skies, from hobbyist drones to sophisticated eVTOL (electric vertical take-off and landing) aircraft carrying passengers. And the potential use cases are almost limitless, Microsoft says: inspecting powerlines and ports, ferrying packages and people in crowded cities, operating deep inside cramped mines or high above farmlands.
But technology alone won’t usher in the world of autonomous flight. The industry must also chart a pathway through the world’s airspace monitoring systems and regulatory environments. The Project AirSim team is actively engaged with standards bodies, civil aviation and regulatory agencies in shaping the necessary standards and means of compliance to accelerate this industry.
Microsoft also plans to work with global civil aviation regulators on how Project AirSim might help with the certification of safe autonomous systems, Pall said, potentially creating scenarios inside AirSim that an autonomous vehicle must successfully navigate. In one scenario there is blinding rain, in another strong winds, in another the vehicle loses GPS connectivity. If the vehicle can still get from Point A to Point B every time, Pall said, that could be an important step toward certification.
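None of that certification machinery exists yet, but the idea Pall describes can be sketched as a fixed scenario suite that a candidate autonomy stack must clear end to end. Everything below is hypothetical: the scenario names, the `run_mission` hook and the pass criterion are placeholders, not a Project AirSim or regulatory API.

```python
# Hypothetical sketch of a certification-style scenario suite: the vehicle's
# autonomy stack must complete the same A-to-B mission under every stressor.
# `run_mission` stands in for a simulator run; nothing here is a real API.
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    rain: float          # 0.0 (clear) .. 1.0 (blinding)
    wind_mps: float      # sustained wind speed, m/s
    gps_available: bool  # False = mid-flight GPS loss

SUITE = [
    Scenario("nominal",       rain=0.0, wind_mps=0.0,  gps_available=True),
    Scenario("blinding_rain", rain=1.0, wind_mps=5.0,  gps_available=True),
    Scenario("strong_winds",  rain=0.0, wind_mps=18.0, gps_available=True),
    Scenario("gps_loss",      rain=0.0, wind_mps=3.0,  gps_available=False),
]

def run_mission(scenario: Scenario) -> bool:
    """Placeholder: fly the A-to-B mission in simulation under `scenario`
    and report whether the vehicle reached B within its safety envelope."""
    raise NotImplementedError

def certify(suite=SUITE) -> bool:
    # A candidate passes only if it completes every scenario, every time.
    return all(run_mission(s) for s in suite)
```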
And the industry is moving closer toward scalable commercial operation of autonomous systems. Bell recently used Project AirSim to prepare for NASA’s Systems Integration and Operationalization (SIO) extension project, which aims to accelerate the use of autonomous aircraft in the national airspace system. The company’s Autonomous Pod Transport (APT) aircraft flew through a corridor in the Dallas-Fort Worth area, successfully demonstrating the craft’s ability to maintain contact with ground-based radar monitoring systems.
“AirSim allowed us to get a true understanding of what to expect before we flew in the real world,” Bell’s Holvey said. “It’s going to be one of the tools that will accelerate the timeline for scaling aerial mobility. If we have to test and validate everything by hand, or in a physical lab, or on a flying aircraft, we’re talking about decades, and it’s going to cost billions. But Project AirSim pulls that forward through high-fidelity simulation.”
Ashish Kapoor, the creator of the original AirSim in Microsoft Research, is proud to have helped the simulation engine evolve from a research tool that existed largely in code into a more robust platform that any business can use without a deep technical background. An aviator himself, Kapoor can’t wait to see what that means for the world of flight.
“A ton of data gets generated when an aircraft flies through space in Project AirSim,” said Kapoor, now general manager of Microsoft’s autonomous systems research group. “Our ability to capture that data and translate it into autonomy is going to significantly change the landscape of aviation. And because of that we are going to see many more vehicles in the sky, helping to monitor farms, inspect critical infrastructure and transport goods and people to the remotest of places.”
Source: Press Release