With Department of Defense funding, Ashraf Qadir and colleagues at the University of North Dakota in Grand Forks have built their own image-processing machine, which is small and light enough to be carried by a small UAS.
They say their device is capable of tracking objects such as cars and houses in real time without the need for number crunching on the ground.
The way these guys have solved this problem is to simplify it and then solve the simplified puzzle. They point out that from a plane, objects on the ground such as cars and houses do not generally change shape.
However, they do change their orientation and position relative to the camera. So their object-tracking programme essentially solves just these two problems. First, it uses the motion of the object in the previous frames to predict where it is going to be in the next frame. That’s fairly straightforward.
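The prediction step can be sketched with a constant-velocity model: assume the target keeps moving at roughly the rate seen over the last two frames. This is an illustrative assumption, not necessarily the exact predictor Qadir and co use, and the function name `predict_next` is hypothetical.

```python
def predict_next(positions):
    """Predict the next (x, y) position from a history of past positions,
    assuming roughly constant velocity between frames (an assumption --
    the paper's exact predictor may differ)."""
    if len(positions) < 2:
        return positions[-1]  # no motion history yet
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    # Velocity = displacement over the last frame; extrapolate one frame ahead.
    return (x1 + (x1 - x0), y1 + (y1 - y0))

# A target drifting right at about 5 pixels per frame:
track = [(100, 50), (105, 50), (110, 51)]
print(predict_next(track))  # -> (115, 52)
```

The predicted position narrows the search window in the next frame, which is a big part of why the system can run at full frame rate on a small onboard board.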
Second, it uses a remarkably simple process to follow the object as it rotates. When the onboard computer first finds its target, it uses a simple image-processing programme to create a set of images in which the object is rotated in 10-degree steps. That produces a library of 36 pictures showing the object in every orientation.
So the process of following the target is simply a question of matching it to one of those images. Qadir and co have developed a simple protocol to optimize this process.
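The rotation-library idea can be sketched as follows. Here the "object" is reduced to a handful of feature points rather than an image patch, and the match score is a plain sum of squared distances; both are stand-ins for whatever image-similarity measure the real system uses, and all the function names are hypothetical.

```python
import math

def rotate(points, angle_deg):
    """Rotate a set of (x, y) points about the origin by angle_deg."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    return [(x * c - y * s, x * s + y * c) for x, y in points]

def build_library(template, step=10):
    """Precompute the template at every orientation: 360 / step entries,
    so the default 10-degree step gives the 36 pictures described above."""
    return {ang: rotate(template, ang) for ang in range(0, 360, step)}

def best_orientation(observed, library):
    """Return the library angle whose rotated template best matches the
    observed points (lowest sum of squared point-to-point distances)."""
    def cost(pts):
        return sum((px - ox) ** 2 + (py - oy) ** 2
                   for (px, py), (ox, oy) in zip(pts, observed))
    return min(library, key=lambda ang: cost(library[ang]))

# An L-shaped "object", then the same shape seen rotated 30 degrees:
shape = [(0, 0), (2, 0), (0, 1)]
library = build_library(shape)          # 36 orientations, 10 degrees apart
seen = rotate(shape, 30)
print(best_orientation(seen, library))  # -> 30
```

Because the library is built once, at lock-on, each subsequent frame costs only 36 cheap comparisons, which is what keeps the tracking loop light enough for onboard hardware.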
They’ve tested the approach both in the lab and in the air using a Linux computer on a single printed circuit board plus a small camera and gimballing system. All this is carried on board the university’s customised UAS called Super Hauler, which has a wingspan of 350 centimeters and a payload capacity of 11 kilograms.
The system worked well in tests. The UAS has an air-to-ground video link that allows an operator to select a target such as a car, building, or in these tests, the group’s control tent. The onboard computer then locks onto the target, generates the image library and begins tracking.
From an altitude of 200 meters or so, Qadir and co say the system works well at frame rates over 25 frames per second, which is essentially real time.
The system has some limitations. Following a single vehicle is obviously much easier than selecting and following one of many in traffic, for example. Similarly, station keeping over a single tent in a field is relatively straightforward compared to the same problem in suburbia, where all the houses look the same.
These guys have a proof-of-principle device that could be deployed cheaply and widely. The Super Hauler isn’t quite in the ‘toy’ department yet, but it isn’t hard to imagine how a version of this kind of software and hardware could be deployed in cheap UAS elsewhere in the near future.
Top Photo: UND Super Hauler #2 with designer, builder, and test pilot Bruce Tharpe. Credit: Bruce Tharpe Engineering
Source: MIT Technology Review