Investigators at the Institute of Theoretical Electrical Engineering and Systems Optimization (ITE) at the Karlsruhe Institute of Technology (KIT) in Germany have been working on a promising approach that does not rely on GNSS.
The premise of the ITE approach is that for future autonomous flight, especially in the difficult indoor environments of search and rescue (SAR) missions such as a building fire, GNSS signal reception may be poor or nonexistent. Most UAVs, however, are already equipped with GNSS and inertial sensors, so aiding the inertial solution with a backup system is the preferred route. ITE chose a monocular camera and a 2D laser rangefinder, combined into a hybrid laser-camera sensor, for navigation aiding.
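The article does not describe ITE's fusion filter in detail, but the aiding idea can be illustrated with a minimal one-dimensional sketch in which a drifting inertial position estimate is periodically corrected by a position fix from an external sensor. The numbers and the fixed blending gain below are illustrative assumptions, not ITE's actual parameters.

```python
# Hypothetical 1D illustration (not ITE's actual filter): an inertial position
# estimate drifts because of a small uncompensated accelerometer bias, and a
# once-per-second position fix from an aiding sensor (camera/laser) reins it in.

dt = 0.01            # IMU sample period [s]
bias = 0.05          # accelerometer bias [m/s^2]; vehicle is actually at rest
n_steps = 1000       # 10 s of flight at 100 Hz

def dead_reckon(aided: bool) -> float:
    pos = vel = 0.0
    for k in range(n_steps):
        vel += bias * dt                     # strapdown integration of biased accel
        pos += vel * dt
        if aided and (k + 1) % 100 == 0:     # aiding update once per second
            innovation = 0.0 - pos           # aiding sensor observes true position (0 m)
            pos += 0.8 * innovation          # fixed gain, stand-in for a Kalman gain
            vel += 0.2 * innovation          # crude velocity correction
    return pos

print(f"unaided drift after 10 s: {dead_reckon(False):.2f} m")
print(f"aided error after 10 s:   {dead_reckon(True):.2f} m")
```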
The camera and laser rangefinder were first calibrated by observing a single object from several adjacent locations, thereby determining the relative attitude and translation between the two sensors. Basic navigation without GNSS is established using the acceleration and angular-rate measurements provided by the IMU, but inertial drift rapidly degrades accuracy, so aiding is essential.
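To illustrate what the calibration provides, the sketch below applies an assumed laser-to-camera rotation and translation to the returns of one 2D scan and projects them into the image with a pinhole model. The intrinsics and extrinsics shown are placeholder values, not ITE's calibration result.

```python
import numpy as np

# Once the camera-laser extrinsics (attitude R and translation t) are known,
# laser rangefinder returns can be expressed in the camera frame and projected
# into the image. R, t and the intrinsics K below are placeholders.

K = np.array([[500.0, 0.0, 320.0],     # pinhole intrinsics: fx, fy, principal point
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
R = np.array([[0.0, -1.0, 0.0],        # laser frame (x fwd, y left, z up) to
              [0.0,  0.0, -1.0],       # camera frame (x right, y down, z fwd)
              [1.0,  0.0, 0.0]])
t = np.array([0.05, 0.00, 0.10])       # laser origin in the camera frame [m]

# One 2D laser scan: ranges and bearings in the laser's scan plane.
ranges = np.array([2.0, 2.1, 2.3])
angles = np.deg2rad([-10.0, 0.0, 10.0])
pts_laser = np.stack([ranges * np.cos(angles),
                      ranges * np.sin(angles),
                      np.zeros_like(ranges)], axis=1)

pts_cam = pts_laser @ R.T + t          # apply the calibrated extrinsic transform
uv = pts_cam @ K.T
uv = uv[:, :2] / uv[:, 2:3]            # perspective division -> pixel coordinates
print(np.round(uv, 1))
```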
The aiding solution integrates several components. The camera provides an initial "keyframe" from which relative motion can be derived: each subsequent image yields an estimate of the camera's motion relative to that keyframe.
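ITE's visual odometry pipeline is not detailed in the article; the sketch below shows a standard way such keyframe-relative motion is estimated from a monocular camera, using OpenCV feature matching and the essential matrix. The rotation is recovered fully, but the translation only up to an unknown scale, which is one common reason for pairing a single camera with a range sensor. The intrinsics, function name, and file names are hypothetical.

```python
import cv2
import numpy as np

# Sketch of keyframe-relative motion estimation with a monocular camera
# (generic OpenCV pipeline, not necessarily ITE's implementation).

K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])        # placeholder camera intrinsics

def relative_pose(keyframe: np.ndarray, current: np.ndarray):
    orb = cv2.ORB_create(1000)             # detect and describe features
    kp0, des0 = orb.detectAndCompute(keyframe, None)
    kp1, des1 = orb.detectAndCompute(current, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des0, des1)
    pts0 = np.float32([kp0[m.queryIdx].pt for m in matches])
    pts1 = np.float32([kp1[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts0, pts1, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts0, pts1, K, mask=mask)
    return R, t                            # rotation and unit-norm translation direction

# Example usage (image files are hypothetical):
# keyframe_img = cv2.imread("keyframe.png", cv2.IMREAD_GRAYSCALE)
# current_img = cv2.imread("frame_0042.png", cv2.IMREAD_GRAYSCALE)
# R, t = relative_pose(keyframe_img, current_img)
```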
The next phase verified the initial performance of the inertial/hybrid solution by flying the UAV down a corridor towards a wall. Horizontal position accuracy began to degrade about 67 seconds into the flight.
Corridor test.
The next, more challenging, demonstration involved flying down the corridor, into an adjacent room, and out through a different exit. Both hybrid aiding and laser-scan-matching aiding were evaluated on this trajectory.
Corridor-room test.
The hybrid approach appeared to satisfy the anticipated test constraints very accurately, with a deviation of about 0.8 over the 274-second flight, while the laser-scan-matching approach showed a horizontal error between start and end points of about 3.7. The researchers felt that the structured environment of the test rooms presented challenges for laser scan matching, producing vertical errors that depended on the UAV's attitude, whereas the hybrid solution overcame these problems.
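A start-to-end deviation of this kind is typically obtained by taking off and landing at the same spot and measuring the horizontal distance between the first and last estimated positions; the article does not state the exact procedure or units, so the sketch below uses made-up trajectory values purely to show the metric.

```python
import numpy as np

# Start-to-end horizontal deviation of an estimated trajectory
# (illustrative values only; columns are x, y, z in the navigation frame).
trajectory = np.array([
    [0.00, 0.00, 0.0],
    [5.10, 0.20, 1.2],
    [5.30, 4.90, 1.3],
    [0.40, 4.70, 1.1],
    [0.60, 0.30, 0.0],
])
horizontal_error = np.linalg.norm(trajectory[-1, :2] - trajectory[0, :2])
print(f"start-to-end horizontal deviation: {horizontal_error:.2f}")
```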
The conclusion from the testing was that the hybrid sensor's performance was not limited by the structured test environment, so missions in more challenging environments could in future be navigated better with the hybrid system than with existing laser-scan-matching approaches. The researchers now intend to focus on better perception of the environment: for exploration missions, accurate positioning is crucial, but an accurate representation of the environment is also necessary, and for this the hybrid sensor is a promising tool.