Hyperspectral cameras and UAS monitor coral bleaching

QUT’s remote sensing and unmanned aerial vehicle (UAV) experts are partnering with the Australian Institute of Marine Science (AIMS) to test whether small drones, machine learning and specialised hyperspectral cameras can monitor the Great Barrier Reef more quickly, more efficiently and in more detail than manned aircraft and satellite surveys.

QUT’s project leader Associate Professor Felipe Gonzalez said the team surveyed three reefs in the Great Barrier Reef Marine Park from 60 metres in the air while AIMS divers recorded precise levels of coral bleaching from under the water.

“By taking readings from the air and verifying them against the AIMS data from below the surface, we are teaching the system how to see and classify bleaching levels,” said Professor Gonzalez, an aeronautical engineer from QUT’s Institute for Future Environments and the Australian Centre for Robotic Vision.

“Flying 60 metres above the water gives us a spatial resolution of 9.2 centimetres per pixel, which we’ve found to be more than enough detail to detect and monitor individual corals and their level of bleaching.
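
The 9.2-centimetre figure follows from the standard ground sample distance (GSD) relation between flight altitude, detector pixel pitch and lens focal length. The minimal sketch below illustrates the arithmetic only; the pixel pitch and focal length used are illustrative assumptions, not published specifications of the camera flown in this project.

```python
# Ground sample distance (GSD): the ground distance covered by one pixel.
#   GSD = altitude * pixel_pitch / focal_length
# The sensor values below are illustrative assumptions chosen to roughly
# reproduce the quoted 9.2 cm/pixel at a 60 m flight altitude.

def gsd_metres(altitude_m: float, pixel_pitch_m: float, focal_length_m: float) -> float:
    """Return ground sample distance in metres per pixel."""
    return altitude_m * pixel_pitch_m / focal_length_m

if __name__ == "__main__":
    altitude = 60.0          # flight altitude (metres)
    pixel_pitch = 12.3e-6    # assumed detector pixel pitch (metres)
    focal_length = 8.0e-3    # assumed lens focal length (metres)
    print(f"GSD ≈ {gsd_metres(altitude, pixel_pitch, focal_length) * 100:.1f} cm/pixel")
    # -> GSD ≈ 9.2 cm/pixel
```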

“This is great news for us because low-altitude drones can cover far more area in a day than in-water surveys and they’re not hampered by cloud cover as manned aircraft and satellites are – a system like this has the real potential to boost the frequency of monitoring activities in an economical way.

“The more data scientists have at their fingertips during a bleaching event, the better they can address it. We see small drones with hyperspectral cameras acting as a rapid response tool for threatened reefs during and after coral bleaching events.”
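
“Teaching the system” in this way is, in essence, supervised per-pixel classification: aerial spectra are labelled with the bleaching levels recorded by the AIMS divers, and a model learns the mapping from spectrum to bleaching level. The sketch below is a minimal, generic illustration of that workflow; the array layout, the four-level label scheme and the choice of a random forest are assumptions made for the example, not details of the QUT system.

```python
# Minimal sketch: supervised per-pixel bleaching classification.
# Assumptions (not from the source): spectra are stored as an array of shape
# (n_pixels, n_bands), with diver-verified bleaching levels as integer labels;
# a random forest stands in for whatever model the project actually uses.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

n_pixels, n_bands = 5000, 270                       # 270 bands, as described in the article
spectra = np.random.rand(n_pixels, n_bands)         # placeholder reflectance spectra
labels = np.random.randint(0, 4, size=n_pixels)     # e.g. 0 = healthy ... 3 = severely bleached

X_train, X_test, y_train, y_test = train_test_split(
    spectra, labels, test_size=0.25, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)                         # learn spectrum -> bleaching level
print(classification_report(y_test, model.predict(X_test)))
```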

Roughly the size of Japan, the Great Barrier Reef is home to around 3,000 reefs stretching 2,300 kilometres, making it slow and costly to survey using traditional methods.

Hyperspectral Camera

Key to the new aerial system are miniaturised hyperspectral cameras, which until recently were so large and expensive that only satellites and manned aircraft could carry them.

Standard cameras record images in three bands of the visible spectrum – red, green and blue – mixing those bands together to create colours as humans see them.

Professor Gonzalez said the hyperspectral camera, by comparison, captures 270 bands in the visible and near-infrared portions of the spectrum, providing far more detail than the human eye can see and at an ultra-high resolution.
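
Such a frame is typically handled as a three-dimensional data cube: two spatial axes plus a spectral axis of 270 bands, so every pixel carries a full reflectance spectrum rather than three colour values. The toy sketch below simply shows that layout and how one pixel’s spectrum is indexed out of it.

```python
# Toy illustration: a hyperspectral frame as a (rows, cols, bands) data cube.
# Real data would come from the camera's capture software; random values
# stand in here purely to show the indexing.
import numpy as np

rows, cols, bands = 480, 640, 270      # 270 visible/near-infrared bands
cube = np.random.rand(rows, cols, bands)

pixel_spectrum = cube[100, 200, :]     # full 270-band spectrum at one pixel
print(pixel_spectrum.shape)            # (270,) -- versus (3,) for an RGB camera
```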

“You can’t just watch hyperspectral footage in the same way we can watch a video from a standard camera – we must process all the data to extract meaning from it,” Professor Gonzalez said.

“We’re building an artificial intelligence system that processes the data by identifying and categorising the different ‘hyperspectral fingerprints’ for objects within the footage.

“Every object gives off a unique hyperspectral signature, like a fingerprint. The signature for sand is different to the signature for coral and, likewise, brain coral is different to soft coral.

“More importantly, an individual coral colony will give off different hyperspectral signatures as its bleaching level changes, so we can potentially track those changes in individual corals over time.

“The more fingerprints in our database, the more accurate and effective the system.”
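
A common, generic way to match a pixel’s spectrum against a library of reference “fingerprints” is the spectral angle: the smaller the angle between two spectra, the closer the match. The sketch below illustrates that idea only; it is not the team’s actual algorithm, and the reference library entries are invented placeholders.

```python
# Generic sketch of spectral-angle matching against a library of reference
# signatures ("fingerprints"). The library entries here are random placeholders;
# in practice they would be measured spectra for sand, coral types and
# bleaching levels.
import numpy as np

def spectral_angle(a: np.ndarray, b: np.ndarray) -> float:
    """Angle (radians) between two spectra; smaller means more similar."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

bands = 270
library = {                                   # hypothetical reference fingerprints
    "sand": np.random.rand(bands),
    "healthy brain coral": np.random.rand(bands),
    "bleached brain coral": np.random.rand(bands),
    "soft coral": np.random.rand(bands),
}

pixel = np.random.rand(bands)                 # spectrum of one aerial pixel
best = min(library, key=lambda name: spectral_angle(pixel, library[name]))
print("closest fingerprint:", best)
```

Tracking an individual colony over time then reduces to repeating the match (or measuring the angle to a fixed healthy reference) on each survey date and watching how it shifts.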

Professor Gonzalez is one of three QUT researchers speaking at this week’s World of Drones Congress in Brisbane, joining Professors Des Butler and Tristan Perez on the two-day program.

QUT’s drone and remote-sensing innovations will be on show at the congress’ accompanying expo, highlighting research advances in marine robots (COTSbot), agricultural robots (Ag Bot II), and in using UAVs to detect and monitor both pests and gas leaks.

Congress visitors will also come face-to-face with several of QUT’s research drones.

QUT is the principal academic sponsor of the World of Drones Congress, which runs 31 August to 2 September.

Source: Press Release
