In Manado, Indonesia, scientists using Artificial Intelligence (AI) to assess reef health after global-warming-induced bleaching killed a large share of the planet’s shallow-water tropical corals have found surprisingly healthy reefs.
Marine scientists from The University of Queensland, Australia, produced and analyzed more than 56,000 images of an area known as the Coral Triangle around the island of Sulawesi during a six-week expedition.
Underwater scooters fitted with 360-degree cameras allowed researchers to photograph up to 2 kilometers (1.5 miles) of reef in a single dive. Artificial intelligence then analyzed the images much faster than human scientists could.
The expedition, funded by Paul G. Allen Philanthropies, aimed to evaluate how global-warming-induced coral bleaching between 2014 and 2017 had affected the Coral Triangle.
Researchers found that reefs that had suffered little impact had either bounced back or were in better shape than when they were first surveyed in 2014. The findings can help target coral restoration programs elsewhere.
“After several depressing years as a coral reef scientist, witnessing the worst-ever global coral bleaching event, it is unbelievably encouraging to experience reefs such as these,” said Dr. Emma Kennedy, the British scientist who led the team of researchers from the U.K., U.S., Australia, Indonesia and Trinidad.
“It means we still have time to save some coral reefs through the science-based targeting of conservation action.”
Coral reefs support roughly a quarter of all ocean life and provide over 500 million people with food and income, contributing about $375 billion annually to the global economy.
They are extremely vulnerable to temperature changes because the ocean’s upper layers absorb more than 90% of the heat trapped by carbon emissions, and that warming has devastated reefs.
At the current rate at which CO2 is accumulating in the atmosphere, most coral reefs are not predicted to survive past 2050.
“Paul Allen believes that through data, technology and science, we can solve some of the world’s most intractable challenges,” said Art Min, vice president for impact with Paul G. Allen Philanthropies. “The data gleaned from this survey will help us better understand coral resiliency and inform critical conservation efforts. It’s a sign of hope for coral reefs and the ecosystems that depend on them.”
If reefs that are less vulnerable can be protected from other stresses, such as plastic pollution and overfishing, until ocean temperatures stabilize, they could rapidly replenish surrounding reefs that have been harder hit by climate change in a domino-like effect.
The future of coral reefs depends on finding reefs “that are most likely to survive until global warming is brought under control,” said Professor Ove Hoegh-Guldberg, a University of Queensland professor and chief scientist of the initiative. “Technology is now allowing us to do just this. It is very exciting.”
In a related project, the expedition team has been using the latest satellite data and climate-change predictions to map vulnerability across the planet, identifying areas where coral reefs may be less exposed to heat stress and storms.
Scientists are limited by how long they can physically stay underwater, and photography has already helped by giving them the time to analyze images of reefs back at the lab. Now, AI image recognition is accelerating the painstakingly slow process of identifying and cataloging coral reef data.
“The use of AI to rapidly analyze photographs of coral has vastly improved the efficiency of what we do — what would take a coral reef scientist 10 to 15 minutes now takes the machine a few seconds,” Dr. Kennedy said. “It means we can start scaling up from studying reefs at the meter scale to looking at patterns of coral communities at the kilometer scale.”
The recognition software uses a form of deep learning to detect patterns in large amounts of data. It makes its own judgments after a period of “supervised learning,” in which scientists show it how to recognize corals, groups of algae and other invertebrates from increasingly complex contours and textures.
“The machine learns in a similar way to a human brain, weighing up lots of minute decisions about what it’s looking at until it builds up a picture and is confident about making an identification,” Dr. Kennedy said.
The program is usually able to perform well after it has been shown between 400 and 600 photos. Then the learning stops and it can process images on its own.
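The workflow described above — scientists label a few hundred photos, the software learns from them, and then classification proceeds automatically — can be illustrated with a minimal sketch. This is not the expedition team’s actual software: the real system is a deep network that learns features directly from pixels, whereas this toy example uses synthetic, pre-computed feature vectors (a hypothetical stand-in for texture and contour statistics) and a simple softmax classifier trained by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for labeled photos: each image patch is reduced to a
# small feature vector. In the real survey, a deep network learns such
# features from the raw pixels during "supervised learning".
n_labeled = 500          # roughly the 400-600 annotated photos mentioned
n_features = 16
classes = ["coral", "algae", "other_invertebrate"]
per_class = n_labeled // len(classes)

# Synthetic training data: each class clusters around its own feature mean.
means = rng.normal(0, 3, size=(len(classes), n_features))
X = np.vstack([rng.normal(means[c], 1.0, size=(per_class, n_features))
               for c in range(len(classes))])
y = np.repeat(np.arange(len(classes)), per_class)

# "Supervised learning": fit a softmax (multinomial logistic) classifier
# on the labeled examples by gradient descent.
W = np.zeros((n_features, len(classes)))
b = np.zeros(len(classes))
onehot = np.eye(len(classes))[y]
for _ in range(300):
    logits = X @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = (p - onehot) / len(X)   # gradient of cross-entropy loss
    W -= 0.5 * (X.T @ grad)
    b -= 0.5 * grad.sum(axis=0)

# "Then the learning stops": classify new, unlabeled patches on their own.
X_new = rng.normal(means[0], 1.0, size=(10, n_features))  # unseen coral-like patches
pred = np.argmax(X_new @ W + b, axis=1)
print([classes[i] for i in pred])
```

The key point the sketch captures is the two-phase process: a fixed, human-labeled training set of a few hundred examples, followed by fully automatic prediction on new images at machine speed.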
The software is being used to assess more than 56,000 images taken during the expedition, which ended in June, comparing them to images taken of the same reefs during the 2014 Coral Triangle survey that was part of the XL Catlin Seaview Survey led by The Ocean Agency and The University of Queensland.
Initial observations show that there appears to be little to no deterioration of the corals in the 3,851-square-kilometer (1,487-square-mile) assessment area.
The team is also starting to use cloud-based analysis to auto-generate comparison reports, dramatically reducing the cost of monitoring while expanding the scale at which measurements can be made. Fully reviewed results from the science team are expected later this year.
The expedition program was conducted during the International Year of the Reef 2018, declared by the International Coral Reef Initiative in collaboration with UN Environment and supported by The Tiffany & Co. Foundation.