A joint project from Accenture, Intel and the Sulubaaï Environmental Foundation is using AI to monitor the resilience of coral reefs in the Philippines.

Project: CORaiL uses Intel hardware in an Accenture-designed, AI-powered underwater camera system to monitor marine life around the reef, a key indicator of reef health. In the pilot phase, a prototype camera was deployed on the reef at Pangatalan Island. Since deployment in May 2019, it has taken more than 40,000 images.

The coral reef surrounding Pangatalan Island in the Philippines is the subject of an AI-powered underwater camera pilot project to monitor the reef’s regrowth (Image: Accenture)

Coral reefs around the world are rapidly perishing due to a combination of overfishing, bottom trawling, and warming ocean temperatures. These reefs are an ecosystem for 25% of the planet’s marine life, and without them, miles of coastline become vulnerable to tropical storms. They also provide food and income for 1 billion people and generate $9.6 billion in tourism and recreation, according to figures provided by Intel.


Camera System

The first part of Project: CORaiL saw the installation of a concrete reef prosthesis developed by Sulubaaï. This prosthesis is an underwater concrete structure that includes fragments of living coral, encouraging coral regrowth.

Concrete reef prostheses encourage coral regrowth in areas where the reef is badly damaged (Image: Accenture)

The underwater system uses AI, specifically a convolutional neural network (CNN), to classify and count fish in images taken automatically by the camera. This information is used to analyse the abundance and variety of marine life on the reef and to monitor the reef’s progress as it begins to regenerate. This work used to rely on human divers capturing footage with video cameras for later analysis, but divers can typically only film for half an hour at a time. A permanent in-situ camera can give researchers 24/7 access to real-time data via a 4G wireless connection.
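
The project’s own code has not been published; the following is a minimal sketch of the classify-and-count idea, assuming PyTorch/torchvision, a hypothetical fine-tuned ResNet checkpoint ("fish_classifier.pt") and illustrative species labels.

```python
# Sketch only (not Project: CORaiL's actual code): classify cropped fish
# detections with a CNN and tally counts per species.
import torch
import torchvision.transforms as T
from torchvision.models import resnet18
from PIL import Image

SPECIES = ["damselfish", "parrotfish", "wrasse", "other"]  # illustrative labels

# Hypothetical checkpoint fine-tuned on reef imagery.
model = resnet18(num_classes=len(SPECIES))
model.load_state_dict(torch.load("fish_classifier.pt", map_location="cpu"))
model.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def classify_crops(crops):
    """Classify a list of PIL image crops and return per-species counts."""
    counts = {name: 0 for name in SPECIES}
    with torch.no_grad():
        for crop in crops:
            logits = model(preprocess(crop).unsqueeze(0))
            counts[SPECIES[logits.argmax(dim=1).item()]] += 1
    return counts

# Example: count fish in crops extracted from one camera frame.
# crops = [Image.open("crop_0.jpg"), Image.open("crop_1.jpg")]
# print(classify_crops(crops))
```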

In the pilot stage of the project, there is a single camera, which is moved once a week to capture different angles and locations. The camera system is equipped with the Accenture Applied Intelligence Video Analytics Services Platform (VASP), which uses an Intel Xeon CPU, an Intel FPGA and an Intel Movidius vision processing unit (VPU).

The underwater camera system used by Project: CORaiL uses an Intel Movidius VPU to accelerate a neural network that does motion detection and fish counting (Image: Accenture)

“The Movidius VPU was used to accelerate the CNN to identify marine life under water. The centralized server with Xeon CPU and Intel FPGA was used to do the heavy lifting for classification of species,” said Ewen Plougastel, director and ASEAN delivery lead at Accenture Applied Intelligence. “[The system] at the edge [can process] about 2 frames per second from the camera. However, the central server is capable of processing the input from tens of cameras simultaneously.”
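
The article does not describe the software stack, but a common way to run a CNN on a Movidius VPU is Intel’s OpenVINO toolkit. The sketch below is illustrative only, assuming OpenVINO and a hypothetical converted model ("fish_detector.xml"); the edge device flags frames of interest, which a central server could then process more heavily.

```python
# Sketch only: run an edge CNN on the Movidius VPU (OpenVINO "MYRIAD" device).
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("fish_detector.xml")            # hypothetical IR model
compiled = core.compile_model(model, device_name="MYRIAD")
output_layer = compiled.output(0)

def detect_fish(frame: np.ndarray) -> np.ndarray:
    """Run inference on one preprocessed frame (NCHW float32)."""
    return compiled([frame])[output_layer]

# Example with a dummy 416x416 frame; a real deployment would grab frames
# from the camera at roughly 2 fps, as described above.
dummy = np.random.rand(1, 3, 416, 416).astype(np.float32)
detections = detect_fish(dummy)
```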

Count and Classify
Custom CNNs perform motion detection, classification and counting of marine life.

“Our data gathering and model training was custom-developed in two steps. First, a classic motion detector computer vision [network] to detect movement, and second, to clean and annotate the dataset to train a CNN model,” Plougastel said.
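
A minimal sketch of that two-step idea, assuming OpenCV and a hypothetical clip "reef.mp4": classic background subtraction finds moving regions, which are cropped for annotation (during training) or handed to the CNN classifier (at inference time). This is an illustration of the approach, not the project’s pipeline.

```python
# Sketch only: step 1 of the two-step pipeline, classic motion detection.
import cv2

cap = cv2.VideoCapture("reef.mp4")                       # hypothetical clip
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                       # moving pixels
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel) # remove speckle noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    crops = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w * h > 400:                                   # ignore tiny blobs
            crops.append(frame[y:y + h, x:x + w])
    # crops would then be annotated (training) or classified by the CNN.
cap.release()
```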

The next generation of the camera system will include a further optimised CNN and a backup power supply. The partners are also considering an infra-red camera to better capture nocturnal marine life.

“The same algorithm will be used and adapted to the infra-red camera,” Plougastel said. “The advantage of the infra-red camera is that it detects the size/mass of the marine life, which is a crucial input that scientists need.”

Future uses for the technology could include studying the migration rates of tropical fish and monitoring intrusion into protected underwater areas such as reefs.