On Roskilde Fjord, an inflatable boat sails around with a flat plate mounted underneath. With its upward-facing metal posts, it looks like an inverted folding table crudely put together with parts from a DIY store. But the makeshift surface is hiding innovative high-tech hardware that may help shed light on the state of marine biodiversity.
Though 71 per cent of our blue planet is covered in water, we know very little about what is hiding in the depths. Scientific studies estimate that 91 per cent of marine species have not yet been identified, and we therefore have a blind spot in our knowledge of the state of marine biodiversity. A group of researchers from DTU and Aarhus University aims to change that.
Will replace divers and drones
Counting species and stocks below sea level is not an easy task. Until now, divers have been used in combination with drones or satellite imagery, which provide an overview from the air, while complicated calculations have attempted to make up for the unreported figures.
"It's just really difficult and expensive to document biodiversity under water," says Christian Pedersen, Professor at DTU Electro.
He is heading the Ocean Eye project, which is developing new sensor technology. The project combines hyperspectral cameras (we will get back to those shortly), lasers, and artificial intelligence to quantify the biodiversity of coastal waters more easily and accurately. Everything is installed on an autonomous vessel, which can sail around and collect data on its own.
"A diver can only be under water for a limited time, and it's hard to cover a large area. But with our method, we expect to be able to quite accurately say that, for example, 37 per cent of the seabed in this area is covered by eelgrass and 12 per cent by red algae," says Christian Pedersen.
Fluorescent fingerprints
Ocean Eye will primarily collect data from the seabed by analysing plants and animals that live at depths beyond the reach of satellites, such as red algae, starfish, and corals.
"If there are no plants and animals at the bottom, fish and other species won't be able to live there, which is why it's so important to track how the seabed is doing," says Christian Pedersen.
Christian Pedersen and his DTU colleagues are therefore in the process of developing a special hyperspectral camera. Whereas traditional drones typically use so-called RGB cameras that can only see three colours (red, green, and blue), a hyperspectral camera takes up to 30 images at a time, each in its own colour. This means that one photo only shows red tones, the next yellow tones, etc. The result is much clearer images in which red algae, for example, stand out sharply in the red images, making it easier to quickly analyse and classify the objects visible in the recordings.
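The payoff of those extra bands is that every pixel carries a full spectrum rather than three numbers, which makes simple spectral matching possible. The Python sketch below illustrates the idea under assumed values: the band count of 30 comes from the article, but the reference spectra and the nearest-neighbour classifier are stand-ins, not the team's actual method.

```python
# Illustrative sketch only: many narrow spectral bands make per-pixel
# classification easier than three RGB values. Reference spectra are invented.
import numpy as np

N_BANDS = 30  # the article mentions up to 30 single-colour images per shot

# Hypothetical reference spectra (mean reflectance per band) for a few classes
references = {
    "red algae": np.random.rand(N_BANDS),
    "eelgrass": np.random.rand(N_BANDS),
    "sand": np.random.rand(N_BANDS),
}

def classify_pixel(spectrum: np.ndarray) -> str:
    """Assign the class whose reference spectrum is closest (Euclidean)."""
    return min(references, key=lambda c: np.linalg.norm(spectrum - references[c]))

# A hyperspectral frame is a (height, width, bands) cube rather than (h, w, 3)
cube = np.random.rand(4, 4, N_BANDS)
labels = [[classify_pixel(cube[i, j]) for j in range(4)] for i in range(4)]
print(labels)
```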
Ocean Eye will simultaneously sweep a laser beam over the ocean floor and measure the fluorescence of the objects it hits. When shortwave laser light strikes marine organisms, they re-emit part of it depending on their pigmentation, and the colour of this glow reveals the species, like a fluorescent fingerprint.
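As a rough illustration of the fingerprint idea, identifying a species could amount to comparing a measured emission spectrum against a library of known signatures. Everything in the sketch below is hypothetical: the wavelength grid, the single-peak signatures, and the correlation-based matching are illustrative assumptions, not Ocean Eye's implementation.

```python
# Sketch under assumptions: match a measured fluorescence spectrum to a
# library of toy "fingerprints". Wavelengths and signatures are invented.
import numpy as np

wavelengths = np.linspace(400, 750, 50)  # emission band in nanometres (assumed)

def gaussian_peak(centre_nm: float, width_nm: float = 20.0) -> np.ndarray:
    """Toy fluorescence signature: a single Gaussian emission peak."""
    return np.exp(-((wavelengths - centre_nm) ** 2) / (2 * width_nm ** 2))

# Hypothetical fingerprint library: peak emission differs with pigmentation
library = {
    "red algae": gaussian_peak(680.0),
    "eelgrass": gaussian_peak(685.0),
    "coral": gaussian_peak(520.0),
}

def identify(measured: np.ndarray) -> str:
    """Pick the library entry most correlated with the measured spectrum."""
    return max(library, key=lambda s: float(np.corrcoef(measured, library[s])[0, 1]))

noisy_coral = library["coral"] + 0.05 * np.random.rand(wavelengths.size)
print(identify(noisy_coral))  # expected: "coral"
```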