In the azure waters off the west coast of New Zealand, matched only in brilliance by the blue skies above, a pod of rare dolphins skims below the surface.
Every so often they breach the boundary between ocean and air, splashing through a white foam portal before disappearing below.
To the observer, they’re having a blast. But their story is one of rapid decline.
They are Māui dolphins, a sub-species of the Hector’s dolphin found only in this part of the world and marked out by their white, grey and black markings and rounded dorsal fin – often likened to Mickey Mouse ears.
However, while they may be easy to identify, they are becoming increasingly hard to find. Pollution, climate change and fishing in the region have pushed them to the edge of extinction – they are not the target, yet often become caught in nets.
Now, just 54 remain.
That may seem an oddly specific number for a group of animals living in the wide expanse of the sea, especially small, fast-moving ones that famously all look very alike. The reason we know is all down to artificial intelligence (AI).
Yes, the same AI that many worry is coming to steal all our jobs and ultimately bring about humanity’s demise. Until then at least, it’s doing some good.
Traditionally the Māui dolphins were tracked only once every five years by the Department of Conservation and University of Auckland scientists, counted during a three-week window in the summer when they swam close to the shore.
However, that monitoring did not provide a clear enough picture of either how the dolphins were faring, or what could be done to help them.
To solve the issue, conservation charity MAUI63 – so named because there were still 63 dolphins when it was formed in 2018 – has turned to the latest technology.
In partnership with the Ministry of Primary Industries (MPI) and the fishing industry, they have developed an AI-powered tracking drone to autonomously find, follow and identify the remaining Māui dolphins using image recognition technology.
‘Unlike boat-based work, the drone is not limited by ocean swell and rough coastal waters as it flies over them,’ MAUI63 co-founder Tane van der Boon tells Metro.co.uk.
‘The mission is to move technology forward to help with the conservation of wildlife, and Māui dolphins are one of the most urgent conservation problems we have in New Zealand. The drones are equipped with an 8K ultra high-definition still camera and a full HD gimbal camera with an object detection model for spotting dolphins.
‘This open-source algorithm was originally developed for facial recognition. Hosted on Microsoft Azure, it is able to identify individual dolphins by the shape and size of their dorsal fins and the unique markings on them such as scratches and scars.’
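The fin-matching idea Tane describes can be sketched in a few lines. This is an illustrative toy, not MAUI63's actual system: the catalogue, the three-number "embeddings" and the threshold are all made up, standing in for the vectors a trained neural network would produce from dorsal-fin images.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (lists of floats)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def identify_fin(query, catalogue, threshold=0.85):
    """Return the catalogued dolphin whose fin embedding best matches the
    query sighting, or None if nothing clears the similarity threshold."""
    best_id, best_score = None, threshold
    for dolphin_id, embedding in catalogue.items():
        score = cosine_similarity(query, embedding)
        if score > best_score:
            best_id, best_score = dolphin_id, score
    return best_id

# Toy catalogue: in practice each vector would encode fin shape, nicks
# and scars, learned from labelled photographs.
catalogue = {
    "maui_01": [0.9, 0.1, 0.3],
    "maui_02": [0.2, 0.8, 0.5],
}
query = [0.88, 0.12, 0.31]  # embedding from a new drone sighting
print(identify_fin(query, catalogue))  # matches "maui_01"
```

The same nearest-neighbour logic underlies most individual re-identification systems, whether the subject is a human face or a dorsal fin.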
Without the limitation of waiting for safe sailing conditions, the drones can be deployed far more regularly – the team is now able to monitor the dolphins on a monthly basis, all year round.
Simply knowing how many dolphins are left does not protect them, however. The team uses the data to provide the fishing industry with up-to-date locations, so they can avoid those areas.
Toxoplasmosis, a disease caused by a parasite that originates in cat faeces, is also a major threat to the dolphins.
‘It enters the marine food chain through runoff from the land, causing stillbirths and deaths of some species of marine mammal, including Māui dolphins,’ explains Tane.
‘We can use insights generated by the drones about dolphins’ preferred habitats. This may help narrow down studies into how toxoplasmosis might be entering those areas of water and how to limit the spread of the disease.’
Closer to home, similar animal recognition technology is taking the legwork out of counting puffins on the Isle of May.
Traditionally, getting a handle on numbers of these ‘parrots of the sea’ has been a very hands-on job, literally, with rangers not only counting by eye, but also digging around in burrows to count eggs – often receiving a nasty nip on the fingers for their efforts.
But a new project run by SSE Renewables, supported by Microsoft, Avanade and NatureScot, is using cameras and AI to monitor and count the birds all day, every day using image recognition software taught to pick out puffins without having to disturb them.
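On top of whatever detector the project runs, the counting itself is simple bookkeeping. The sketch below is hypothetical – the real models and thresholds aren't public – but it shows one common aggregate: the peak simultaneous count across a day's frames, which continuous cameras make easy to find.

```python
def count_puffins(detections, min_confidence=0.6):
    """Count the detections in one frame that clear the confidence bar.
    Each detection is a dict from a (hypothetical) object detector."""
    return sum(1 for box in detections if box["confidence"] >= min_confidence)

def daily_peak(frames, min_confidence=0.6):
    """The peak simultaneous count is one simple daily estimate a
    round-the-clock camera enables."""
    return max(count_puffins(f, min_confidence) for f in frames)

# Illustrative detector output for two frames.
frames = [
    [{"confidence": 0.9}, {"confidence": 0.4}],                        # 1 confident bird
    [{"confidence": 0.8}, {"confidence": 0.7}, {"confidence": 0.95}],  # 3 confident birds
]
print(daily_peak(frames))  # 3
```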
And while the increasing use of facial recognition is not without significant ethical concerns, raising issues of privacy and anonymity for humankind, when it comes to identifying these most loveable of birds, there’s little reason to complain.
When it comes to AI recognition though, it’s not just about looks.
Network Rail, in partnership with the Zoological Society of London and Google Cloud, has developed remote sensing technology that not only uses cameras, but also audio to detect the presence of wildlife – in particular, the hazel dormouse.
The tiny, elusive creature, once widespread, is now extinct in 20 counties across the country, having suffered a 70% decline in numbers just this century.
‘Remote sensing technology such as camera traps is key to letting wildlife come to you, rather than having to physically go out and find different species,’ Network Rail’s biodiversity strategy manager Neil Strong tells Metro.co.uk.
‘Dormice have a very quirky lifestyle, and the way we identify their location involves tracking their nuts – hazelnuts – and the nests they create, which a lot of other species also like to use. The AI helps make our tracking style much more efficient across different species, and we no longer have to visit every nest box or sift through hours of footage.’
Altogether, Network Rail manages land totalling close to 50,000 hectares – around one and a half Isles of Wight – and two thirds of this is green space. Neil and his team want to ensure the area performs as well as it can for both people and animals.
‘It’s not just about trains getting people from one place to another,’ he explains. ‘It’s about a linear connection between habitats and doing everything we can to make that as wildlife friendly as possible, while also bearing in mind that it is still a railway, meaning we have to manage it appropriately.’
Managing land, and the often uneven balancing act between conserving habitats and producing food, is another area in which artificial intelligence is proving vital.
Across the world, the Rainforest Alliance is using AI-generated mapping to better understand where the forest is intact, and where it has been converted to grow products such as rubber, coffee and cocoa.
The charity first uses its AI remote sensing forest data to map areas at risk of deforestation. It then helps farmers collect crucial geo-coordinates for their own land, to help meet EU criteria proving traceability.
In short, they can prove their products have not come from high-risk areas.
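The traceability check at the heart of this boils down to geometry: does a farm's coordinate fall inside a mapped risk area? A minimal sketch, using the standard ray-casting test and entirely made-up coordinates (the Rainforest Alliance's actual risk maps and tooling are not reproduced here):

```python
def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: is (lon, lat) inside the polygon, given as a
    list of (lon, lat) vertices? Counts edge crossings to the left."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):                      # edge spans the latitude
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:                             # crossing is to the east
                inside = not inside
    return inside

# Hypothetical deforestation-risk polygon and farm plots (not real data).
risk_area = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]
farms = {"farm_a": (2.0, 2.0), "farm_b": (6.0, 1.0)}
for name, (lon, lat) in farms.items():
    status = "high-risk" if point_in_polygon(lon, lat, risk_area) else "clear"
    print(name, status)  # farm_a high-risk, farm_b clear
```

Real systems layer satellite-derived forest-change maps on top of this, but the farm-versus-polygon question is the same.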
‘Smallholder farmers, who represent more than 90% of Rainforest Alliance producers, are the economic backbone of many countries relied upon by the EU for commodity imports,’ says Michelle Deugd, director of forests and agriculture at the Rainforest Alliance.
‘Loss of access to the EU market could result in severe social and economic repercussions for communities dependent on export revenues to fulfill their basic needs. If this were to happen, smallholders may be compelled to resort to encroaching further into forests – including protected forests – and sell their products to less discerning consumer markets, in order to provide for their families.’
For those living and working in the rainforest, conservation is very much a matter of survival.
Eyes in the sky
Think AI and IBM, and a chess-playing computer may be the first thing that springs to mind.
But in 2024, the AI pioneer is trying to solve a very different puzzle – environmental degradation.
One programme, in partnership with Nasa, is using satellite data to measure the success of reforestation efforts in Kenya’s famed ‘Water Towers’, a complex of forests across the country that play a vital role in capturing rainwater to feed more than a dozen major rivers – including the Nile.
By combining satellite data with IBM’s AI technology, authorities are now able to measure and quantify the impact of reforestation efforts and adapt them accordingly.
‘[Tree planting] can be very difficult to measure, very difficult to understand the impact,’ says Juan Bernabe-Moreno, IBM director of research for the UK and Ireland.
‘That’s why we are so happy to see that our foundation model can offer a completely different angle.’
Monitoring from above, and over time, highlights anomalies.
‘For example, in one particular area, the tree planting was going really, really well,’ says Juan. ‘But right beside it, it didn’t feel like there was much improvement.
‘It turns out, there was a fence protecting the more successful area [keeping out animals that may eat young trees], so you really get to see small things and how they have an impact on the ground just by using satellite data.’
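The kind of comparison Juan describes can be illustrated with a common vegetation index, NDVI, computed from a satellite's red and near-infrared bands. The numbers below are invented for the sketch and have nothing to do with IBM's foundation model; they just show how a fenced plot's greening would stand out against an unfenced neighbour over a year.

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index for one pixel: higher
    values indicate denser, healthier vegetation."""
    return (nir - red) / (nir + red)

def mean_ndvi(pixels):
    """Average NDVI over a plot's (nir, red) reflectance pairs."""
    return sum(ndvi(nir, red) for nir, red in pixels) / len(pixels)

# Illustrative reflectance pairs for the same two plots a year apart.
fenced_before = [(0.40, 0.20), (0.42, 0.18)]
fenced_after  = [(0.55, 0.12), (0.58, 0.10)]
open_before   = [(0.38, 0.22), (0.40, 0.20)]
open_after    = [(0.39, 0.21), (0.41, 0.20)]

for name, before, after in [("fenced", fenced_before, fenced_after),
                            ("open", open_before, open_after)]:
    change = mean_ndvi(after) - mean_ndvi(before)
    print(f"{name}: NDVI change {change:+.2f}")
```

The fenced plot shows a much larger gain, which is exactly the sort of side-by-side anomaly that monitoring from above makes visible.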
The same is true on a global scale. The decline in plant and animal species is a crisis on a par with climate change, but has yet to hit the headlines in the same way.
Encouraging everyone to engage with nature, not just those whose livelihoods depend on it, is the aim of conservation charity On The Edge.
Here, AI is put to use slightly differently.
‘Our goal at On the Edge is to emotionally connect people with nature,’ says Rob Slade, director of digital content. ‘We capture the hearts of Gen Z audiences by crafting original, engaging content where the natural world plays the starring role. Just like this tech-savvy generation, we leverage AI to push the boundaries of content creation.’
A quick scan through the charity’s Instagram showcases the vibe they’re curating while pushing those boundaries. Fun.
Think flying camels, a mustachioed pigeon at the barber’s and a Barbie cockatoo.
On a more serious note – not that engaging with nature isn’t serious – the team also uses AI to track attitudes towards EDGE species, those that are evolutionarily distinct and globally endangered.
Very simply, the programme uses natural language processing (NLP) to analyse content from across the web, such as social media posts and news stories, to understand and potentially shift public attitudes and policy sentiment to better protect EDGE species.
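At its very simplest, that kind of sentiment analysis can be done with word lists. On The Edge's actual NLP pipeline is not public, and real systems use far more sophisticated language models, but this toy version shows the basic idea of scoring posts about a species:

```python
# Illustrative word lists – real lexicons run to thousands of entries.
POSITIVE = {"love", "amazing", "protect", "beautiful", "save"}
NEGATIVE = {"pest", "ugly", "nuisance", "cull", "boring"}

def sentiment(text):
    """Score a post: +1 for each positive word, -1 for each negative."""
    words = text.lower().split()
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

posts = [
    "We must protect the beautiful aye-aye",
    "That thing is ugly and a nuisance",
]
for post in posts:
    print(sentiment(post), post)  # 2 for the first, -2 for the second
```

Tracked over time and across thousands of posts, scores like these can show whether public attitudes to a species are warming or cooling.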
But there’s a time and place for AI in conservation.
One thing On The Edge does not do, however, is use AI to generate images of real animals.
‘We wouldn’t want anything to diminish that awe-inspiring feeling, especially not confusion over whether what they’re seeing is real or not,’ says Rob.
‘So, while AI is a powerful tool, we use it responsibly, ensuring it enhances the magic of nature, not replaces it.’