Whale songs and AI, for everyone to explore

Back in the 1960s, scientists first discovered that humpback whales sing songs, and that those songs evolve over time. But there’s still so much we don’t understand. Why do humpbacks sing? What do the patterns within their songs mean?

Scientists sift through an ocean of sound to find answers to these questions. But what if anyone could help make discoveries?

For the past year, Google AI has been partnering with NOAA’s Pacific Islands Fisheries Science Center to train an artificial intelligence model on their vast collection of underwater recordings. The project is helping scientists better understand whales’ behavioral and migratory patterns so they can better protect them. The effort is part of Google’s AI for Social Good program, which applies the latest in machine learning to the world’s biggest humanitarian and environmental challenges.

NOAA research oceanographer Ann Allen (left) and Google software engineer Matt Harvey (right) field test the algorithm aboard a research vessel.

Now, everyone can play a role in this project using a website called Pattern Radio: Whale Songs. It’s a new tool that visualizes audio at a vast scale and uses AI to make it easy to explore. The site hosts more than 8,000 hours of NOAA’s recordings, which means scientists aren’t the only ones who can explore this data and make discoveries. Everyone can.

Zooming in on the spectrogram shows you individual sounds.

On the site, you can zoom all the way in to see individual sounds on a spectrogram (in addition to humpback songs, you can see the sounds of ships, fish and all kinds of mysterious and even unknown noises). You can also zoom all the way out to see months of sound at a time. An AI heat map helps you find whale calls, and visualizations help you see repetitions and patterns of the sounds within the songs.
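To get a feel for what a spectrogram like the one on the site represents, here is a minimal sketch (not the project’s actual pipeline) of how audio becomes one: the signal is sliced into short overlapping windows, and a Fourier transform of each window gives intensity over time and frequency. The sample rate and synthetic “whale call” below are assumptions for illustration only.

```python
import numpy as np
from scipy.signal import spectrogram

sample_rate = 10_000                     # Hz; assumed for this synthetic example
t = np.arange(0, 2.0, 1 / sample_rate)   # two seconds of audio
# Stand-in for a whale call: a tone that sweeps upward from 400 Hz.
audio = np.sin(2 * np.pi * (400 + 200 * t) * t)

# freqs: frequency bins (Hz); times: window centers (s);
# power: intensity per (frequency, time) cell -- the spectrogram image.
freqs, times, power = spectrogram(audio, fs=sample_rate, nperseg=512)

# The brightest cell in each column is the dominant frequency at that moment,
# which is how a rising call shows up as an upward-sloping streak on screen.
dominant = freqs[power.argmax(axis=0)]
print(f"{len(times)} time slices; dominant frequency rises from "
      f"{dominant[0]:.0f} Hz to {dominant[-1]:.0f} Hz")
```

Zooming in on the site is roughly like viewing a few columns of this `power` array at a time; zooming out aggregates millions of them.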

Highlights help visualize patterns and repetitions of individual sounds within the songs.


The idea is to get everyone listening—and maybe even make a totally new discovery. If you find something you think others should hear, you can share a link that goes directly to that sound. And if you need a bit more context around what you’re hearing, guided tours from whale song experts—like NOAA research oceanographer Ann Allen, bioacoustic scientist Christopher Clark, Cornell music professor Annie Lewandowski and more—point out especially interesting parts of the data.


You can start exploring at g.co/patternradio. And to dive even deeper, learn more about the project at our about page and check out Ann Allen’s article on how this whole project got started on her NOAA Fisheries blog. Jump on in!