
Google looks to AI to help save the coral reefs


Google has developed a new AI tool to help marine biologists better understand coral reef ecosystems and their health, which can aid in conservation efforts. The tool, SurfPerch, created with Google Research and DeepMind, was trained on thousands of hours of reef audio recordings that allow scientists to “hear reef health from the inside,” track reef activity at night, and monitor reefs in deep or murky waters.

The project began by inviting the public to listen to reef sounds via the web. Over the past year, visitors to Google’s Calling in Our Corals website listened to over 400 hours of reef audio from sites around the world and were asked to click when they heard a fish sound. This resulted in a “bioacoustic” data set focused on reef health. By crowdsourcing this activity, Google was able to create a library of new fish sounds that was used to fine-tune the AI tool, SurfPerch. Now, SurfPerch can be quickly trained to detect any new reef sound.

Image Credits: Google

“This allows us to analyze new datasets with far more efficiency than previously possible, removing the need for training on expensive GPU processors and opening new opportunities to understand reef communities and conservation of these,” notes a Google blog post about the project. The post was co-authored by Steve Simpson, a professor of marine biology at the University of Bristol in the U.K., and Ben Williams, a marine biologist at University College London, both of whom study coral ecosystems with a focus on areas like climate change and restoration.

What’s more, the researchers realized they were able to boost SurfPerch’s model performance by leveraging bird recordings. Although bird sounds and reef recordings are very different, there were common patterns between bird songs and fish sounds that the model was able to learn from, they found.

After combining the Calling in Our Corals data with SurfPerch in initial trials, researchers were able to uncover differences between protected and unprotected reefs in the Philippines, track restoration outcomes in Indonesia, and better understand relationships within the fish community on the Great Barrier Reef.

The project continues today, as new audio is added to the Calling in Our Corals website, which will help to further train the AI model, Google says.
