Artificial Intelligence Is Watching Wildlife

Artificial intelligence is assisting wildlife conservation efforts for species from trout to whales, while raising some watchdog eyebrows

  • By Andrew Vietze
  • Conservation
  • Mar 28, 2024

Wildlife researchers have long monitored the sounds of humpback whales (above). Now AI is helping scientists pinpoint the mammals’ songs (seen in the spectrogram at left) in audio recordings from the fathoms. (Photos courtesy of a project by NOAA and Google, inset, and Ralph Pace/Minden Pictures, obtained under NMFS Permit 19225.)

WILDERNESS AND TECHNOLOGY make strange bedfellows, each a seeming refuge from the other.

And yet, artificial intelligence, or AI, is increasingly helping wildlife conservationists monitor species, conduct population studies and make decisions about habitat protection—speeding up research by orders of magnitude over just a couple of decades ago. Tools such as digital neural networks, which process data in ways similar to the human brain, and computer vision, which uses machines to categorize images, are what “conservation needed to keep up with the high data volume now available in wildlife monitoring devices,” says Jason Holmberg, founder of AI developer Wild Me.

Others aren’t sold. “It’s crucial to avoid the trap of techno-solutionism, where we try to put AI everywhere at all costs, including in places or contexts where it’s not necessary or helpful,” says Nicolas Miailhe, founder and chair of AI watchdog The Future Society.

What does it all mean? Read on to learn some of the ways AI figures into current conservation efforts and what’s next.

An image of a brook trout with parr marks circled for AI to identify individual fish.

Trout Spotter uses AI to distinguish one trout’s distinctive parr markings, circled above, from another.

Connecting the spots

Camera traps have been around since the late 19th century, when U.S. Representative and nature photographer George Shiras set up a trip-wire camera to capture deer on Michigan’s Whitefish Lake. By the internet era, motion-sensor cameras had reached a new level of sophistication, with integrated computers identifying and quantifying species.

“But what’s truly the frontier right now,” says U.S. Geological Survey biologist Nathaniel Hitt, “is not the identification of species but the identification of individuals.” In Hitt’s case, those individuals are brook trout, but he relies on the same—often controversial—technology that’s used in human facial recognition. “Identifying individuals is necessary, of course, for conservation biology,” he says. “If you want to estimate trends on abundance, you need to know if you’re counting the same fish more than once.”

Hitt, who grew up hiking and fishing not far from his office at the USGS Eastern Ecological Science Center in West Virginia, specializes in fish ecology from a landscape perspective, studying what fish can tell us about the health of our streams and rivers. He finds brook trout—some populations of which are anadromous, spending part of their lives in both fresh and salt water—especially compelling, and he’s worried about their declining numbers.

“Native brook trout are our canaries in the coal mine for climate change,” he says. To survive and thrive, brookies, as anglers call them, need water that’s unpolluted, clear, oxygenated and cold. “We have a high confidence that when temperatures chronically exceed 20 degrees C or 68 F, the trout populations disappear,” Hitt says.

Trout habitat stretches far beyond West Virginia, all the way to the Great Lakes and Canada—an area so vast, it’s difficult for scientists to track big-picture trends. “No single organization or entity or research group has the ability to monitor populations of all these important places that we know people care about. It’s just not possible,” Hitt says. “And this is the innovation: You don’t have to do it alone. You have a social network. There are people out on the landscape, if you can give them the tools.”

Enter Trout Spotter. The software grew out of a project Hitt worked on with biologists from Shenandoah National Park, in which underwater cameras captured video of brookies. USGS collected the footage in a databank of salmonid mug shots that students then used to identify individual fish. Eventually, to replace long hours spent staring at monitors, University of Virginia graduate students developed algorithms that could read the markings on trout skin. “It turns out that these spot patterns are individually diagnostic,” Hitt says. “They’re essentially a fingerprint or a thumbprint.”
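The matching idea can be illustrated with a toy sketch: treat each fish as a set of spot coordinates (normalized to the body outline) and pick the catalogued fish whose spots lie closest to a new sighting's. This is a simplified stand-in, not the actual USGS/University of Virginia algorithm, which uses computer vision on photographs; all names and coordinates below are invented.

```python
import math

def nn_distance(query, candidate):
    """Mean distance from each query spot to its nearest candidate spot."""
    total = 0.0
    for qx, qy in query:
        total += min(math.hypot(qx - cx, qy - cy) for cx, cy in candidate)
    return total / len(query)

def best_match(query, database):
    """Return the fish ID whose stored spot pattern lies closest to the query."""
    return min(database, key=lambda fish_id: nn_distance(query, database[fish_id]))

# Toy database: spot coordinates normalized to each fish's body outline.
db = {
    "trout_A": [(0.1, 0.2), (0.4, 0.5), (0.7, 0.3)],
    "trout_B": [(0.2, 0.8), (0.5, 0.1), (0.9, 0.6)],
}
# A new photo of trout_A, with slight measurement noise.
sighting = [(0.11, 0.21), (0.39, 0.52), (0.71, 0.29)]
print(best_match(sighting, db))  # trout_A
```

A real pipeline would first detect and normalize the spots in the photo; the nearest-neighbor comparison here only stands in for that final matching step.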

USGS is partnering with Wild Me and Trout Unlimited, a fishing and conservation organization, to beta test an app developed from the software. When a Trout Unlimited member catches a fish and snaps and uploads a photo, the algorithm searches to see if it knows that particular trout. If it does, it sends an email to the angler, sharing the fish’s past catches and logging the current location for further study.
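The catch-report loop described above might be mocked up like this. This is a hypothetical sketch of the workflow, not Wild Me's code: the real service matches spot patterns with computer vision, and every identifier below is invented.

```python
# Hypothetical in-memory stand-in for the app's backend; fingerprints are
# pre-computed strings here rather than images.
catalog = {}  # fingerprint -> list of (angler, location) sightings

def match_fish(fingerprint):
    """Toy matcher: exact lookup instead of computer-vision matching."""
    return fingerprint if fingerprint in catalog else None

def handle_upload(fingerprint, angler, location):
    fish_id = match_fish(fingerprint)
    if fish_id is None:
        catalog[fingerprint] = [(angler, location)]
        return "new fish registered"
    prior = len(catalog[fish_id])                 # catches before this one
    catalog[fish_id].append((angler, location))   # log the current location
    return f"known fish: {prior} prior sighting(s)"  # real app emails the angler here

print(handle_upload("spots-001", "alice@example.com", "Shenandoah"))  # new fish registered
print(handle_upload("spots-001", "bob@example.com", "Rapidan"))       # known fish: 1 prior sighting(s)
```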

“We’re empowering anglers with a deeper sense of connection, and we’re building an unprecedented view of the health of these [trout] populations from a quantitative and statistical perspective,” Hitt says. He hopes to release the app to the public later this spring.

A collage that consists of an image of a lion's features identified by computer vision and a graphic showing the unique pattern of a whale shark's spots.

Computer vision correctly identifies the subtle features of the same lion in two different photos (top) from a Wild Me database of hundreds of images of the cats. A whale shark’s spots (bottom) form a unique pattern—not unlike a human fingerprint—that can be used to match photos of an individual fish taken years apart.

From shark to ark

Trout Spotter builds on a foundation laid by Portland, Oregon’s Wild Me. Avid diver Holmberg came up with the idea for his nonprofit AI development studio while swimming with whale sharks off the east coast of Africa in 2002.

Realizing he could recognize individual fish, Holmberg wondered why scientists needed to tag the sharks, penetrating their bodies and potentially stressing them out, when the fish were identifiable by their markings. Then an employee of the tech giant Dell, Holmberg thought he could teach computers to distinguish whale shark markings, eventually using AI to amass the same kind of tag-and-track data biologists had been collecting for decades—on health, distribution and habitat—at a much larger scale.

Listed as endangered by the International Union for Conservation of Nature in 2016, whale sharks—the world’s largest fish, at upwards of 30 feet long—are declining due to boat strikes, entrapment in fishing nets, pollution and human predation for meat and oil. Holmberg was also interested in whale sharks’ utility, like trout, as an indicator species. The fish, whose range spans great distances, require massive quantities of plankton—a category of organisms scientists have long used to gauge ocean health. Plus, a ready-made crowd of divers, snorkelers, tour operators and scientists were already taking photos of whale sharks all over the world.

Upon launching the website in 2003, Holmberg immediately found himself deluged. With digital photography exploding, records of sightings were arriving every which way, in all formats: thumb drive, CD, VHS tape and more. To control the chaos, Holmberg soon required all sightings to be submitted through a form on the website, creating a central repository that was actually useful.

“Forcing standardization meant everyone’s data became comparable. You could start ... seeing other people’s whale sharks, comparing them and matching them,” he says. “And so, all of a sudden, there’s this no-cost way to enter into whale shark research and then, importantly, to collaborate and look at a global whole of data.”

In the two decades since Wild Me’s founding, contributors have shared behavioral data on more than 15,500 whale sharks. Sightings of the same single fish off the coasts of Mexico, Belize, Honduras, the United States and Cuba “showed us that whale shark conservation and research in the region must be collaborative and linked to get the best picture of whale sharks’ life histories and key migration corridors,” Holmberg says.

Computer vision could work for other species, he realized—an epiphany that yielded an array of spin-offs. “Then came MantaMatcher and then Giraffe Spotter, and it just kept cascading and continues to cascade,” he says. In January, Wild Me announced it was merging with Conservation X Labs (CXL), another environmental tech nonprofit with projects ranging from handheld genetic testing devices aiming to counter illegal wildlife trafficking on the front lines to Persian and snow leopard conservation efforts in Asia.

Now under the CXL umbrella, Wild Me hosts a network of open-source sites and cross-platform projects monitoring nearly 200,000 individual animals from 53 species. There’s Flukebook for whales and dolphins; deer and seal codices; Amphibian and Reptile Wildbook; and Wild North Wildbook for cougars, bobcats, skunks and Canada lynx—with Trout Spotter coming soon. Each project marries the crowdsourced spirit of widely used apps like iNaturalist and Seek—which run on different software, Holmberg says, and which identify species, rather than individual animals—with filtering that meets rigorous academic standards, including confirmation of every data point by a human.

All the data have led to a host of insights published in scientific journals from Ecology to the Journal of Fish Biology. African wild dogs, for example, travel farther than previously thought: upward of 1,000 miles. Manta rays do, too, while sand sharks seem drawn to shipwrecks, and Southern right whales are rallying in New Zealand calving grounds.

Holmberg believes AI’s next step is to recognize whole arks of creatures at once. “Rather than learning one species at a time, on small data, it’s learning many species at a time, on much larger data,” he says. Similar to AI large language models like ChatGPT, the more data a tool houses—in Wild Me’s case, images of animals—the smarter it becomes, improving its ability to predict and identify wildlife.

With more and better data, Holmberg sees the opportunity for endless advancements, such as camera traps that record audio as well as video and “intelligent agents” that comb social media for known animals among the photos and videos people post. “I think we’ll get to the point where, not only can we have a nonhuman actor collecting and curating data, but also summarizing and explaining data and finding trends that we might otherwise miss,” he says.

A collage that consists of an image of a humpback whale and a spectrogram of a whale song from Pattern Radio Player.

Knowing where humpback whales have sung in the past could help scientists identify future changes in behavior.

Music of the abyss

Using audio to monitor undersea wildlife isn’t new. While listening for Russian submarines in the 1950s, U.S. Navy sailors realized humpback whales were singing to one another—an observation they kept secret until the late 1960s, for fear commercial fishers would use the technology to hunt and kill the mammals. Acoustic monitoring has been marine biologists’ tool of choice for decades, according to Ann Allen, an oceanographer with the National Oceanic and Atmospheric Administration. But there’s always room for innovation.

When Allen took her job at the Pacific Islands Fisheries Science Center’s Cetacean Research Program in Honolulu, Hawai‘i, in 2017, she inherited almost 170,000 hours of digitized recordings captured by ocean-floor-mounted arrays over a dozen years: the humpback songs she was seeking but also the sounds of other whales, dolphins, boats and waves—anything happening undersea. Overwhelmed by the prospect of digging through it all to locate humpbacks, she mentioned the conundrum to her dad, who suggested society’s default approach to questions these days: Google it.

“Which I laughed at, at the time, but then thought it was worth a shot,” she says. Allen contacted a friend at Google and was put in touch with the company’s AI arm. She sent coders sample recordings of the sounds she wanted, and before she thought possible, she had time stamps on her data, showing her where to find the humpbacks.
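The time-stamping task can be illustrated with a toy detector: slide a window over the recording and flag windows with unusual energy in an assumed song band. This is a sketch only; NOAA and Google's actual system classifies spectrograms with neural networks, and the 100-700 Hz band, thresholds and signal below are invented for illustration.

```python
import math

def band_energy(samples, rate, f_lo, f_hi):
    """Energy in [f_lo, f_hi] Hz via a naive DFT (toy-sized inputs only)."""
    n = len(samples)
    energy = 0.0
    for k in range(n // 2):
        if f_lo <= k * rate / n <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            energy += re * re + im * im
    return energy / n

def flag_song_windows(signal, rate, win_s, threshold, f_lo=100, f_hi=700):
    """Return start times (seconds) of windows whose band energy exceeds threshold."""
    win = int(win_s * rate)
    return [start / rate
            for start in range(0, len(signal) - win + 1, win)
            if band_energy(signal[start:start + win], rate, f_lo, f_hi) > threshold]

# Synthetic "recording": silence, then a 300 Hz tone, then silence again.
rate = 2000
tone = [math.sin(2 * math.pi * 300 * i / rate) for i in range(100)]
silence = [0.0] * 100
signal = silence + tone + silence
print(flag_song_windows(signal, rate, 0.05, 1.0))  # [0.05]
```

The flagged start times play the role of the time stamps Allen received: pointers into a long archive telling a researcher where to listen.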

Although the voices of individual whales are not yet distinguishable, the recordings have yielded data on seasonality, singing patterns and populations. Allen says knowing where and when whales have sung in the past will help scientists understand whether humpbacks have changed their distribution patterns over time—for example, moving farther away from human activity. She’s excited about other projects, too, including using AI to identify North Atlantic right whales and to decode sperm whale language. “There are advancing uses on all fronts of marine mammal science using AI [and machine learning],” she says.

“We want [artificial intelligence] to serve humanity and wider conservation and not the other way around.” —Nicolas Miailhe

Monitoring the monitors

Wherever there is AI, there are critics, and conservation is no exception. Miailhe of Paris-based The Future Society makes his living monitoring technology for governments, academia and industry. As co-chair of the think tank Global Partnership on AI’s Committee on Climate Action & Biodiversity Preservation, he watches the woods and waters with keen interest.

“Does technology have a place in the wild?” he asks. “Yes. We want the machines, the algorithms, the data and the applications, services and products built on top of them to serve conservation science and practice.” But, he stresses, several factors must be considered before introducing high-tech tools to natural spaces. Is a gadget necessary, or will a lower-tech alternative do? Does the tool respect the needs and best interests of the wildlife it studies? To date, AI regulation worldwide is scarce, but at least a dozen U.S. states have enacted legislation requiring further government research and oversight. (Read the National Wildlife Federation’s guidance on innovation and risk.)

Miailhe agrees computer vision could serve as a less intrusive substitute for traditional tagging and tracking. But he asserts that digital wildlife monitors must themselves be monitored—and not only by the tech-friendly scientific community but by locals in the wild places under surveillance.

“It would be dangerous and counterproductive to remove the communities of practice—the rangers, conservation experts and amateurs—out of the equation of responsible AI adoption and have an overbearing set of technologies deciding everything in a vacuum,” he says. Technology can make mistakes and output inaccurate data, Miailhe points out. It can deliver endangered animals’ locations to poachers. And it can impinge on the privacy of local human residents.

Even so, Miailhe sees the pros outweighing the cons in the context of wildlife research—as long as AI tools are used responsibly. “We want them to serve humanity and wider conservation,” he says, “and not the other way around.”

Read more about writer Andrew Vietze.

More from National Wildlife magazine and the National Wildlife Federation:

Eyes in the Sky »
Sustaining Fresh Waters »
Blog: When Nature and Digital Screens Click »
Zoonomia’s Genomics of Scale »
