Talking With Bots

Chatbots Can Help Us Talk to Animals

Could applying the Internet of Things to animals someday allow us to talk to them?


A lizard on a wall, with the word "you" written on the wall in red paint.
Image credit: Dorian Kartalovski

“Don’t be afraid. Enjoy yourself. It’s not forever.”

If I could talk to my dog while he’s at the kennel, that’s what I’d say. Eight short, simple words–a few huge, abstract concepts. Of all the mind-blowing applications offered by the so-called “Animal Internet,” I keep coming back to this one emotional exchange.

What if–let’s indulge the idea a little further–my dog could reply? What would he say back? At this point, my imagination fails. Some deeply ingrained aversion to anthropomorphism won’t let my thoughts complete their journey.

But this scenario is no longer a frivolous fantasy. Advances in technology, from incredibly shrinking hardware to machine learning, along with new understandings of animals as individuals with complex characters, are ushering in a “bestial brave new world”: a world where humans can communicate with animals on levels we’ve barely allowed ourselves to dream about. (Who knows what dreams the animals have allowed themselves?)


It was German author and journalist Alexander Pschera who coined the idea of the Animal Internet in a book of the same name, subtitled Nature and the Digital Revolution. Like the Internet of Things–which gives internet connectivity to everyday devices so that, for example, a health monitor can remotely notify an individual (and her doctor) of changes to key indicators such as blood pressure–the Animal Internet allows us to track the location of an animal, along with other physiological and visual data, by attaching small sensors to the creature. Those sensors in turn talk to satellites orbiting Earth. Though animal tracking has been around since the 1960s, it’s only now that we have hardware small and light enough, signals strong enough, and software clever enough for this kind of sophisticated paradigm to emerge.

Once we have the data provided by these complex systems, we can program artificial intelligence (AI) to interpret them, allowing us, at the front end, to follow the tribulations of a migrating bird via social media or read blog posts by a deer living deep in the forest.

For example: Say a deer was fitted with a heart-rate monitor and location tracker, and its habitat rigged with motion-sensitive cameras–all of which were net-connected and talking to each other. From the data gathered, we could program software to recognize different events in the animal’s life–a chase, perhaps, or a birth. A bot could then compose a blog post about it, compiling the selected data into a written narrative with images and uploading it to Bambi’s Medium page.
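In the simplest possible terms, that event-recognition step might look something like the toy Python sketch below. Every threshold, field name, and label here is invented for illustration–real systems would learn these patterns from data rather than hand-code them:

```python
# Hypothetical sketch: turning a tagged deer's sensor readings into a
# coarse event label, then into a one-line "blog post." All thresholds
# and names are made up for illustration.

def classify_event(heart_rate_bpm, speed_kmh, herd_distance_m):
    """Map one snapshot of sensor data to a coarse event label."""
    if heart_rate_bpm > 180 and speed_kmh > 30:
        return "chase"
    if heart_rate_bpm > 120 and speed_kmh < 2 and herd_distance_m < 10:
        return "possible birth or distress"
    if speed_kmh < 1:
        return "resting"
    return "grazing or moving"

def compose_post(timestamp, event):
    """Compile a classified event into a short written narrative."""
    return f"[{timestamp}] Today's update from the forest: {event}."

print(compose_post("2016-05-04 06:12", classify_event(195, 34, 250)))
```

A production system would replace the hand-written rules with a model trained on labeled sensor traces, but the pipeline–raw readings in, narrative out–is the same shape.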

“To me, this is the most exciting aspect of the Animal Internet…” said Pschera, “to be able to engage with an individual animal, like following a friend on Facebook. Experiencing the daily trials an animal has to live through, and discussing it with other people, is a very powerful thing.”

There are, of course, challenges involved. Technology can provide us with swathes of data, but if we aren’t asking the right questions in the first place, it’s useless. For instance, the deer’s data may show that the animal’s gotten separated from its herd and is moving at great speed with an increased heart rate. Can we say that the animal is feeling fear? Or would that be an anthropomorphization that’s gone too far?

“There’s a whole branch of animal welfare research focused on animal emotions,” said Dr. Clara Mancini, head of the Open University’s Animal-Computer Interaction (ACI) Lab. “Scientists try to triangulate physiological data with behavioral data, for example using EEG devices that can see what’s happening in the brain. We’re slowly coming closer to being able to make better assumptions, but we still have a way to go. Using the data we have currently ought to be done carefully, because it’s not a given. There’s a danger we could misinterpret things.”

Currently, for the most part, humans are still involved in mediating the data. In the case of the northern bald ibis, field expert Martin Wikelski is leading a compelling reintroduction initiative at Germany’s Max Planck Institute for Ornithology in which small solar-powered transmitters register the positions of the birds every hour. Once each day the sensors send their data to an online database, where humans pore over it. Selected highlights are posted (in German) on Facebook.

A bird flying through the sky.
Northern bald ibis. Image credit: Waldrapp team.

According to Pschera’s book, Wikelski is “planning to equip birds’ beaks with tiny cameras that are triggered by characteristic head movements during feeding… [allowing] the animals’ daily menu to be recorded in high definition.” In this case, it would be pretty straightforward to program a bot to post the birds’ breakfast photos to an Instagram account. (We humans had better up our game; there’s some serious avian competition on the horizon.)

Another intriguing application of the Animal Internet is its role in predicting disasters. We already know that certain animals, more in tune with their environment than we are, make mass movements ahead of, say, an earthquake. If we tag enough individual animals, AI could alert us to locational anomalies that might save entire cities. Imagine a bot that WhatsApps you with a warning to evacuate your office building, based on the behavior of a bunch of toads.
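A crude version of that alert logic is easy to sketch: flag any day on which the tagged population's average displacement jumps well outside its normal range. The Python below is a toy illustration–the animal names, numbers, and the simple z-score threshold are all invented:

```python
# Hypothetical sketch of the disaster-warning idea: flag days on which
# tagged animals collectively move unusually far. All data and the
# threshold are invented for illustration.

from statistics import mean, stdev

def daily_displacements(tracks):
    """tracks: {animal_id: [km moved each day]} -> per-day mean displacement."""
    days = len(next(iter(tracks.values())))
    return [mean(t[d] for t in tracks.values()) for d in range(days)]

def anomalous_days(tracks, z_threshold=2.0):
    """Return day indices where mean displacement exceeds z_threshold sigmas."""
    daily = daily_displacements(tracks)
    mu, sigma = mean(daily), stdev(daily)
    return [d for d, x in enumerate(daily)
            if sigma > 0 and (x - mu) / sigma > z_threshold]

# Nine quiet days, then a sudden mass movement on day 9:
tracks = {"toad_a": [1.0] * 9 + [30.0], "toad_b": [1.0] * 9 + [30.0]}
print(anomalous_days(tracks))  # → [9]
```

A real early-warning system would need far more robust statistics than a z-score over a handful of days, but the principle–baseline the population, alert on collective outliers–is the one described above.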


It’s hard to disagree with Pschera’s argument that humans have never been further removed from the natural world than we are today. “Civilization has become a counterpart to nature,” he said. But technology is changing all that. In giving animals a digital voice, bots can bring us closer to them. “Our basic understanding of nature as something wild and different from man is losing its meaning,” said Pschera.

We’re still, however, not at the point of true dialogue. “Communication between humans and animals is made possible only by the machines or programs that use artificial intelligence from data to make connections and tell stories. The Internet is a fabulous interface but nothing more than an interface and, as such, a facade that interrupts,” Pschera wrote in his book, before asking: “Can it be more?”

The first step, he suggests, is to figure out how an animal’s mind works. And that’s hard–at the moment we have no context for interspecies communication. “It would be a gigantic step to move into this sphere,” Pschera noted, “where we can make assertions about the way an animal behaves–not on the basis of its species but on the personal history of the individual: how its life experience informs particular decisions.”

It’s not just technology that is pushing this approach forward. Historically, we’ve tended to view animals in terms of species rather than individuals. But the contemporary emphasis on individuality among humans has spread into a new way of looking at the animal world. “Following movements like the Quantified Self, and social media more generally, as a culture we’re coming around to the idea that animals, too, have a unique and complex character,” said Dr. Mancini.

For her part, Dr. Mancini is wary of objectifying animals. “Instead, we are trying to make the animal a contributor and participant: not just out of the data, but also in terms of them expressing preferences,” she said.


Other ACI projects are exploring different angles. As part of the Wild Dolphin Project, dolphin researcher Denise Herzing teamed up with AI researcher Thad Starner to create the beautifully acronymed Cetacean Hearing and Telemetry (CHAT). CHAT is an underwater device that uses a complex algorithm to play artificial whistle sounds corresponding to objects familiar to dolphins. The system can also translate incoming whistles back into English words, played into researchers’ ears. Since dolphins are famously good mimics, the idea is that they might learn these “words.”

In 2014, the team had a breakthrough. While in the water, Herzing heard the word “sargassum” (a type of seaweed) via her CHAT headphones. This meant that “essentially… the computer system heard and recognized the incoming whistle for sargassum and was triggered to say the word ‘sargassum’ in my ear. Since there was not, at this time, a second CHAT box in the water, the only explanation is that a dolphin made the whistle that triggered the word in my ear,” she revealed in a blog post. The Wild Dolphin Project is currently raising funds to develop the next iteration of its hardware and analysis tools.

Meanwhile, the dolphin communication subfield is also populated by members of the SETI (Search for Extraterrestrial Intelligence) community, such as the Interspecies Internet (I2I), which applies information theory to analyze animal communication with a view toward eventually making contact with aliens. Billed as an “idea in progress,” the project looks to logarithmic relationships among “word” frequencies in various animal sounds to determine patterns that suggest some kind of linguistic structure. Suffice it to say, such a formula would prove hugely useful when scanning space for extraterrestrial chatter.
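The classic version of this analysis is Zipf’s law: in human languages, a word’s frequency is roughly inversely proportional to its rank, so plotting log-frequency against log-rank gives a line with a slope near -1. The Python sketch below fits that slope for a corpus of signal “words”; the toy repertoire at the end is invented for illustration:

```python
# A sketch of the Zipf-style analysis: rank the distinct "words" in a
# signal corpus by frequency, then fit a least-squares line to
# log10(frequency) vs. log10(rank). Slopes near -1 are taken as a hint
# of language-like structure. The toy corpus below is invented.

import math
from collections import Counter

def zipf_slope(tokens):
    """Least-squares slope of log10(frequency) against log10(rank)."""
    freqs = sorted(Counter(tokens).values(), reverse=True)
    xs = [math.log10(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log10(f) for f in freqs]
    n = len(xs)
    x_bar, y_bar = sum(xs) / n, sum(ys) / n
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

# A perfectly Zipfian toy repertoire: frequencies 12, 6, 4, 3
# (each exactly 12 divided by its rank), so the slope is exactly -1.
corpus = ["a"] * 12 + ["b"] * 6 + ["c"] * 4 + ["d"] * 3
print(round(zipf_slope(corpus), 2))  # → -1.0
```

On a real recording the question would be whether a dolphin’s whistle repertoire, a bird’s calls–or an alien transmission–produce a slope in the same neighborhood.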

Image credit: Talia Cohen

Mancini is also interested in exploring ways for other species to communicate with us, though she’s set her sights closer to home. “Imagine I was wearing a device, which received information from a monitor attached to my dog. What if, rather than looking on a screen at her heart rate, temperature, and so on, I felt a sensation in my body: How would that change the way I tune into her? I won’t necessarily understand exactly what she’s feeling, but I could perhaps attend to her in a way I wouldn’t otherwise.”

A pragmatist, she cleverly circumvents the elephant in the room: the lack of shared language that keeps us, for the time being at least, remote from our four-legged friends. “Touch is very intimate and more primordial in a way. This isn’t an assumption based on data, but a tuning in,” she said.

After all, communication is not solely verbal, so why assume that our tech-enhanced interaction with animals will be? As Mancini pointed out, “A shared language matters less than the ability to listen.”


Wordless but meaningful. Maybe that’s how my dog will reply from the kennel: Once I’ve haptically offered him that sentiment of reassurance, I’ll receive a faint rumbling in my stomach to signal reluctant acceptance–or outright rebellion. Between us will be a machine, an artificial medium offering a new way to interpret the complexity I already see when I gaze into his familiar, intelligent eyes.

As we deconstruct the boundaries between humans and animals, technology will surely hold keys to unlocking new forms of interspecies communication. Though the methods are innovative, they serve only to make tangible what’s been there all along: what Darwin, Humboldt, and so many other naturalists sought.

In the end, will it be intelligent machines that help us find the common thread uniting all living creatures, closing the loop and finally bringing us back to nature?


How We Get To Next was a magazine that explored the future of science, technology, and culture from 2014 to 2019. This article is part of our Talking With Bots section, which asks: What does it mean now that our technology is smart enough to hold a conversation?