AI model decodes animals' emotional states from their calls

How is an animal feeling at a given moment? Humans have long recognized certain well-known behaviors, such as a cat hissing as a warning, but in many cases we've had little clue of what's going on inside an animal's head.
Now we have a better idea, thanks to a Milan-based researcher who has developed a deep-learning model that he claims can detect whether an animal's calls express positive or negative emotions. Stavros Ntalampiras's model, published in Scientific Reports, can recognize emotional tones across seven species of hoofed animals, including pigs, goats and cows. It picks up on shared features of their calls, such as pitch, frequency range and tonal quality.
The analysis showed that negative calls tended to be concentrated in the mid-to-high frequencies, while positive calls were spread more evenly across the spectrum. In pigs, high-pitched calls were especially informative, whereas in sheep and horses the mid-range carried more weight, a sign that animals share some common markers of emotion but also express them in ways that vary by species.
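To make the idea concrete, here is a minimal sketch in Python of the kind of pipeline such a study might use: summarize each call with a handful of spectral features (pitch, where the energy sits in the spectrum, how tonal or noisy the sound is) and train a simple classifier to separate positive from negative calls. The file names, labels and feature choices are illustrative placeholders, not details of Ntalampiras's published model.

```python
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def call_features(path):
    """Summarize one recorded call with the cues mentioned in the article:
    pitch, where energy sits in the spectrum, and how tonal the sound is."""
    y, sr = librosa.load(path, sr=None)
    f0 = librosa.yin(y, fmin=60, fmax=2000, sr=sr)               # pitch track
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)     # spectral "centre of mass"
    bandwidth = librosa.feature.spectral_bandwidth(y=y, sr=sr)   # frequency spread
    flatness = librosa.feature.spectral_flatness(y=y)            # tonal vs. noisy quality
    return np.array([f0.mean(), f0.std(),
                     centroid.mean(), bandwidth.mean(), flatness.mean()])

# Placeholder recordings labelled by a human observer; replace with real data.
recordings = [("pig_squeal.wav", "negative"), ("goat_contact_call.wav", "positive")]
X = np.vstack([call_features(path) for path, _ in recordings])
labels = [label for _, label in recordings]

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X, labels)
print(clf.predict(X))  # sanity check on the training calls themselves
```

In practice a species-independent model would need many labelled calls per species and careful validation, but the basic shape, acoustic features in, emotional valence out, is the same.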
For scientists who have long tried to untangle animal signals, this discovery of emotional traits across species is the latest leap forward in a field that is being transformed by AI.
The implications are far-reaching. Farmers could receive earlier warnings of livestock stress, conservationists might monitor the emotional health of wild populations remotely, and zookeepers could respond more quickly to subtle welfare changes.
This potential for a new layer of insight into the animal world also raises ethical questions. If an algorithm can reliably detect when an animal is in distress, what responsibility do humans have to act? And how do we guard against over-generalization, where we assume that all signs of arousal mean the same thing in every species?
Of barks and buzzes
Tools like the one devised by Ntalampiras are not being trained to "translate" animals in a human sense, but to detect behavioral and acoustic patterns too subtle for us to perceive unaided.
Similar work is underway with whales, where the New York-based research organization the Cetacean Translation Initiative is analyzing the click sequences, known as codas, that sperm whales use to communicate. Long believed to encode social meaning, these codas are now being mapped at scale using machine learning, revealing patterns that may correspond to each whale's identity, affiliation or emotional state.
In dogs, researchers are linking facial expressions, vocalizations and tail-wagging patterns with emotional states. Studies suggest that subtle shifts in canine facial muscles correspond to fear or excitement, and that tail-wagging varies depending on whether a dog encounters a familiar friend or a potential threat.
At Dublin City University's Insight Center for Data Analytics, we are developing a sensor-equipped collar worn by assistance dogs that are trained to recognize the onset of a seizure in people with epilepsy. The collar uses sensors to pick up on a dog's trained behaviors, such as spinning, and raises the alarm that the dog's owner is about to have a seizure.
The project, funded by Research Ireland, aims to demonstrate how AI can leverage animal communication to improve safety, support timely intervention and enhance quality of life. In future we aim to train the model to recognize instinctive dog behaviors such as pawing, nudging or barking.
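As a rough illustration of how a collar like this might flag a trained behavior, the sketch below assumes a yaw-axis gyroscope sampled at a fixed rate, integrates angular velocity over short windows and raises an alert when the accumulated rotation looks like repeated spinning. The sampling rate, window length and thresholds are hypothetical values chosen for illustration, not the settings used in our project.

```python
import numpy as np

def detect_spinning(gyro_z_dps, sample_rate_hz=50.0, window_s=5.0, min_turns=2.0):
    """Flag windows where cumulative yaw rotation suggests repeated spinning.

    gyro_z_dps: yaw-axis angular velocity in degrees per second from the
    collar's inertial sensor. All numeric settings are placeholders.
    """
    window = int(sample_rate_hz * window_s)
    alert_times_s = []
    for start in range(0, len(gyro_z_dps) - window + 1, window):
        segment = gyro_z_dps[start:start + window]
        # Integrate angular velocity over the window to get total rotation.
        total_rotation_deg = abs(np.sum(segment)) / sample_rate_hz
        if total_rotation_deg >= min_turns * 360.0:
            alert_times_s.append(start / sample_rate_hz)
    return alert_times_s

# Simulated signal: 20 s of stillness, then 10 s of steady turning at 200 deg/s.
signal = np.concatenate([np.zeros(20 * 50), np.full(10 * 50, 200.0)])
print(detect_spinning(signal))  # flags the windows containing the spinning
```

A deployed system would combine several sensors and a learned model rather than a single threshold, but the principle of turning raw movement data into an alert is the same.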
Honeybees, too, are under AI's lens. Their intricate waggle dances, the figure-of-eight movements that indicate food sources, are being decoded in real time with computer vision. These models highlight how small positional shifts influence how well other bees interpret the message.
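For readers curious how a tracked dance translates into a foraging instruction, here is a simplified sketch: given the positions of a bee during one waggle run, the principal axis of the run approximates the dance angle on the comb, and the run's duration scales with distance to the food. The frame rate and the duration-to-distance constant are placeholder values, and real systems handle detection, tracking and calibration far more carefully.

```python
import numpy as np

def decode_waggle_run(xy, fps=60.0, metres_per_second=1000.0):
    """Estimate heading and food distance from one tracked waggle run.

    xy: (N, 2) array of a bee's positions during the run, with the comb's
    vertical direction along +y. Both calibration constants are placeholders.
    """
    xy = np.asarray(xy, dtype=float)
    centred = xy - xy.mean(axis=0)
    # The principal axis of the run approximates the dance angle on the comb
    # (up to a sign ambiguity), which bees read as a bearing relative to the sun.
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    axis = vt[0]
    angle_from_vertical_deg = np.degrees(np.arctan2(axis[0], axis[1]))
    # Longer waggle runs advertise more distant food sources.
    duration_s = len(xy) / fps
    distance_m = duration_s * metres_per_second
    return angle_from_vertical_deg, distance_m
```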
Caveats
These systems promise real gains in animal welfare and safety. A collar that senses the first signs of stress in a working dog could spare it from exhaustion. A dairy herd monitored by vision-based AI might get treatment for illness hours or days sooner than a farmer would notice.
Detecting a cry of distress is not the same as understanding what it means, however. AI can show that two whale codas often occur together, or that a pig's squeal shares features with a goat's bleat. Ntalampiras's model goes further by classifying such calls as broadly positive or negative, but even this remains pattern recognition being used to infer emotions.
Emotional classifiers risk flattening rich behaviors into crude binaries of happy/sad or calm/stressed, for instance by logging a signal as contentment when it can sometimes indicate stress. As Ntalampiras notes in his study, pattern recognition is not the same as understanding.
One solution is for researchers to develop models that integrate vocal data with visual cues, such as posture or facial expression, and even physiological signals such as heart rate, to build more reliable indicators of how animals are feeling. AI models are also going to be most reliable when interpreted in context, alongside the knowledge of someone experienced with the species.
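A minimal sketch of that kind of fusion, assuming each observation has already been summarized per modality, is to concatenate the vocal, postural and physiological measurements into one feature vector and feed it to a single classifier. The feature names and example numbers below are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def fused_features(vocal, posture, heart_rate_bpm):
    """Concatenate per-observation summaries from each modality into one vector."""
    return np.concatenate([vocal, posture, [heart_rate_bpm]])

# Invented observations: vocal = [mean pitch Hz, spectral centroid Hz],
# posture = [ear-position score, body-tension score], plus heart rate.
X = np.array([
    fused_features([450.0, 1800.0], [0.2, 0.1], 65.0),   # apparently relaxed animal
    fused_features([900.0, 3200.0], [0.9, 0.8], 110.0),  # apparently distressed animal
])
labels = ["positive", "negative"]

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, labels)
print(model.predict([fused_features([600.0, 2400.0], [0.5, 0.4], 85.0)]))
```

Even a fused model, though, only becomes trustworthy when its outputs are checked against the judgment of people who know the species well.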
It's also worth bearing in mind that the ecological price of listening is high. Running AI models adds carbon costs that, in fragile ecosystems, can undercut the very conservation goals these tools claim to serve. It's therefore important that any such technologies genuinely serve animal welfare, rather than simply satisfying human curiosity.
Whether we welcome it or not, AI is here. Machines are now decoding signals that evolution honed long before us, and will continue to get better at it.
The real test, though, is not how well we listen, but what we're prepared to do with what we hear. If we burn energy decoding animal signals but only use the information to exploit them, or manage them more tightly, it's not science that falls short—it's us.
More information: Stavros Ntalampiras, Species-independent analysis and identification of emotional animal vocalizations, Scientific Reports (2025).
Journal information: Scientific Reports
Provided by The Conversation
This article is republished from The Conversation under a Creative Commons license. Read the original article.