Horses have an extensive repertoire of complex facial movements, many of them similar to humans, researchers have found.
“What we’ll now be looking at is how these expressions relate to emotional states,” said Jennifer Wathan, a doctoral researcher at Britain’s University of Sussex and a co-lead author of the study.
“What surprised us was the rich repertoire of complex facial movements in horses, and how many of them are similar to humans.
“Despite the differences in face structure between horses and humans, we were able to identify some similar expressions in relation to movements of the lips and eyes,” she said.
Some facial expressions were surprisingly similar to those of humans and chimpanzees, according to the researchers, whose findings have been published in the peer-reviewed open-access journal PLOS ONE.
Mammal communication researchers have shown that, like humans, horses use muscles underlying various facial features, including their nostrils, lips and eyes, to alter their facial expressions in a variety of social situations. The findings suggest evolutionary parallels in different species in how the face is used for communication.
This latest study builds on previous research showing that cues from the face are important for horses to communicate.
The researchers developed an objective coding system to identify different individual facial expressions on the basis of underlying muscle movement.
Their Equine Facial Action Coding System, which they have named EquiFACS, identified 17 “Action Units” – or discrete facial movements – in horses. This compared with 27 in humans, 13 in chimpanzees and 16 in dogs.
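In practice, a FACS-style system reduces an observed expression to a tally of coded movements. A minimal sketch of that idea in Python, using invented placeholder labels rather than the published EquiFACS codes:

```python
from collections import Counter

# Hypothetical FACS-style log: one entry per discrete facial movement
# observed in a video segment. The labels are illustrative placeholders,
# not the actual published EquiFACS Action Unit codes.
observations = [
    "AU_inner_brow_raiser",
    "AU_lips_part",
    "AU_ear_rotator",
    "AU_lips_part",
    "AU_nostril_dilator",
]

# Tally how often each coded movement occurred in the segment.
au_counts = Counter(observations)
print(au_counts["AU_lips_part"])  # 2
```

Coding expressions as counts of discrete, muscle-based units like this is what lets observations be compared objectively across observers, horses, and even species.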
“Horses are predominantly visual animals, with eyesight that’s better than that of domestic cats and dogs, yet their use of facial expressions has been largely overlooked,” explained Wathan.
The researchers analysed video footage of a wide range of naturally occurring horse behaviours to identify all the different movements it was possible for horses to make with their face.
They also carried out an anatomical investigation of the facial muscles that underpinned these movements. Each individual facial movement that was identified was given a code.
Co-lead author Professor Karen McComb, also from the University of Sussex, said: “It was previously thought that, in terms of other species, the further away an animal was from humans, the more rudimentary their use of facial expressions would be.
“Through the development of EquiFACS, however, it’s apparent that horses, with their complex and fluid social systems, also have an extensive range of facial movements and share many of these with humans and other animals.
“This contributes to a growing body of evidence suggesting that social factors have had a significant influence on the evolution of facial expression.”
She said that a systematic way of recording facial expressions would have a wide range of uses.
“With EquiFACS we can now document the facial movements associated with different social and emotional contexts and thus gain insights into how horses are actually experiencing their social world.
“As well as enhancing our understanding of social cognition and comparative psychology, the findings should ultimately provide important information for veterinary and animal welfare practices.”
Wathan and McComb, joined in the study by Anne Burrows, from Duquesne University in Pennsylvania, and Bridget Waller, from England’s University of Portsmouth, described EquiFACS as a reliable system that people could be trained to use.
The system described facial actions in a standardised way, avoiding subjective assessments of expression.
“This is particularly important as the recording and analysis of facial expressions can be subject to a large degree of observer bias and influenced by the perceived emotional context,” they wrote.
Horses, they said, had provided an interesting model for the study.
“Horses are long-lived social animals,” they said. “Feral populations have demonstrated that without domestic pressures horses would live in a society comprising several small groups or ‘bands’ that share space and resources, and in which membership stays relatively stable over time.
“Bands have large, overlapping ranges so horses regularly come into contact with many other conspecifics, and inter-band dominance indicates that within the larger herd established social relationships exist.
“Consequently, horses show fission-fusion dynamics: a variation of the same complex social organisation that is seen in humans, bonobos, chimpanzees, and macaques, as well as elephants, spotted hyenas and many marine mammals.
“Group life in these societies is determined by complex, long-term social relationships that must be maintained, suggesting effective communication would be adaptive.”
Horses, they noted, were predominantly visual animals, with reasonable visual acuity that was better than that of domestic cats and dogs.
“While horses’ use of head and body posture in signaling has been described in observational literature, surprisingly, their use of facial expressions has been largely overlooked,” the research team said. This was despite reports that horses routinely used some apparently complex facial expressions.
The original Facial Action Coding System (FACS) was developed for use in humans. To modify it for use in non-human animals, the first step was to document and compare the facial anatomy of horses. They noted that the muscles around the ears, lips, and nose of the horse were particularly large and complex.
They then collected and analysed 15 hours of high-quality video footage covering a wide range of naturally occurring horse behaviours. In all, 86 horses of differing ages and breeds were involved.
The interactions analysed included those with other horses, humans, and dogs. They also included feeding and mating.
Discrete facial movements were identified and their proposed muscular basis was noted. They were then coded into the FACS system.
They went on to describe the 17 discrete facial movements – or Action Units – which included raising the inner brow, eyelid actions, a range of lip, ear and nostril movements, and other actions involving the mouth, chin and jaw.
They said the 17 Action Units identified, while fewer than the 27 found in humans, were still slightly more than in most other animals for which a Facial Action Coding System had been developed. These systems had identified 13 Action Units in chimpanzees, 13 in rhesus macaques, 15 in orang-utans, 16 in hylobatids, and 16 in dogs, with only cats displaying a larger facial repertoire at 21 – largely due to their extensive whisker and ear movements.
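Laid out side by side, the cross-species comparison is easy to see. A quick sketch using only the counts given in the text:

```python
# Action Unit counts for species with a published FACS variant,
# as reported in the text above.
action_units = {
    "chimpanzee": 13,
    "rhesus macaque": 13,
    "orang-utan": 15,
    "hylobatid": 16,
    "dog": 16,
    "horse": 17,
    "cat": 21,
    "human": 27,
}

# Rank species from smallest to largest coded facial repertoire.
for species, n in sorted(action_units.items(), key=lambda kv: kv[1]):
    print(f"{species}: {n}")
```

The ordering makes the study's point visible: the horse sits above every non-human species here except the cat, despite being far more distantly related to humans than the great apes are.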
“It was previously thought that humans possessed the most complex repertoire of facial expressions and that, in [evolutionary] terms, the further away an animal was from humans, the more rudimentary their use of facial expressions would be,” the researchers said.
“However, through the development of EquiFACS it is apparent that horses also have an extensive range of facial movements, sharing many Action Units with humans and other animals.”
The study team said their research contributed to a growing body of evidence suggesting that the complexity of a species’ facial expressions was not determined simply by its evolutionary closeness to humans, but that socio-ecological factors had a significant influence.
They continued: “While recent work has suggested that horses use apparently complex facial expressions, and that certain facial movements are associated with pain in domestic horses, until now the full capacity of horse facial expressions to convey a range of information has been largely overlooked.
“In particular, no studies have yet investigated whether there are facial expressions associated with positive experiences in horses – a critical yet poorly understood aspect of animal welfare.”
They suggested the system they developed could be applied to address this gap, with the potential to greatly facilitate future studies of horse welfare as well as extending our knowledge of equine communication and cognition.
Wathan J, Burrows AM, Waller BM, McComb K (2015) EquiFACS: The Equine Facial Action Coding System. PLoS ONE 10(8): e0131738. doi:10.1371/journal.pone.0131738