Algorithm successfully identifies pain in horses through video input

The software for the labeling process when programming the computer to automatically detect signs of pain in horses. Image: Carreira Lencioni et al. https://doi.org/10.1371/journal.pone.0258672

A computer has been successfully programmed in Brazil to automatically assess the pain level in horses based on video input.

Researchers with the University of São Paulo developed and evaluated a machine vision algorithm to assess pain levels in horses.

The study team employed an automatic computational classifier based on the Horse Grimace Scale which was trained through a machine learning method.

Use of the Horse Grimace Scale, first described in 2014, traditionally depends on a trained human observer, who may not have the time to evaluate the animal for long periods.

Even with adequate training, the presence of an unknown person near an animal in pain can result in behavioral changes, making the evaluation more complex, Gabriel Carreira Lencioni and his fellow researchers noted in the journal PLOS ONE.

The study team turned to an automatic video-imaging system as a possible solution, with the potential to monitor pain responses in horses more accurately and in real time. This, in turn, could allow earlier diagnosis and more efficient treatment of animals in discomfort.

Their study was based on the assessment of facial expressions in seven horses that underwent castration. Video footage was collected via a camera positioned on top of their feeder station, capturing images at four distinct time-points daily for two days before and four days after their surgery.

A labeling process was applied to build a pain facial image database. Machine learning methods were then employed to train the computational pain classifier.

The model used to evaluate pain level according to the position of the ears was trained on 2379 images; the model for the eyes on 1436 images; and the model for the mouth and nostrils on 1035 images.
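The paper's data pipeline is not reproduced in this article, but a region-specific image collection like the one described (separate labelled sets for ears, eyes, and mouth/nostrils) could be organised along the following lines in Python with TensorFlow. The directory layout, image size, batch size and validation split are illustrative assumptions, not the authors' actual setup.

```python
# Sketch only: load one facial-region dataset (e.g. "ears") whose frames are
# sorted into one sub-folder per Horse Grimace Scale pain level.
# Directory layout, image size and batch size are illustrative assumptions.
import tensorflow as tf

IMG_SIZE = (128, 128)   # assumed input resolution
BATCH_SIZE = 32

def load_region_dataset(region_dir: str):
    """Load labelled frames for a single facial region (ears, eyes, or mouth/nostrils).

    Expected layout (hypothetical):
        region_dir/
            pain_not_present/
            pain_moderately_present/
            pain_obviously_present/
    """
    train_ds = tf.keras.utils.image_dataset_from_directory(
        region_dir,
        labels="inferred",
        label_mode="categorical",   # one-hot labels for the three pain levels
        image_size=IMG_SIZE,
        batch_size=BATCH_SIZE,
        validation_split=0.2,       # assumed hold-out fraction
        subset="training",
        seed=42,
    )
    val_ds = tf.keras.utils.image_dataset_from_directory(
        region_dir,
        labels="inferred",
        label_mode="categorical",
        image_size=IMG_SIZE,
        batch_size=BATCH_SIZE,
        validation_split=0.2,
        subset="validation",
        seed=42,
    )
    return train_ds, val_ds

# Example: one dataset per facial region, mirroring the study's three models.
ears_train, ears_val = load_region_dataset("data/ears")
```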

The machine vision algorithm was developed by training a Convolutional Neural Network (CNN), which achieved an overall accuracy of 75.8% when classifying pain on three levels: not present, moderately present, and obviously present.
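The article identifies the architecture only as a Convolutional Neural Network. A minimal Keras sketch of a three-class classifier of that general kind is shown below; the layers, sizes and hyperparameters are assumptions for illustration, not the network actually trained in the study.

```python
# Minimal sketch of a 3-class CNN classifier (pain not present / moderately
# present / obviously present). Architecture and hyperparameters are assumed,
# not taken from the study.
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 3  # not present, moderately present, obviously present

model = tf.keras.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=(128, 128, 3)),  # normalise pixel values
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),  # one output per pain level
])

model.compile(
    optimizer="adam",
    loss="categorical_crossentropy",  # matches the one-hot labels above
    metrics=["accuracy"],
)

# Training on one region's dataset (from the earlier sketch) would then be:
# model.fit(ears_train, validation_data=ears_val, epochs=20)
```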

When classifying between just two categories — pain not present and pain present — the overall accuracy reached 88.3%.
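Collapsing the three predicted levels into the two-category case reported above is a simple post-processing step. The sketch below shows one way to do it with NumPy; the class index order (0 = not present, 1 = moderately present, 2 = obviously present) and the toy predictions are assumptions.

```python
# Sketch: collapse three-level predictions into "pain present" vs "pain not
# present" and compute binary accuracy. Class index order is an assumption.
import numpy as np

def to_binary(labels: np.ndarray) -> np.ndarray:
    """Map class indices (0 = not present, 1 = moderate, 2 = obvious)
    to 0 = pain not present, 1 = pain present."""
    return (labels >= 1).astype(int)

def binary_accuracy(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Fraction of frames where the collapsed pain/no-pain labels agree."""
    return float(np.mean(to_binary(y_true) == to_binary(y_pred)))

# Toy example with hypothetical labels and predictions:
y_true = np.array([0, 1, 2, 2, 0, 1])
y_pred = np.array([0, 2, 1, 2, 1, 0])
print(binary_accuracy(y_true, y_pred))  # 0.666... for this toy data
```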

“These results show that evaluating pain automatically in horses through the use of artificial intelligence to recognize facial expressions is very promising,” they said.

“However, some improvements should be made in the future in order to produce even better outcomes, such as having a larger image bank with higher quality images in order to improve the training of the Convolutional Neural Networks, and also for a better balance between each one of the classes, resulting in an equivalent number of images for each class.

“Additionally, it would be interesting to have more trained evaluators classifying the images in order to exclude possible biases from individual discrepancies when applying the pain scales.”

The study team comprised Carreira Lencioni, Rafael Vieira de Sousa, Edson José de Souza Sardinha, Rodrigo Romero Corrêa and Adroaldo José Zanella, all with the University of São Paulo.

Lencioni GC, de Sousa RV, de Souza Sardinha EJ, Corrêa RR, Zanella AJ (2021) Pain assessment in horses using automatic facial expression recognition through deep learning-based modeling. PLoS ONE 16(10): e0258672. https://doi.org/10.1371/journal.pone.0258672

The study, published under a Creative Commons License, can be read in full via the DOI above.

Horsetalk.co.nz
