Automated recognition of emotional states of horses from facial expressions.
Authors: Feighelstein Marcelo, Riccie-Bonot Claire, Hasan Hana, Weinberg Hallel, Rettig Tidhar, Segal Maya, Distelfeld Tomer, Shimshoni Ilan, Mills Daniel S, Zamansky Anna
Journal: PLoS ONE
Summary
# Automated Recognition of Equine Emotional States from Facial Expressions

Researchers have developed artificial intelligence models capable of identifying four distinct emotional states in horses—baseline, positive anticipation, disappointment, and frustration—by analysing facial expressions from video footage. Using deep learning pipelines trained on controlled experimental data, the team achieved 76% overall accuracy in distinguishing between these states, substantially outperforming machine learning models based on manual EquiFACS (Equine Facial Action Coding System) annotations.

The system struggled to differentiate anticipation from frustration, achieving only 61% accuracy for this pairing, suggesting these emotions produce subtly similar facial signatures.

For equine professionals, this work opens practical possibilities in welfare assessment and behaviour monitoring, though the relatively modest separation between certain emotional states indicates the technology requires further refinement before clinical deployment. The findings also highlight that automated video analysis may prove more reliable than manual coding systems for detecting equine emotional expression, potentially streamlining welfare evaluation in competition, rehabilitation, and management settings.
Read the full abstract on PubMed
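Because the pipeline classifies video footage rather than single images, per-frame classifier outputs must at some point be reduced to one label per clip. Below is a minimal sketch of one common aggregation strategy, majority voting over frame predictions; this is an illustrative assumption, not necessarily the aggregation method the authors used.

```python
from collections import Counter

# The four emotional states studied in the paper.
STATES = ["baseline", "anticipation", "disappointment", "frustration"]

def video_label(frame_predictions: list[str]) -> str:
    """Collapse per-frame class predictions into a single video-level
    label by majority vote (a common final step in video classification
    pipelines; the paper's exact aggregation may differ)."""
    counts = Counter(frame_predictions)
    return counts.most_common(1)[0][0]

# Hypothetical per-frame output from a frame-level classifier:
frames = ["anticipation", "anticipation", "frustration", "anticipation"]
print(video_label(frames))  # anticipation
```

Majority voting smooths out transient misclassifications on individual frames, which matters when adjacent states (such as anticipation and frustration) produce similar momentary facial configurations.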
Practical Takeaways
- Automated facial recognition technology may provide objective methods to assess horse emotional states in practice, potentially improving welfare monitoring and training decisions.
- Current AI models show promise but require further refinement, particularly for distinguishing between similar emotional states like anticipation versus frustration.
- This technology could support farriers and handlers by providing real-time feedback on horse stress and emotional responses during procedures.
Key Findings
- Deep learning AI models achieved 76% accuracy in automatically recognizing four distinct equine emotional states (baseline, positive anticipation, disappointment, frustration) from facial video footage.
- The video-based deep learning pipeline outperformed machine learning approaches using EquiFACS annotations for emotion classification.
- Positive anticipation and frustration states were difficult to distinguish, with only 61% classification accuracy between these two emotional states.
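The 61% pairwise figure is the kind of number read off a two-class confusion matrix. The sketch below shows how such a pairwise accuracy is computed; the counts are invented purely for illustration (chosen to yield 0.61) and are not the paper's data.

```python
# Hypothetical 2x2 confusion matrix for the hard pairing.
# Keys are (true class, predicted class); counts are invented
# for illustration and chosen to sum to 100 with 61 correct.
confusion = {
    ("anticipation", "anticipation"): 31,
    ("anticipation", "frustration"): 19,
    ("frustration", "anticipation"): 20,
    ("frustration", "frustration"): 30,
}

def pairwise_accuracy(conf: dict[tuple[str, str], int]) -> float:
    """Fraction of samples on the diagonal (true == predicted)."""
    correct = sum(n for (true, pred), n in conf.items() if true == pred)
    return correct / sum(conf.values())

print(pairwise_accuracy(confusion))  # 0.61
```

Accuracy near 0.5 on a balanced two-class problem would be chance level, so 0.61 indicates the model extracts some signal separating anticipation from frustration, but not much.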