Analysis of Various Facial Expressions of Horses as a Welfare Indicator Using Deep Learning
Authors: S. Kim, G. Cho
Journal: Veterinary Sciences
Summary
# Editorial Summary: Facial Expression Recognition for Equine Welfare Assessment

Kim and Cho's 2023 research developed a deep learning model to automatically classify equine facial expressions across four distinct states—resting, pain, post-exercise, and during farriery—using images from 749 horses (586 healthy, 163 in pain). The system achieved strong overall accuracy in detecting facial landmarks (89.43% average across training, validation, and testing phases), with profile-view analysis outperforming frontal images (99.45% versus 97.59%).

Importantly, whilst the model excelled at distinguishing resting and exercise-related expressions, pain classification accuracy was notably lower, suggesting that equine pain expressions may be more subtle, variable, or easily confused with other emotional or physical states than previously assumed. These findings indicate that single-moment pain assessment based on facial features alone risks both false positives and missed diagnoses, highlighting the need for practitioners to integrate expression analysis with other clinical indicators—lameness assessment, heart rate variability, and behavioural context—rather than relying on automated facial recognition as a standalone tool.

The work nevertheless validates automated systems as a promising adjunct to subjective welfare monitoring, particularly for identifying comfort and stress states, and opens the door to longitudinal expression tracking that could reveal pain patterns invisible in static observations.
Read the full abstract on PubMed
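The gap the summary describes, high overall accuracy alongside weaker pain detection, comes down to per-class recall. A minimal, stdlib-only Python sketch with hypothetical counts (not figures from the paper) illustrates how an aggregate accuracy can mask a weak pain class:

```python
# Illustrative only: per-class recall from a confusion matrix, using
# hypothetical counts (NOT data from Kim & Cho 2023). Shows how a high
# overall accuracy can coexist with comparatively weak pain detection.
from typing import Dict, List

CLASSES = ["resting", "pain", "post-exercise", "farriery"]

# Hypothetical confusion matrix: rows = true class, columns = predicted class.
confusion: List[List[int]] = [
    [96, 1, 2, 1],    # resting: almost always recognised
    [14, 70, 9, 7],   # pain: often confused with other states
    [2, 3, 93, 2],    # post-exercise
    [1, 4, 3, 92],    # farriery
]

def per_class_accuracy(cm: List[List[int]]) -> Dict[str, float]:
    """Recall for each class: correct predictions / true instances."""
    return {CLASSES[i]: cm[i][i] / sum(row) for i, row in enumerate(cm)}

def overall_accuracy(cm: List[List[int]]) -> float:
    """Fraction of all images classified correctly (diagonal / total)."""
    correct = sum(cm[i][i] for i in range(len(cm)))
    total = sum(sum(row) for row in cm)
    return correct / total

print(f"overall: {overall_accuracy(confusion):.1%}")   # 87.8% overall
for name, recall in per_class_accuracy(confusion).items():
    print(f"{name}: {recall:.1%}")                      # pain lags at 70.0%
```

With these invented numbers, overall accuracy is 87.8% even though pain recall is only 70%, which is why the authors' per-class results matter more than the headline average.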
Practical Takeaways
- Facial expression analysis alone cannot reliably distinguish pain from other emotional states (exercise stress, tension, excitation); use this tool as part of a broader pain assessment protocol, not as a standalone diagnostic
- The lower accuracy for pain detection suggests horses show similar facial changes across different situations; combine automated facial recognition with clinical signs, behaviour observation, and gait analysis for accurate pain diagnosis
- This technology has potential for objective welfare monitoring in facilities with multiple horses (yards, racing operations, competitions), but requires further validation before clinical implementation in individual cases
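The first two takeaways amount to a fusion rule: treat the facial score as one input among several corroborating indicators. A hypothetical sketch of what such a combined protocol might look like in code; the field names, scales, and thresholds are invented for illustration and are not from the paper:

```python
# Hypothetical decision sketch (not from Kim & Cho 2023): a facial-expression
# score is only acted on when corroborated by other clinical indicators,
# as the takeaways above recommend. All thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class WelfareObservation:
    facial_pain_score: float   # 0..1 output of an expression model (assumed)
    lameness_grade: int        # e.g. AAEP lameness scale, 0-5
    hrv_abnormal: bool         # heart rate variability outside normal range
    behaviour_flags: int       # count of abnormal behaviours observed

def pain_assessment(obs: WelfareObservation) -> str:
    """Never flag pain on the facial score alone; require corroboration."""
    corroborating = (
        (obs.lameness_grade >= 2)
        + obs.hrv_abnormal
        + (obs.behaviour_flags >= 2)
    )
    if obs.facial_pain_score >= 0.7 and corroborating >= 1:
        return "likely pain - veterinary examination advised"
    if obs.facial_pain_score >= 0.7:
        return "facial score high but unsupported - re-observe and monitor"
    if corroborating >= 2:
        return "clinical signs without facial signal - examine further"
    return "no pain indication - continue routine monitoring"
```

Note that a high facial score without any supporting sign triggers re-observation rather than a pain diagnosis, mirroring the summary's warning about false positives from single-moment facial assessment.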
Key Findings
- Deep learning model achieved 89.43% average accuracy in classifying equine facial expressions across four categories (resting, pain, post-exercise, farriery)
- Profile-view analysis achieved higher accuracy (99.45%) than frontal-view analysis (97.59%) in facial posture normalization
- Pain classification accuracy was notably lower than overall classification accuracy, suggesting facial expressions vary with pain type, severity, and individual circumstances
- The eyes-nose-ears detection model achieved 88.1% testing accuracy, demonstrating the feasibility of automated equine facial recognition for welfare monitoring