Article

Pose estimation-based lameness recognition in broiler using CNN-LSTM network

Journal

COMPUTERS AND ELECTRONICS IN AGRICULTURE

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.compag.2022.106931

Keywords

Broiler; Lameness; Gait score; CNN; LSTM

Funding

  1. AgResearch at the University of Tennessee [Sb-0000000035]
  2. Foundation for Food and Agriculture Research

Abstract

In this study, a pose estimation-based model was developed for the first time to identify lameness in broilers. The model used a deep convolutional neural network and a Long Short-Term Memory (LSTM) network to classify broilers from video footage. The results showed that the model achieved high classification accuracy and could serve as an automatic, non-invasive tool for lameness assessment on poultry farms.
Poultry behavior is a critical indicator of health and welfare. Lameness is a clinical symptom indicating the existence of health problems in poultry; therefore, detecting lameness at an early stage is vital to broiler producers. In this study, a pose estimation-based model was developed, for the first time, to identify lameness in broilers by analyzing video footage. A deep convolutional neural network was used to detect and track seven key points on the bodies of walking broilers. The key points extracted from consecutive frames were then fed into a Long Short-Term Memory (LSTM) model to classify broilers according to a 6-point assessment method. This paper proposes the first large-scale benchmark for broiler pose estimation, consisting of 9,412 images. In addition, the dataset includes 400 videos (36,120 frames in total) of broilers with different gait score levels. The developed LSTM model achieved an overall classification accuracy of 95%, and the average per-class classification accuracy was 97.5%. These results demonstrate that the pose estimation-based model can be applied in poultry farms as an automatic, non-invasive tool for lameness assessment and efficient management.
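
The described pipeline (per-frame key-point detection followed by sequence classification) can be illustrated with a minimal sketch. The code below is not the authors' implementation; it assumes PyTorch, and the hyperparameters (hidden size, layer count, clip length) are illustrative placeholders that the abstract does not specify. Each frame is represented by the 2D coordinates of the seven detected key points, and an LSTM maps the resulting sequence to one of six gait-score classes.

# Minimal sketch (not the authors' code): an LSTM classifier over per-frame
# key-point vectors, following the pipeline described in the abstract.
# Hidden size, layer count, and clip length below are assumptions.
import torch
import torch.nn as nn

NUM_KEYPOINTS = 7                        # seven body key points per the abstract
FEATURES_PER_FRAME = NUM_KEYPOINTS * 2   # (x, y) coordinates per key point
NUM_CLASSES = 6                          # 6-point gait assessment method

class GaitScoreLSTM(nn.Module):
    def __init__(self, hidden_size: int = 128, num_layers: int = 2):
        super().__init__()
        # The LSTM consumes one flattened key-point vector per video frame.
        self.lstm = nn.LSTM(
            input_size=FEATURES_PER_FRAME,
            hidden_size=hidden_size,
            num_layers=num_layers,
            batch_first=True,
        )
        self.classifier = nn.Linear(hidden_size, NUM_CLASSES)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_frames, FEATURES_PER_FRAME)
        _, (h_n, _) = self.lstm(x)
        # Use the final hidden state of the last layer as the clip summary.
        return self.classifier(h_n[-1])

if __name__ == "__main__":
    model = GaitScoreLSTM()
    # Dummy batch: 4 clips of 90 frames each (clip length is an assumption).
    dummy = torch.randn(4, 90, FEATURES_PER_FRAME)
    logits = model(dummy)   # (4, 6) class scores over gait levels
    print(logits.shape)

In practice, the input sequences would come from the pose-estimation CNN rather than random tensors, and the coordinates could be normalized (e.g., to the bird's body size) before being fed to the classifier.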

