1. Teterja D, Garcia-Rodriguez J, Azorin-Lopez J, Sebastian-Gonzalez E, Nedić D, Leković D, Knežević P, Drajić D, Vukobratović D. A Video Mosaicing-Based Sensing Method for Chicken Behavior Recognition on Edge Computing Devices. Sensors (Basel) 2024; 24:3409. [PMID: 38894200] [PMCID: PMC11174875] [DOI: 10.3390/s24113409]
Abstract
Chicken behavior recognition is crucial for a number of reasons, including promoting animal welfare, ensuring the early detection of health issues, optimizing farm management practices, and contributing to more sustainable and ethical poultry farming. In this paper, we introduce a technique for recognizing chicken behavior on edge computing devices based on video sensing mosaicing. Our method combines video sensing mosaicing with deep learning to accurately identify specific chicken behaviors from videos. It attains remarkable accuracy, achieving 79.61% with MobileNetV2 for chickens demonstrating three types of behavior. These findings underscore the efficacy and promise of our approach in chicken behavior recognition on edge computing devices, making it adaptable for diverse applications. The ongoing exploration and identification of various behavioral patterns will contribute to a more comprehensive understanding of chicken behavior, enhancing the scope and accuracy of behavior analysis within diverse contexts.
Affiliation(s)
- Dmitrij Teterja
- Department of Computer Science and Technology, University of Alicante, 03690 San Vicente del Raspeig, Alicante, Spain
- Jose Garcia-Rodriguez
- Department of Computer Science and Technology, University of Alicante, 03690 San Vicente del Raspeig, Alicante, Spain
- Jorge Azorin-Lopez
- Department of Computer Science and Technology, University of Alicante, 03690 San Vicente del Raspeig, Alicante, Spain
- Daliborka Nedić
- DunavNet DOO, Bulevar Oslobođenja 133/2, 21000 Novi Sad, Serbia
- Dalibor Leković
- DunavNet DOO, Bulevar Oslobođenja 133/2, 21000 Novi Sad, Serbia
- Petar Knežević
- DunavNet DOO, Bulevar Oslobođenja 133/2, 21000 Novi Sad, Serbia
- Dejan Drajić
- DunavNet DOO, Bulevar Oslobođenja 133/2, 21000 Novi Sad, Serbia
- Paviljon Računskog Centra, Department of Telecommunications, School of Electrical Engineering, University of Belgrade, Bulevar kralja Aleksandra 73, 11120 Belgrade, Serbia
- Dejan Vukobratović
- Faculty of Technical Sciences, University of Novi Sad, Trg Dositeja Obradovića 6, 21000 Novi Sad, Serbia
2. Guo Y, Regmi P, Ding Y, Bist RB, Chai L. Automatic detection of brown hens in cage-free houses with deep learning methods. Poult Sci 2023; 102:102784. [PMID: 37302327] [PMCID: PMC10276268] [DOI: 10.1016/j.psj.2023.102784]
Abstract
Computer vision technologies have been tested for monitoring animals' behaviors and performance. The high stocking density and small body size of chickens such as broilers and cage-free layers make effective automated monitoring quite challenging, so it is critical to improve the accuracy and robustness of laying-hen cluster detection. In this study, we established a laying-hen detection model, YOLOv5-C3CBAM-BiFPN, and tested its performance in detecting birds on open litter. The model consists of 3 parts: 1) the basic YOLOv5 model for feature extraction and target detection of laying hens; 2) the convolutional block attention module integrated with the C3 module (C3CBAM), to improve the detection of targets, including occluded ones; and 3) a bidirectional feature pyramid network (BiFPN), used to enhance the transmission of feature information between different network layers and improve the accuracy of the algorithm. To better evaluate the effectiveness of the new model, a total of 720 images containing different numbers of laying hens were selected to construct complex datasets with different degrees of occlusion and density. In addition, the proposed model was compared with YOLOv5 models combining other attention mechanisms. The test results show that the improved YOLOv5-C3CBAM-BiFPN model achieved a precision of 98.2%, a recall of 92.9%, a mAP (IoU = 0.5) of 96.7%, a classification rate of 156.3 frames per second, and an F1 score of 95.4%. In other words, the deep-learning-based laying-hen detection method proposed in the present study performs excellently, can identify targets accurately and quickly, and can be applied to real-time detection of laying hens in real-world production environments.
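The precision, recall, and F1 figures reported above are connected by the standard harmonic-mean relation F1 = 2PR/(P+R). A pure-Python sketch, using made-up detection counts chosen so the rates land near the figures quoted (these are not the paper's data):

```python
def precision_recall_f1(tp: int, fp: int, fn: int):
    """Standard detection metrics from true-positive, false-positive,
    and false-negative counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Illustrative counts for a hen-detection test set (hypothetical, not from the paper).
p, r, f1 = precision_recall_f1(tp=929, fp=17, fn=71)
print(f"precision={p:.1%}  recall={r:.1%}  F1={f1:.1%}")
```

Note that the F1 score is always between precision and recall, and closer to the smaller of the two; that is why the reported F1 (95.4%) sits between the 92.9% recall and 98.2% precision.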
Affiliation(s)
- Yangyang Guo
- School of Internet, Anhui University, Hefei, Anhui 230039, China; Department of Poultry Science, University of Georgia, Athens, GA 30602, USA
- Prafulla Regmi
- Department of Poultry Science, University of Georgia, Athens, GA 30602, USA
- Yi Ding
- School of Internet, Anhui University, Hefei, Anhui 230039, China
- Lilong Chai
- Department of Poultry Science, University of Georgia, Athens, GA 30602, USA
3. Bist RB, Yang X, Subedi S, Chai L. Mislaying behavior detection in cage-free hens with deep learning technologies. Poult Sci 2023; 102:102729. [PMID: 37192567] [DOI: 10.1016/j.psj.2023.102729]
Abstract
Floor egg-laying behavior (FELB) is one of the most concerning issues in commercial cage-free (CF) houses because floor eggs (i.e., eggs mislaid on the floor) result in high labor costs and food safety concerns. Farms with poor management may have up to 10% of eggs laid on the floor daily, so it is critical to improve floor egg management; detecting FELB in a timely manner and identifying the reason behind it may address the issue. The primary objectives of this research were to develop and test a new deep learning model to detect FELB and to evaluate the model's performance in 4 identical research CF houses (200 Hy-Line W-36 hens per house), where perches and a litter floor were provided to mimic a commercial tiered aviary system. Five different YOLOv5 models (i.e., YOLOv5n, YOLOv5s, YOLOv5m, YOLOv5l, and YOLOv5x) were trained and compared. On a dataset of 5,400 images (i.e., 3,780 for training, 1,080 for validation, and 540 for testing), the YOLOv5m-FELB and YOLOv5x-FELB models achieved higher precision (99.9%), recall (99.2%), mAP@0.50 (99.6%), and F1 score (99.6%) than the others. However, the YOLOv5m-NFELB model had lower recall than the other YOLOv5-NFELB models, although it achieved higher precision. Similarly, the YOLOv5s model achieved faster data processing (4%-45% higher FPS) and faster training (by 3%-148%) while requiring 1.8-4.8 times less GPU than the other models. Furthermore, a camera height of 0.5 m and a clean camera outperformed a 3 m height and a dusty camera. Thus, the newly developed and trained YOLOv5s model will be further improved, and future studies will verify its performance in detecting FELB in commercial CF houses.
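The mAP@0.50 criterion used above counts a detection as correct when its predicted box overlaps a ground-truth box with an intersection-over-union (IoU) of at least 0.5. A minimal IoU sketch, with hypothetical box coordinates for illustration:

```python
def iou(a, b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# A predicted FELB box vs. a ground-truth box (made-up coordinates).
pred, truth = (10, 10, 50, 50), (15, 15, 55, 55)
score = iou(pred, truth)
print(f"IoU = {score:.3f}, matched at IoU >= 0.5: {score >= 0.5}")
```

mAP@0.50 then averages precision over recall levels for detections matched under this threshold, per class; the 0.5 cutoff is the conventional PASCAL VOC setting.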
Affiliation(s)
- Ramesh Bahadur Bist
- Department of Poultry Science, College of Agricultural & Environmental Sciences, University of Georgia, Athens, GA 30602, USA
- Xiao Yang
- Department of Poultry Science, College of Agricultural & Environmental Sciences, University of Georgia, Athens, GA 30602, USA
- Sachin Subedi
- Department of Poultry Science, College of Agricultural & Environmental Sciences, University of Georgia, Athens, GA 30602, USA
- Lilong Chai
- Department of Poultry Science, College of Agricultural & Environmental Sciences, University of Georgia, Athens, GA 30602, USA
4. Feather Damage Monitoring System Using RGB-Depth-Thermal Model for Chickens. Animals (Basel) 2022; 13:126. [PMID: 36611735] [PMCID: PMC9817991] [DOI: 10.3390/ani13010126]
Abstract
Feather damage is a continuous health and welfare challenge among laying hens. Infrared thermography can evaluate changes in surface temperature caused by an inflammatory process, making it possible to objectively determine the depth of damage to the dermis. Therefore, the objective of this article was to develop an approach to feather damage assessment based on visible light and infrared thermography. Fusing information from these two bands combines their strengths, which is especially valuable in the assessment of feather damage. A novel pipeline was proposed to reconstruct RGB-Depth-Thermal maps of the chicken using binocular color cameras and a thermal infrared camera. Stereo matching of the binocular color images yielded a depth image. A heterogeneous image registration method was then presented to align the thermal infrared and color images, so that the thermal infrared image was also aligned with the depth image. The chicken was segmented from the background using a deep-learning network operating on the color and depth images. Four kinds of images, namely color, depth, thermal, and mask, were used as inputs to reconstruct a 3D model of the chicken with RGB-Depth-Thermal maps. The proposed model assesses the depth of feather damage better than a 2D thermal infrared image or color image alone, during both day and night, providing a reference for further research in poultry farming.
5. Monitoring Behaviors of Broiler Chickens at Different Ages with Deep Learning. Animals (Basel) 2022; 12:3390. [PMID: 36496910] [PMCID: PMC9736866] [DOI: 10.3390/ani12233390]
Abstract
Animal behavior monitoring allows information on animal health and living habits to be gathered and is an important technical means in precision animal farming. To quickly and accurately identify the behavior of broilers at different ages, we adopted several deep learning behavior recognition models. First, top-view images of broilers at 2, 9, 16, and 23 days of age were obtained. At each age, 300 images of each of the four broiler behaviors (i.e., feeding, drinking, standing, and resting) were segmented, totaling 4,800 images. After image augmentation, 10,200 images were generated for each age, comprising 8,000 training, 2,000 validation, and 200 testing images. Finally, the performance of different convolutional neural network (CNN) models in broiler behavior recognition at different ages was analyzed. The results show that the DenseNet-264 network performed best overall, with accuracy rates of 88.5%, 97%, 94.5%, and 90% at 2, 9, 16, and 23 days of age, respectively. In addition, efficient channel attention was introduced into the DenseNet-264 network (ECA-DenseNet-264), and the results (accuracy rates: 85%, 95%, 92%, 89.5%) confirmed that the plain DenseNet-264 network remained the best overall. These results demonstrate that it is feasible to apply deep learning technology to monitor the behavior of broilers at different ages.
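The accuracy rates reported above follow the usual top-1 accuracy definition: the fraction of test images whose predicted behavior matches the true label. A minimal sketch with made-up labels (not the paper's data):

```python
def top1_accuracy(predicted, actual):
    """Fraction of samples whose predicted behavior matches the true label."""
    assert len(predicted) == len(actual)
    return sum(p == a for p, a in zip(predicted, actual)) / len(actual)

# Made-up labels over the four behaviors for 8 hypothetical test frames.
truth = ["feeding", "feeding", "drinking", "standing",
         "resting", "resting", "drinking", "standing"]
preds = ["feeding", "drinking", "drinking", "standing",
         "resting", "resting", "drinking", "feeding"]

acc = top1_accuracy(preds, truth)
print(f"accuracy = {acc:.1%}")  # 6 of 8 correct
```

With only 200 test images per age, each misclassified frame shifts accuracy by 0.5 percentage points, which is worth keeping in mind when comparing rates such as 94.5% vs. 92%.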
6. Castro F, Chai L, Arango J, Owens C, Smith P, Reichelt S, DuBois C, Menconi A. Poultry industry paradigms: connecting the dots. J Appl Poult Res 2022. [DOI: 10.1016/j.japr.2022.100310]