1
Mluba HS, Atif O, Lee J, Park D, Chung Y. Pattern Mining-Based Pig Behavior Analysis for Health and Welfare Monitoring. Sensors (Basel). 2024;24:2185. [PMID: 38610396; PMCID: PMC11013991; DOI: 10.3390/s24072185]
Abstract
The increasing popularity of pigs has prompted farmers to increase pig production to meet the growing demand. However, while the number of pigs is increasing, the number of farm workers has been declining, making it challenging to perform various farm tasks, the most important among them being managing the pigs' health and welfare. This study proposes a pattern mining-based pig behavior analysis system that provides visualized information and behavioral patterns, assisting farmers in effectively monitoring and assessing pigs' health and welfare. The system consists of four modules: (1) a data acquisition module for collecting pig video; (2) a detection and tracking module for localizing and uniquely identifying pigs, using the tracking information to crop pig images; (3) a pig behavior recognition module for recognizing pig behaviors from sequences of cropped images; and (4) a pig behavior analysis module for providing visualized information and behavioral patterns that help farmers understand and manage pigs. In the second module, we utilize ByteTrack, which comprises YOLOX as the detector and the BYTE algorithm as the tracker, while MnasNet and an LSTM serve as the appearance-feature and temporal-information extractors in the third module. The experimental results show that the system achieved a multi-object tracking accuracy of 0.971 for tracking and an F1 score of 0.931 for behavior recognition, while also highlighting the effectiveness of visualization and pattern mining in helping farmers comprehend and manage pigs' health and welfare.
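The hand-off from module (2) to module (3) — turning each pig's track box into a cropped image region for the behavior network — can be sketched as follows. This is an illustrative sketch only, not the authors' code; the function name, the 10% margin, and the (x1, y1, x2, y2) box format are assumptions:

```python
def crop_region(box, frame_w, frame_h, margin=0.1):
    """Expand an (x1, y1, x2, y2) track box by a relative margin and
    clamp it to the frame, yielding the region to crop for recognition."""
    x1, y1, x2, y2 = box
    pad_x = (x2 - x1) * margin
    pad_y = (y2 - y1) * margin
    return (
        max(0, int(x1 - pad_x)),
        max(0, int(y1 - pad_y)),
        min(frame_w, int(x2 + pad_x)),
        min(frame_h, int(y2 + pad_y)),
    )
```

The margin keeps a little context around the pig, and the clamping prevents out-of-bounds crops for animals near the pen walls.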
Affiliation(s)
- Hassan Seif Mluba
  - Department of Computer and Information Science, Korea University, Sejong City 30019, Republic of Korea
- Othmane Atif
  - Department of Computer and Information Science, Korea University, Sejong City 30019, Republic of Korea
- Jonguk Lee
  - Department of Computer Convergence Software, Sejong Campus, Korea University, Sejong City 30019, Republic of Korea
- Daihee Park
  - Department of Computer Convergence Software, Sejong Campus, Korea University, Sejong City 30019, Republic of Korea
- Yongwha Chung
  - Department of Computer Convergence Software, Sejong Campus, Korea University, Sejong City 30019, Republic of Korea
2
Zhou H, Chung S, Kakar JK, Kim SC, Kim H. Pig Movement Estimation by Integrating Optical Flow with a Multi-Object Tracking Model. Sensors (Basel). 2023;23:9499. [PMID: 38067875; PMCID: PMC10708576; DOI: 10.3390/s23239499]
Abstract
Pig husbandry constitutes a significant segment of livestock farming, and porcine well-being is a paramount concern because of its direct implications for pig breeding and production. An easily observable proxy for pig health is the animals' daily movement patterns: more active pigs are usually healthier than inactive ones, so movement gives farmers a way to identify a pig's health state before it becomes sick or its condition becomes life-threatening. However, conventional means of estimating pig mobility rely largely on manual observation by farmers, which is impractical in contemporary centralized, large-scale pig farming operations. In response to these challenges, multi-object tracking and pig behavior recognition methods have been adopted to monitor pig health and welfare closely. Regrettably, these existing methods frequently fall short of providing precise, quantified measurements of movement distance, yielding only a rudimentary metric for assessing pig health. This paper proposes a novel approach that integrates optical flow with a multi-object tracking algorithm to gauge pig movement more accurately, based on both qualitative and quantitative analyses of the shortcomings of relying solely on tracking algorithms. Optical flow records accurate movement between two consecutive frames, and the multi-object tracking algorithm provides an individual track for each pig; combining the two allows each pig's movement to be estimated accurately. Moreover, incorporating optical flow makes it possible to discern partial movements, such as instances where only the pig's head is in motion while the rest of its body remains stationary.
The experimental results show that the proposed method is superior to using tracking results (i.e., bounding boxes) alone: movement calculated from bounding boxes is easily affected by fluctuations in box size, whereas optical flow avoids this drawback and even provides more fine-grained motion information. These properties yield more accurate and comprehensive information, enhancing decision-making and management in pig farming.
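The core combination described above — restricting a dense optical-flow field to each tracked box and reducing it to a per-pig movement value — might look like the following. This is a hedged sketch under assumed data layouts (flow as a 2-D grid of (dx, dy) vectors, boxes as pixel coordinates), not the authors' implementation:

```python
import math

def box_movement(flow, box):
    """Mean optical-flow magnitude inside one pig's tracked bounding box.

    flow: 2-D grid of per-pixel (dx, dy) vectors, flow[y][x] == (dx, dy),
          computed between two consecutive frames.
    box:  (x1, y1, x2, y2) from the multi-object tracker.
    """
    x1, y1, x2, y2 = box
    mags = [
        math.hypot(dx, dy)
        for row in flow[y1:y2]
        for (dx, dy) in row[x1:x2]
    ]
    return sum(mags) / len(mags) if mags else 0.0
```

Because the value is an average of per-pixel displacements rather than a box-center delta, partial movements (e.g., only the head moving) still register, while box-size jitter does not.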
Affiliation(s)
- Heng Zhou
  - Department of Electronics and Information Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea
  - Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Seyeon Chung
  - Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Junaid Khan Kakar
  - Department of Electronics and Information Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea
  - Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Sang Cheol Kim
  - Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Hyongsuk Kim
  - Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
  - Department of Electronics Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea
3
Wang S, Jiang H, Qiao Y, Jiang S. A Method for Obtaining 3D Point Cloud Data by Combining 2D Image Segmentation and Depth Information of Pigs. Animals (Basel). 2023;13:2472. [PMID: 37570282; PMCID: PMC10417003; DOI: 10.3390/ani13152472]
Abstract
This paper proposes a method for automatic pig detection and segmentation using RGB-D data for precision livestock farming. The proposed method combines an enhanced YOLOv5s model with the Res2Net bottleneck structure, improving fine-grained feature extraction and ultimately enhancing the precision of pig detection and segmentation in 2D images. Additionally, the method facilitates acquiring 3D point cloud data of pigs in a simpler and more efficient way, by combining the pig mask obtained from 2D detection and segmentation with depth information. To evaluate the effectiveness of the proposed method, two datasets were constructed: the first consists of 5400 images captured in various pig pens under diverse lighting conditions, while the second was obtained from the UK. The experimental results demonstrate that the improved YOLOv5s_Res2Net achieved a mAP@0.5:0.95 of 89.6% and 84.8% for the pig detection and segmentation tasks, respectively, on our dataset, and 93.4% and 89.4% on the Edinburgh pig behaviour dataset. This approach provides valuable insights for improving pig management, conducting welfare assessments, and estimating weight accurately.
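The mask-plus-depth step described above amounts to back-projecting only the segmented pixels through a pinhole camera model. The sketch below illustrates that idea under assumed conventions (a 0/1 mask, a depth map in the desired output units, and intrinsics fx, fy, cx, cy); it is not the authors' code:

```python
def mask_depth_to_points(mask, depth, fx, fy, cx, cy):
    """Back-project masked depth pixels into 3-D camera-frame points.

    mask:  2-D list of 0/1 values from the 2-D pig segmentation.
    depth: 2-D list of depth values aligned with the mask.
    Returns a list of (X, Y, Z) points for valid masked pixels.
    """
    points = []
    for v, (mask_row, depth_row) in enumerate(zip(mask, depth)):
        for u, (m, z) in enumerate(zip(mask_row, depth_row)):
            if m and z > 0:  # skip background and invalid (zero) depth
                points.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return points
```

Filtering by the mask first is what makes the point cloud per-pig rather than per-scene, which is the efficiency gain the abstract claims over segmenting in 3-D directly.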
Affiliation(s)
- Shunli Wang
  - College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Honghua Jiang
  - College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Yongliang Qiao
  - Australian Institute for Machine Learning (AIML), The University of Adelaide, Adelaide, SA 5005, Australia
- Shuzhen Jiang
  - Key Laboratory of Efficient Utilisation of Non-Grain Feed Resources (Co-Construction by Ministry and Province), Ministry of Agriculture and Rural Affairs, Department of Animal Science and Technology, Shandong Agricultural University, Tai’an 271018, China
4
Atif O, Lee J, Park D, Chung Y. Behavior-Based Video Summarization System for Dog Health and Welfare Monitoring. Sensors (Basel). 2023;23:2892. [PMID: 36991606; PMCID: PMC10054391; DOI: 10.3390/s23062892]
Abstract
The popularity of dogs has been increasing owing to factors such as the physical and mental health benefits associated with raising them. While owners care about their dogs' health and welfare, it is difficult for them to assess these, and frequent veterinary checkups represent a growing financial burden. In this study, we propose a behavior-based video summarization and visualization system for monitoring a dog's behavioral patterns to help assess its health and welfare. The system proceeds in four modules: (1) a video data collection and preprocessing module; (2) an object detection-based module for retrieving image sequences where the dog is alone and cropping them to reduce background noise; (3) a dog behavior recognition module using two-stream EfficientNetV2 to extract appearance and motion features from the cropped images and their respective optical flow, followed by a long short-term memory (LSTM) model to recognize the dog's behaviors; and (4) a summarization and visualization module to provide effective visual summaries of the dog's location and behavior information to help assess and understand its health and welfare. The experimental results show that the system achieved an average F1 score of 0.955 for behavior recognition, with an execution time allowing real-time processing, while the summarization and visualization results demonstrate how the system can help owners assess and understand their dog's health and welfare.
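The summarization module (4) essentially aggregates the per-sequence behavior labels produced by module (3) into totals an owner can read at a glance. A minimal sketch of that aggregation, with the function name, frame rate, and window length as illustrative assumptions rather than the authors' parameters:

```python
from collections import Counter

def behavior_summary(labels, fps=30, window=16):
    """Aggregate per-window behavior labels into total seconds per behavior.

    labels: one predicted behavior label per window of `window` frames,
            as produced by a sequence classifier (e.g., CNN features + LSTM).
    """
    seconds_per_window = window / fps
    counts = Counter(labels)
    return {b: round(n * seconds_per_window, 2) for b, n in counts.items()}
```

A dictionary like this is the natural input for the visual summaries the abstract describes (e.g., a bar chart of time spent per behavior).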
Affiliation(s)
- Othmane Atif
  - Department of Computer and Information Science, Korea University, Sejong City 30019, Republic of Korea
- Jonguk Lee
  - Department of Computer Convergence Software, Sejong Campus, Korea University, Sejong City 30019, Republic of Korea
- Daihee Park
  - Department of Computer Convergence Software, Sejong Campus, Korea University, Sejong City 30019, Republic of Korea
- Yongwha Chung
  - Department of Computer Convergence Software, Sejong Campus, Korea University, Sejong City 30019, Republic of Korea
5
Son S, Ahn H, Baek H, Yu S, Suh Y, Lee S, Chung Y, Park D. StaticPigDet: Accuracy Improvement of Static Camera-Based Pig Monitoring Using Background and Facility Information. Sensors (Basel). 2022;22:8315. [PMID: 36366013; PMCID: PMC9655159; DOI: 10.3390/s22218315]
Abstract
The automatic detection of individual pigs can improve the overall management of pig farms. The accuracy of single-image object detection has significantly improved over the years with advancements in deep learning techniques. However, differences in pig sizes and complex structures within the pig pens of a commercial pig farm, such as feeding facilities, present challenges to detection accuracy for pig monitoring. To implement such detection in practice, these differences should be analyzed in video recorded from a static camera. To accurately detect individual pigs that may differ in size or be occluded by complex structures, we present a deep-learning-based object detection method utilizing background and facility information generated from image sequences (i.e., video) recorded from a static camera. All images are first preprocessed to reduce differences in pig sizes. We then use the extracted background and facility information to create different combinations of gray images. Finally, these gray images are combined into different three-channel composite images, which are used as training datasets to improve detection accuracy. Using the proposed method as an image-processing component improved overall accuracy from 84% to 94%. The study also showed that accurate background and facility images can be generated through long-term updating, which helps detection accuracy. Future work could consider improving detection accuracy for overlapping pigs.
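The three-channel composition step described above — stacking the current gray frame with the generated background and facility gray images — can be illustrated as follows. This is a simplified sketch of the idea with assumed names and a plain-list image representation, not the authors' pipeline:

```python
def compose_channels(gray, background, facility):
    """Combine three aligned single-channel images (current frame,
    generated background, extracted facility map) into one three-channel
    composite image, pixel by pixel.

    Each input is a 2-D list of intensities with identical dimensions;
    the output stores an (R, G, B)-style tuple per pixel.
    """
    return [
        [(g, b, f) for g, b, f in zip(g_row, b_row, f_row)]
        for g_row, b_row, f_row in zip(gray, background, facility)
    ]
```

Feeding such composites to the detector lets it see, per pixel, both the live frame and what the scene looks like without pigs, which is how the background and facility cues enter training.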
Affiliation(s)
- Seungwook Son
  - Department of Computer Convergence Software, Korea University, Sejong 30019, Korea
- Hanse Ahn
  - Department of Computer Convergence Software, Korea University, Sejong 30019, Korea
- Hwapyeong Baek
  - Department of Computer Convergence Software, Korea University, Sejong 30019, Korea
- Seunghyun Yu
  - Department of Computer Convergence Software, Korea University, Sejong 30019, Korea
- Yooil Suh
  - Info Valley Korea Co., Ltd., Anyang 14067, Korea
- Sungju Lee
  - Department of Software, Sangmyung University, Cheonan 31066, Korea
- Yongwha Chung
  - Department of Computer Convergence Software, Korea University, Sejong 30019, Korea
- Daihee Park
  - Department of Computer Convergence Software, Korea University, Sejong 30019, Korea
6
Wang S, Jiang H, Qiao Y, Jiang S, Lin H, Sun Q. The Research Progress of Vision-Based Artificial Intelligence in Smart Pig Farming. Sensors (Basel). 2022;22:6541. [PMID: 36080994; PMCID: PMC9460267; DOI: 10.3390/s22176541]
Abstract
Pork accounts for an important proportion of livestock products. Pig farming requires considerable manpower, material resources, and time to monitor pig health and welfare. As the number of pigs being farmed increases, the continued use of traditional monitoring methods may cause stress and harm to pigs and farmers and affect pig health and welfare as well as the economic output of farming. The application of artificial intelligence has therefore become a core part of smart pig farming. Precision pig farming systems use sensors such as cameras and radio frequency identification to monitor biometric information such as pig sound and behavior in real time and convert it into key indicators of pig health and welfare. By analyzing these key indicators, problems in pig health and welfare can be detected early and timely intervention and treatment provided, which helps to improve the production and economic efficiency of pig farming. This paper reviews more than 150 papers on precision pig farming, summarizing and evaluating the application of artificial intelligence technologies to pig detection, tracking, behavior recognition, and sound recognition. Finally, we summarize and discuss the opportunities and challenges of precision pig farming.
Affiliation(s)
- Shunli Wang
  - College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Honghua Jiang
  - College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Yongliang Qiao (corresponding author)
  - Australian Centre for Field Robotics (ACFR), Faculty of Engineering, The University of Sydney, Sydney, NSW 2006, Australia
- Shuzhen Jiang
  - College of Animal Science and Veterinary Medicine, Shandong Agricultural University, Tai’an 271018, China
- Huaiqin Lin
  - College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Qian Sun
  - College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China