1
Zhou H, Chung S, Kakar JK, Kim SC, Kim H. Pig Movement Estimation by Integrating Optical Flow with a Multi-Object Tracking Model. Sensors (Basel) 2023;23:9499. PMID: 38067875; PMCID: PMC10708576; DOI: 10.3390/s23239499. Received 10/10/2023; revised 11/23/2023; accepted 11/27/2023.
Abstract
Pig husbandry constitutes a significant segment of livestock farming, and porcine well-being is a paramount concern because of its direct implications for pig breeding and production. An easily observable proxy for a pig's health is its daily movement pattern: more active pigs are usually healthier than inactive ones, so movement gives farmers a way to identify a pig's health state before it becomes sick or its condition becomes life-threatening. However, conventional means of estimating pig mobility rely largely on manual observation by farmers, which is impractical in contemporary centralized and large-scale pig farming operations. In response to these challenges, multi-object tracking and pig behavior recognition methods have been adopted to monitor pig health and welfare closely. Unfortunately, these existing methods frequently fall short of providing precise, quantified measurements of movement distance, yielding only a rudimentary metric for assessing pig health. This paper proposes a novel approach that integrates optical flow with a multi-object tracking algorithm to gauge pig movement more accurately, based on both qualitative and quantitative analyses of the shortcomings of relying solely on tracking algorithms. The optical flow records accurate movement between two consecutive frames, and the multi-object tracking algorithm offers an individual track for each pig; by combining the two, the approach can accurately estimate each pig's movement. Moreover, incorporating optical flow makes it possible to discern partial movements, such as instances where only the pig's head is in motion while the remainder of its body remains stationary.
The experimental results show that the proposed method outperforms the approach of relying solely on tracking results, i.e., bounding boxes: movement calculated from bounding boxes is easily affected by fluctuations in box size, whereas the optical-flow data avoid this drawback and provide more fine-grained motion information. These virtues yield more accurate and comprehensive information, enhancing decision-making and management in pig farming.
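As an illustrative sketch of this idea (not the authors' exact pipeline), per-pig movement can be estimated by averaging the dense optical-flow magnitude inside each tracked bounding box and accumulating it along the track. In a real system the flow field would come from a dense optical-flow estimator (e.g., OpenCV's Farneback method) and the boxes from the tracker; the function names below are hypothetical:

```python
import math

def mean_flow_magnitude(flow, box):
    """Average optical-flow magnitude inside one pig's bounding box.

    flow: 2D grid of (dx, dy) vectors, one per pixel (flow[y][x]).
    box:  (x1, y1, x2, y2) from the tracker, pixel coordinates.
    """
    x1, y1, x2, y2 = box
    mags = [math.hypot(dx, dy)
            for row in flow[y1:y2]
            for (dx, dy) in row[x1:x2]]
    return sum(mags) / len(mags) if mags else 0.0

def accumulate_movement(per_frame_flows, track_boxes):
    """Sum each pig's mean in-box flow over its track.

    per_frame_flows: list of flow fields, one per consecutive frame pair.
    track_boxes: dict mapping pig id -> list of boxes, one per frame pair.
    """
    totals = {pid: 0.0 for pid in track_boxes}
    for pid, boxes in track_boxes.items():
        for flow, box in zip(per_frame_flows, boxes):
            totals[pid] += mean_flow_magnitude(flow, box)
    return totals
```

Because the movement measure averages flow over the box interior rather than differencing box centers, it also registers partial motion (e.g., a moving head inside a stationary box), which is the benefit the abstract highlights.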
Affiliation(s)
- Heng Zhou: Department of Electronics and Information Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea; Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Seyeon Chung: Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Junaid Khan Kakar: Department of Electronics and Information Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea; Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Sang Cheol Kim: Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Hyongsuk Kim: Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea; Department of Electronics Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea
2
Wei J, Tang X, Liu J, Zhang Z. Detection of Pig Movement and Aggression Using Deep Learning Approaches. Animals (Basel) 2023;13:3074. PMID: 37835680; PMCID: PMC10571548; DOI: 10.3390/ani13193074. Received 09/01/2023; revised 09/25/2023; accepted 09/29/2023.
Abstract
Motion and aggressive behaviors in pigs provide important information for the study of social hierarchies and can serve as selection indicators for pig health and aggression parameters. However, relying only on visual observation or surveillance video to count aggressive acts is time-consuming and labor-intensive, covers only a short period relative to the pigs' growth cycle, and is impractical to record completely on large farms. In addition, because assessing the intensity of pig aggression is a complex process, manual recording is strongly influenced by subjective human judgement. To efficiently record pig motion and aggressive behaviors as parameters for breeding selection and behavioral studies, videos and pictures were collected from typical commercial farms, with each unit housing 8–20 pigs in a 7–25 m² space; the pigs were bred in stable social groups, and cameras recorded the whole day's activities. We propose a deep learning-based method for detecting and recognizing the movement and aggressive behaviors of pigs, annotating head-to-head tapping, head-to-body tapping, neck biting, body biting, and ear biting during fighting. The method uses an improved EMA-YOLOv8 model and a target-tracking algorithm to assign a unique digital identity code to each pig while recognizing, recording, and tracking pig motion and aggressive behaviors, thereby providing statistics on the speed and duration of pig motion. On the test dataset, the average precision of the model was 96.4%, indicating high accuracy in detecting a pig's identity and its fighting behaviors. The detection results were highly correlated with manual recordings (R² of 0.9804 and 0.9856, respectively), indicating that the method is accurate and effective.
In summary, the method detects and identifies the motion duration and aggressive behavior of pigs under natural conditions, providing reliable data and technical support for the study of pig social hierarchy and the selection of pig health and aggression phenotypes.
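The speed and duration statistics mentioned above can, in principle, be derived from tracker output alone. The following hypothetical sketch (not the paper's code; the threshold is an assumed parameter) computes per-frame speed and total moving time from one pig's track of box centroids:

```python
def track_motion_stats(centroids, fps=25.0, moving_thresh=2.0):
    """Per-frame speeds (pixels/s) and moving duration (s) for one track.

    centroids:     list of (x, y) bounding-box centers, one per frame.
    fps:           video frame rate.
    moving_thresh: per-frame displacement (pixels) below which the pig
                   is counted as idle.
    """
    speeds, moving_frames = [], 0
    for (x0, y0), (x1, y1) in zip(centroids, centroids[1:]):
        disp = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        speeds.append(disp * fps)          # displacement per frame -> per second
        if disp >= moving_thresh:
            moving_frames += 1
    return speeds, moving_frames / fps
```

Converting pixel speeds to real-world units would additionally require a pixels-per-metre calibration of the camera view.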
Affiliation(s)
- Zhiyan Zhang: State Key Laboratory for Pig Genetic Improvement and Production Technology, Jiangxi Agricultural University, Nanchang 330045, China
3
Tagarakis AC, Bochtis D. Sensors and Robotics for Digital Agriculture. Sensors (Basel) 2023;23:7255. PMID: 37631794; PMCID: PMC10457808; DOI: 10.3390/s23167255. Received 08/15/2023; accepted 08/17/2023.
Abstract
The latest advances in innovative sensing and data technologies have led to an increasing implementation of autonomous systems in agricultural production processes [...].
Affiliation(s)
- Aristotelis C. Tagarakis: Institute for Bio-Economy and Agri-Technology (iBO), Centre for Research and Technology-Hellas (CERTH), 6th km Charilaou-Thermi Rd., 57001 Thessaloniki, Greece
4
Wang S, Jiang H, Qiao Y, Jiang S. A Method for Obtaining 3D Point Cloud Data by Combining 2D Image Segmentation and Depth Information of Pigs. Animals (Basel) 2023;13:2472. PMID: 37570282; PMCID: PMC10417003; DOI: 10.3390/ani13152472. Received 06/08/2023; revised 07/21/2023; accepted 07/25/2023.
Abstract
This paper proposes a method for automatic pig detection and segmentation using RGB-D data for precision livestock farming. The method combines an enhanced YOLOv5s model with the Res2Net bottleneck structure, improving fine-grained feature extraction and thus the precision of pig detection and segmentation in 2D images. It also makes acquiring 3D point cloud data of pigs simpler and more efficient: the pig mask obtained from 2D detection and segmentation is combined with the corresponding depth information. To evaluate the method, two datasets were constructed: the first consists of 5400 images captured in various pig pens under diverse lighting conditions, while the second is the publicly available Edinburgh pig behaviour dataset from the UK. The improved YOLOv5s_Res2Net achieved a mAP@0.5:0.95 of 89.6% and 84.8% for the pig detection and segmentation tasks, respectively, on our dataset, and 93.4% and 89.4% on the Edinburgh pig behaviour dataset. This approach provides valuable insights for improving pig management, conducting welfare assessments, and estimating weight accurately.
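The mask-plus-depth step described here corresponds to standard pinhole back-projection. The sketch below (illustrative, not the authors' implementation; the function name is hypothetical) lifts every masked depth pixel into 3D camera coordinates, given intrinsics fx, fy, cx, cy:

```python
def mask_depth_to_points(mask, depth, fx, fy, cx, cy):
    """Back-project masked depth pixels to 3D camera coordinates.

    mask:  2D grid of 0/1 (1 = pig pixel, e.g. from instance segmentation).
    depth: 2D grid of depth values in metres (0 or negative = invalid).
    fx, fy, cx, cy: pinhole camera intrinsics (focal lengths, principal point).
    Returns a list of (X, Y, Z) points.
    """
    points = []
    for v, (mrow, drow) in enumerate(zip(mask, depth)):
        for u, (m, z) in enumerate(zip(mrow, drow)):
            if m and z > 0:
                # Standard pinhole inverse projection of pixel (u, v) at depth z.
                points.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return points
```

In practice the same operation is done vectorized (e.g., with NumPy or a point-cloud library) and the intrinsics come from the RGB-D sensor's calibration.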
Affiliation(s)
- Shunli Wang: College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Honghua Jiang: College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Yongliang Qiao: Australian Institute for Machine Learning (AIML), The University of Adelaide, Adelaide, SA 5005, Australia
- Shuzhen Jiang: Key Laboratory of Efficient Utilisation of Non-Grain Feed Resources (Co-Construction by Ministry and Province), Ministry of Agriculture and Rural Affairs, Department of Animal Science and Technology, Shandong Agricultural University, Tai’an 271018, China
5
Li G, Shi G, Jiao J. YOLOv5-KCB: A New Method for Individual Pig Detection Using Optimized K-Means, CA Attention Mechanism and a Bi-Directional Feature Pyramid Network. Sensors (Basel) 2023;23:5242. PMID: 37299967; DOI: 10.3390/s23115242. Received 04/29/2023; revised 05/26/2023; accepted 05/29/2023.
Abstract
Individual identification of pigs is a critical component of intelligent pig farming. Traditional ear-tagging requires significant human resources and suffers from recognition difficulty and low accuracy. This paper proposes the YOLOv5-KCB algorithm for non-invasive identification of individual pigs. The algorithm uses two datasets, pig faces and pig necks, divided into nine categories; after data augmentation, the total sample size was 19,680. The distance metric for K-means anchor clustering is changed from the original Euclidean distance to 1 − IoU, which improves how well the model's target anchor boxes fit the data. The algorithm further evaluates the SE, CBAM, and CA attention mechanisms, selecting CA for its superior feature extraction, and evaluates CARAFE, ASFF, and BiFPN for feature fusion, selecting BiFPN for its superior improvement of the algorithm's detection ability. The experimental results indicate that YOLOv5-KCB achieved the highest accuracy in individual pig recognition, surpassing all other improved algorithms in average precision (IoU = 0.5). The accuracy was 98.4% for pig head-and-neck recognition and 95.1% for pig face recognition, improvements of 4.8% and 13.8% over the original YOLOv5. Notably, head-and-neck recognition was consistently more accurate than face recognition across all algorithms, with YOLOv5-KCB showing a 2.9% advantage. These results highlight the potential of YOLOv5-KCB for precise individual pig identification, facilitating subsequent intelligent management practices.
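The 1 − IoU clustering distance mentioned here is the classic anchor-box variant of K-means (replacing Euclidean distance so that large and small boxes are compared fairly). A minimal sketch, assuming boxes are compared as (width, height) pairs aligned at a common corner; names are hypothetical:

```python
def iou_dist(box, anchor):
    """1 - IoU between two (w, h) shapes aligned at a common corner.

    With both shapes sharing a corner, the intersection is simply
    min(w) * min(h), so no coordinates are needed.
    """
    w1, h1 = box
    w2, h2 = anchor
    inter = min(w1, w2) * min(h1, h2)
    union = w1 * h1 + w2 * h2 - inter
    return 1.0 - inter / union

def assign_to_anchors(boxes, anchors):
    """K-means assignment step: index of the nearest anchor under 1 - IoU."""
    return [min(range(len(anchors)), key=lambda k: iou_dist(b, anchors[k]))
            for b in boxes]
```

A full anchor-optimization loop would alternate this assignment step with recomputing each cluster's representative width/height until the assignments stabilize.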
Affiliation(s)
- Guangbo Li: School of Information and Computer, Anhui Agricultural University, Hefei 230036, China
- Guolong Shi: School of Information and Computer, Anhui Agricultural University, Hefei 230036, China; Key Laboratory of Agricultural Sensors, Ministry of Agriculture and Rural Affairs, Hefei 230036, China; Anhui Provincial Key Laboratory of Smart Agricultural Technology and Equipment, Hefei 230036, China
- Jun Jiao: School of Information and Computer, Anhui Agricultural University, Hefei 230036, China
6
Hao W, Zhang K, Zhang L, Han M, Hao W, Li F, Yang G. TSML: A New Pig Behavior Recognition Method Based on Two-Stream Mutual Learning Network. Sensors (Basel) 2023;23:5092. PMID: 37299818; DOI: 10.3390/s23115092. Received 04/22/2023; revised 05/18/2023; accepted 05/22/2023.
Abstract
Changes in pig behavior carry crucial information for the livestock breeding process, and automatic pig behavior recognition is a vital way to improve pig welfare. However, most existing approaches rely either on human observation, which is time-consuming and labor-intensive, or on deep learning models whose large parameter counts lead to slow training and low efficiency. To address these issues, this paper proposes a novel two-stream pig behavior recognition approach enhanced by deep mutual learning. The model consists of two mutually learning branches: an RGB (red-green-blue) stream and an optical-flow stream. Each branch contains two student networks that learn collaboratively to extract robust and rich appearance or motion features, leading to improved recognition of pig behaviors. Finally, the results of the RGB and flow branches are weighted and fused to further improve performance. Experimental results demonstrate the effectiveness of the model, which achieves state-of-the-art recognition accuracy of 96.52%, surpassing other models by 2.71%.
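The final weighting-and-fusion step can be sketched as a simple late fusion of per-class scores from the two branches. This is an illustrative reconstruction, not the TSML code; the fusion weight is a hypothetical parameter:

```python
def fuse_two_streams(rgb_scores, flow_scores, w_rgb=0.5):
    """Weighted late fusion of per-class scores from an RGB branch and a
    flow branch; returns the predicted class index and the fused scores.

    rgb_scores / flow_scores: equal-length lists of per-class confidences.
    w_rgb: fusion weight for the RGB branch (flow gets 1 - w_rgb).
    """
    fused = [w_rgb * r + (1.0 - w_rgb) * f
             for r, f in zip(rgb_scores, flow_scores)]
    label = max(range(len(fused)), key=fused.__getitem__)
    return label, fused
```

In a real system the weight would be tuned on a validation set; the appearance (RGB) and motion (flow) branches often disagree exactly on the motion-dominated behaviors, which is where fusion helps.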
Affiliation(s)
- Wangli Hao: School of Software, Shanxi Agricultural University, Jinzhong 030801, China
- Kai Zhang: School of Software, Shanxi Agricultural University, Jinzhong 030801, China
- Li Zhang: School of Software, Shanxi Agricultural University, Jinzhong 030801, China
- Meng Han: School of Software, Shanxi Agricultural University, Jinzhong 030801, China
- Wangbao Hao: Yuncheng National Jinnan Cattle Genetic Resources and Gene Protection Center, Yongji 044099, China
- Fuzhong Li: School of Software, Shanxi Agricultural University, Jinzhong 030801, China
- Guoqiang Yang: School of Software, Shanxi Agricultural University, Jinzhong 030801, China
7
Wang S, Jiang H, Qiao Y, Jiang S, Lin H, Sun Q. The Research Progress of Vision-Based Artificial Intelligence in Smart Pig Farming. Sensors (Basel) 2022;22:6541. PMID: 36080994; PMCID: PMC9460267; DOI: 10.3390/s22176541. Received 07/27/2022; revised 08/22/2022; accepted 08/27/2022.
Abstract
Pork accounts for an important proportion of livestock products, and monitoring pig health and welfare requires substantial manpower, material resources, and time. As the number of pigs per farm increases, continued use of traditional monitoring methods may stress and harm both pigs and farmers and hurt pig health and welfare as well as the farm's economic output. The application of artificial intelligence has therefore become a core part of smart pig farming. A precision pig farming system uses sensors such as cameras and radio-frequency identification to monitor biometric information such as pig sounds and behavior in real time and convert it into key indicators of pig health and welfare. By analyzing these indicators, problems in pig health and welfare can be detected early, and timely intervention and treatment can be provided, helping to improve the production and economic efficiency of pig farming. This paper reviews more than 150 papers on precision pig farming, summarizing and evaluating the application of artificial intelligence technologies to pig detection, tracking, behavior recognition, and sound recognition. Finally, we summarize and discuss the opportunities and challenges of precision pig farming.
Affiliation(s)
- Shunli Wang: College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Honghua Jiang: College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Yongliang Qiao (corresponding author): Australian Centre for Field Robotics (ACFR), Faculty of Engineering, The University of Sydney, Sydney, NSW 2006, Australia
- Shuzhen Jiang: College of Animal Science and Veterinary Medicine, Shandong Agricultural University, Tai’an 271018, China
- Huaiqin Lin: College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Qian Sun: College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
8
Morrone S, Dimauro C, Gambella F, Cappai MG. Industry 4.0 and Precision Livestock Farming (PLF): An Up to Date Overview across Animal Productions. Sensors (Basel) 2022;22:4319. PMID: 35746102; PMCID: PMC9228240; DOI: 10.3390/s22124319. Received 05/04/2022; revised 05/30/2022; accepted 05/31/2022.
Abstract
Precision livestock farming (PLF) has spread to various countries worldwide since its inception in 2003, though it has yet to be widely adopted, and the advent of Industry 4.0 and the Internet of Things (IoT) has enabled its continued advancement and development. This modern technological approach to animal farming and production encompasses ethical, economic, and logistical aspects. The aim of this review is to provide an overview of PLF and Industry 4.0, to identify current applications of this rather novel approach in different farming systems for food-producing animals, and to present up-to-date knowledge on the subject. The current scientific literature on the spread and application of PLF and IoT shows how much more efficient farm-animal management systems are set to become. Everyday farming practices (feeding and production performance) coupled with continuous, real-time monitoring of animal parameters can have significant impacts on welfare and health assessment, both current themes of public interest. In the context of feeding a rising global population, the agri-food industry and Industry 4.0 technologies may represent key features for successful and sustainable development.
Affiliation(s)
- Sarah Morrone: Department of Veterinary Medicine, University of Sassari, 07100 Sassari, Italy
- Corrado Dimauro: Research Unit of Animal Breeding Sciences, Department of Agriculture, University of Sassari, 07100 Sassari, Italy
- Filippo Gambella: Research Unit of Agriculture Mechanics, Department of Agriculture, University of Sassari, 07100 Sassari, Italy
- Maria Grazia Cappai: Department of Veterinary Medicine, University of Sassari, 07100 Sassari, Italy
9
Fang C, Zheng H, Yang J, Deng H, Zhang T. Study on Poultry Pose Estimation Based on Multi-Parts Detection. Animals (Basel) 2022;12:1322. PMID: 35625168; PMCID: PMC9137532; DOI: 10.3390/ani12101322. Received 04/03/2022; revised 05/15/2022; accepted 05/19/2022.
Simple Summary
Poultry farming is an important part of China’s agriculture system. Automatic estimation of poultry posture can help to analyze the movement, behavior, and even health of poultry. In this study, a poultry pose-estimation system was designed that automatically estimates the pose of a single broiler chicken using a multi-part detection method. The experimental results show that the method obtains good pose-estimation results for a single broiler chicken with respect to precision, recall, and F1 score, providing a new tool for poultry pose and behavior researchers.
Abstract
Poultry pose estimation is a prerequisite for evaluating abnormal behavior and predicting disease in poultry, and accurate pose estimation enables producers to manage their flocks better. Because chickens are group-fed, achieving automatic poultry pose recognition remains a sticking point for accurate monitoring on large-scale farms. To this end, based on computer vision technology, this paper uses a deep neural network (DNN) to estimate the posture of a single broiler chicken, comparing the pose detection results with the Single Shot MultiBox Detector (SSD), You Only Look Once (YOLOv3), RetinaNet, and Faster R-CNN algorithms. Preliminary tests show that the proposed method achieves a precision of 0.9218 ± 0.0048 (95% confidence interval; standard deviation 0.0128) and a recall of 0.8996 ± 0.0099 (95% confidence interval; standard deviation 0.0266). Successfully estimating the pose of broiler chickens facilitates the detection of abnormal poultry behavior, and the method can be further improved to increase the overall verification success rate.
Affiliation(s)
- Cheng Fang: College of Engineering, South China Agricultural University, 483 Wushan Road, Guangzhou 510642, China
- Haikun Zheng: College of Engineering, South China Agricultural University, 483 Wushan Road, Guangzhou 510642, China
- Jikang Yang: College of Engineering, South China Agricultural University, 483 Wushan Road, Guangzhou 510642, China
- Hongfeng Deng: College of Engineering, South China Agricultural University, 483 Wushan Road, Guangzhou 510642, China
- Tiemin Zhang (corresponding author): College of Engineering, South China Agricultural University, 483 Wushan Road, Guangzhou 510642, China; National Engineering Research Center for Breeding Swine Industry, Guangzhou 510642, China; Guangdong Laboratory for Lingnan Modern Agriculture, Guangzhou 510642, China