1
Millburn S, Schmidt T, Rohrer GA, Mote B. Identifying Early-Life Behavior to Predict Mothering Ability in Swine Utilizing NUtrack System. Animals (Basel) 2023; 13:2897. [PMID: 37760297; PMCID: PMC10525931; DOI: 10.3390/ani13182897]
Abstract
Early indicator traits for swine reproduction and longevity support economical selection decision-making. Activity is a key variable impacting a sow's herd life and productivity. Early-life activities could contribute to farrowing traits including gestation length (GL), number born alive (NBA), and number weaned (NW). Beginning at 20 weeks of age, 480 gilts were video recorded for 7 consecutive days and processed using the NUtrack system. Activity traits included angle rotated (radians), average speed (m/s), distance traveled (m), time spent eating (s), lying lateral (s), lying sternal (s), standing (s), and sitting (s). Final daily activity values were averaged across the period under cameras. Parity one data were collected for all gilts considered. Data were analyzed using linear regression models (R version 4.0.2). GL was significantly impacted by angle rotated (p = 0.03), average speed (p = 0.07), distance traveled (p = 0.05), time spent lying lateral (p = 0.003), and lying sternal (p = 0.02). NBA was significantly impacted by time spent lying lateral (p = 0.01), lying sternal (p = 0.07), and time spent sitting (p = 0.08). NW was significantly impacted by time spent eating (p = 0.09), time spent lying lateral (p = 0.04), and time spent sitting (p = 0.007). This analysis suggests that early-life gilt activities are associated with sow productivity traits of importance. Further examination of the link between behaviors compiled utilizing NUtrack and reproductive traits is needed to further isolate behavioral differences for potential use in selection decisions.
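The models above were fit in R with `lm`; as a minimal sketch of the idea, the snippet below regresses gestation length on a single early-life activity trait using closed-form ordinary least squares. The data and the `ols_fit` helper are invented for illustration and are not the authors' code or results.

```python
# Illustrative only: regress gestation length (GL) on one activity trait.
# All numbers are made up; the study fit these models in R (lm).

def ols_fit(x, y):
    """Closed-form simple linear regression; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

# Hypothetical gilts: mean daily distance traveled (m) vs. gestation length (d)
distance = [180.0, 200.0, 220.0, 240.0, 260.0]
gl = [115.2, 115.0, 114.9, 114.7, 114.6]

intercept, slope = ols_fit(distance, gl)
print(f"GL ~ {intercept:.2f} + {slope:.4f} * distance")
```

A real analysis would also report the p-values the abstract cites, which requires the residual standard error and a t-distribution.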
Affiliation(s)
- Savannah Millburn
  - Department of Animal Science, University of Nebraska-Lincoln, Lincoln, NE 68588, USA
- Ty Schmidt
  - Department of Animal Science, University of Nebraska-Lincoln, Lincoln, NE 68588, USA
- Gary A. Rohrer
  - United States Meat Animal Research Center, United States Department of Agriculture-Agricultural Research Service, Clay Center, NE 68933, USA
- Benny Mote
  - Department of Animal Science, University of Nebraska-Lincoln, Lincoln, NE 68588, USA
2
Li G, Shi G, Jiao J. YOLOv5-KCB: A New Method for Individual Pig Detection Using Optimized K-Means, CA Attention Mechanism and a Bi-Directional Feature Pyramid Network. Sensors (Basel) 2023; 23:5242. [PMID: 37299967; DOI: 10.3390/s23115242]
Abstract
Individual identification of pigs is a critical component of intelligent pig farming. Traditional pig ear-tagging requires significant human resources and suffers from issues such as difficulty in recognition and low accuracy. This paper proposes the YOLOv5-KCB algorithm for non-invasive identification of individual pigs. Specifically, the algorithm utilizes two datasets (pig faces and pig necks), which are divided into nine categories. Following data augmentation, the total sample size grew to 19,680. The distance metric used for K-means clustering is changed from that of the original algorithm to 1-IOU, which improves the adaptability of the model's target anchor boxes. Furthermore, the algorithm introduces SE, CBAM, and CA attention mechanisms, with the CA attention mechanism being selected for its superior performance in feature extraction. Finally, CARAFE, ASFF, and BiFPN are used for feature fusion, with BiFPN selected for its superior performance in improving the detection ability of the algorithm. The experimental results indicate that the YOLOv5-KCB algorithm achieved the highest accuracy rates in pig individual recognition, surpassing all other improved algorithms in average accuracy rate (IOU = 0.5). The accuracy rate of pig head and neck recognition was 98.4%, while the accuracy rate for pig face recognition was 95.1%, improvements of 4.8% and 13.8%, respectively, over the original YOLOv5 algorithm. Notably, the average accuracy rate of identifying pig head and neck was consistently higher than that of pig face recognition across all algorithms, with YOLOv5-KCB demonstrating an impressive 2.9% improvement. These results emphasize the potential of the YOLOv5-KCB algorithm for precise individual pig identification, facilitating subsequent intelligent management practices.
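The anchor-box change the abstract describes, replacing the standard K-means distance with 1 - IOU, can be sketched as below. Boxes are (width, height) pairs assumed to share a corner so IoU depends only on shape; the toy boxes and k = 2 are assumptions for illustration, not the paper's settings.

```python
# Sketch of K-means anchor clustering with a 1 - IoU distance.
# Toy data; not the YOLOv5-KCB implementation.

def iou(box_a, box_b):
    """IoU of two (w, h) boxes assumed to share a corner."""
    wa, ha = box_a
    wb, hb = box_b
    inter = min(wa, wb) * min(ha, hb)
    return inter / (wa * ha + wb * hb - inter)

def kmeans_iou(boxes, centroids, iters=10):
    for _ in range(iters):
        # Assign each box to the centroid with the smallest 1 - IoU distance
        clusters = {i: [] for i in range(len(centroids))}
        for b in boxes:
            i = min(range(len(centroids)), key=lambda i: 1 - iou(b, centroids[i]))
            clusters[i].append(b)
        # Update each centroid as the per-cluster mean width/height
        for i, members in clusters.items():
            if members:
                centroids[i] = (
                    sum(w for w, _ in members) / len(members),
                    sum(h for _, h in members) / len(members),
                )
    return centroids

boxes = [(10, 12), (11, 13), (40, 38), (42, 41)]
anchors = kmeans_iou(boxes, centroids=[(10, 10), (40, 40)])
print(anchors)
```

The shape-based distance keeps large and small anchor boxes from being lumped together the way plain Euclidean distance on (w, h) can.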
Affiliation(s)
- Guangbo Li
  - School of Information and Computer, Anhui Agricultural University, Hefei 230036, China
- Guolong Shi
  - School of Information and Computer, Anhui Agricultural University, Hefei 230036, China
  - Key Laboratory of Agricultural Sensors, Ministry of Agriculture and Rural Affairs, Hefei 230036, China
  - Anhui Provincial Key Laboratory of Smart Agricultural Technology and Equipment, Hefei 230036, China
- Jun Jiao
  - School of Information and Computer, Anhui Agricultural University, Hefei 230036, China
3
Obermier D, Trenahile-Grannemann M, Schmidt T, Rathje T, Mote B. Utilizing NUtrack to Assess the Activity Levels in Pigs with Varying Degrees of Genetic Potential for Growth and Feed Intake. Animals (Basel) 2023; 13:1581. [PMID: 37238011; DOI: 10.3390/ani13101581]
Abstract
Feed cost accounts for over two-thirds of the variable cost of production. In order to reduce feed costs without sacrificing production numbers, feed efficiency must be improved. Calorie expenditure has been difficult to quantify in the past but is understood to greatly impact residual feed intake (RFI). The objective of this work was to utilize an advanced computer vision system to evaluate activity levels across sex and sire groups with different expected breeding value combinations for growth and feed intake. A total of 199 pigs from four different sire groups (DNA Genetics Line 600): High Feed Intake/High Growth (HIHG), Low Feed Intake/High Growth (LIHG), High Feed Intake/Low Growth (HILG), and Low Feed Intake/Low Growth (LILG) were utilized at the UNL ENREC farm over 127 days. The NUtrack system allowed for individual monitoring of pigs in group housing to track daily activity traits. Overall, HIHG pigs travelled less (p < 0.05; 139 vs. 150 km), spent more time lying (p < 0.05; 2421 vs. 2391 h), and spent less time eating (p < 0.05; 235 vs. 243 h) when compared to LILG pigs across time. The results suggest variation in activity occurs across the progeny of sire groups selected to differ in growth and feed intake.
Affiliation(s)
- Dalton Obermier
  - Department of Animal Science, University of Nebraska-Lincoln, Lincoln, NE 68588, USA
- Ty Schmidt
  - Department of Animal Science, University of Nebraska-Lincoln, Lincoln, NE 68588, USA
- Tom Rathje
  - DNA Swine Genetics, 2415 13th Street, Columbus, NE 68601, USA
- Benny Mote
  - Department of Animal Science, University of Nebraska-Lincoln, Lincoln, NE 68588, USA
4
Bortoluzzi EM, Goering MJ, Ochoa SJ, Holliday AJ, Mumm JM, Nelson CE, Wu H, Mote BE, Psota ET, Schmidt TB, Jaberi-Douraki M, Hulbert LE. Evaluation of Precision Livestock Technology and Human Scoring of Nursery Pigs in a Controlled Immune Challenge Experiment. Animals (Basel) 2023; 13:246. [PMID: 36670787; PMCID: PMC9854951; DOI: 10.3390/ani13020246]
Abstract
The objectives were to determine the sensitivity, specificity, and cutoff values of a visual-based precision livestock technology (NUtrack), and to determine the sensitivity and specificity of sickness scores collected via live observation by trained human observers. At weaning, pigs (n = 192; gilts and barrows) were randomly assigned to one of twelve pens (16/pen), and treatments were randomly assigned to pens. Sham-pen pigs all received subcutaneous saline (3 mL). For LPS-pen pigs, all pigs received subcutaneous lipopolysaccharide (LPS; 300 μg/kg BW; E. coli O111:B4; in 3 mL of saline). For the last treatment, eight pigs were randomly assigned to receive LPS and the other eight received saline (same methods as above; half-and-half pens). Human data from the day of the challenge presented high true positive and low false positive rates (88.5% sensitivity; 85.4% specificity; 0.871 area under the curve, AUC); however, these values declined when half-and-half pigs were scored (75% sensitivity; 65.5% specificity; 0.703 AUC). Precision technology measures had excellent AUC, sensitivity, and specificity for the first 72 h after treatment, with AUC values >0.970 regardless of pen treatment. These results indicate that precision technology has greater potential for identifying pigs during a natural infectious disease event than trained professionals using timepoint sampling.
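The screening metrics this study relies on (sensitivity, specificity, ROC AUC) can be computed directly from labels and scores. The labels and scores below are invented for illustration; `sens_spec` and `auc` are hypothetical helpers, not the authors' analysis code.

```python
# Sketch of the metrics used to compare NUtrack scores to human scoring.
# Toy labels/scores only.

def sens_spec(labels, preds):
    """Sensitivity (TPR) and specificity (TNR) from binary labels/predictions."""
    tp = sum(1 for y, p in zip(labels, preds) if y == 1 and p == 1)
    tn = sum(1 for y, p in zip(labels, preds) if y == 0 and p == 0)
    fp = sum(1 for y, p in zip(labels, preds) if y == 0 and p == 1)
    fn = sum(1 for y, p in zip(labels, preds) if y == 1 and p == 0)
    return tp / (tp + fn), tn / (tn + fp)

def auc(labels, scores):
    """Rank-based AUC: P(random positive scores above random negative)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]                 # 1 = LPS-challenged (toy)
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]     # e.g., an activity-based sickness score
preds = [1 if s >= 0.5 else 0 for s in scores]
se, sp = sens_spec(labels, preds)
print(f"sensitivity={se:.2f} specificity={sp:.2f} AUC={auc(labels, scores):.3f}")
```

Sweeping the 0.5 threshold over all score values is how the cutoff values the abstract mentions would be chosen.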
Affiliation(s)
- Eduarda M. Bortoluzzi
  - Department of Animal Sciences and Industry, Kansas State University, Manhattan, KS 66506, USA
- Mikayla J. Goering
  - Department of Animal Sciences and Industry, Kansas State University, Manhattan, KS 66506, USA
- Sara J. Ochoa
  - Department of Animal Sciences and Industry, Kansas State University, Manhattan, KS 66506, USA
- Aaron J. Holliday
  - Department of Animal Science, University of Nebraska-Lincoln, Lincoln, NE 68505, USA
- Jared M. Mumm
  - Department of Animal Sciences and Industry, Kansas State University, Manhattan, KS 66506, USA
- Catherine E. Nelson
  - Department of Animal Sciences and Industry, Kansas State University, Manhattan, KS 66506, USA
- Hui Wu
  - Department of Statistics, Kansas State University, Manhattan, KS 66506, USA
- Benny E. Mote
  - Department of Animal Science, University of Nebraska-Lincoln, Lincoln, NE 68505, USA
- Eric T. Psota
  - Department of Electrical and Computer Engineering, University of Nebraska-Lincoln, Lincoln, NE 68588, USA
- Ty B. Schmidt
  - Department of Animal Science, University of Nebraska-Lincoln, Lincoln, NE 68505, USA
- Majid Jaberi-Douraki
  - Department of Statistics, Kansas State University, Manhattan, KS 66506, USA
  - Department of Mathematics, Kansas State University, Manhattan, KS 66506, USA
  - 1-DATA, Kansas State University Olathe, Olathe, KS 66061, USA
- Lindsey E. Hulbert
  - Department of Animal Sciences and Industry, Kansas State University, Manhattan, KS 66506, USA
  - Correspondence: ; Tel.: +1-785-477-2904
5
Bresolin T, Ferreira R, Reyes F, Van Os J, Dórea J. Assessing optimal frequency for image acquisition in computer vision systems developed to monitor feeding behavior of group-housed Holstein heifers. J Dairy Sci 2022; 106:664-675. [DOI: 10.3168/jds.2022-22138]
6
Son S, Ahn H, Baek H, Yu S, Suh Y, Lee S, Chung Y, Park D. StaticPigDet: Accuracy Improvement of Static Camera-Based Pig Monitoring Using Background and Facility Information. Sensors (Basel) 2022; 22:8315. [PMID: 36366013; PMCID: PMC9655159; DOI: 10.3390/s22218315]
Abstract
The automatic detection of individual pigs can improve the overall management of pig farms. The accuracy of single-image object detection has significantly improved over the years with advancements in deep learning techniques. However, differences in pig sizes and complex structures within the pig pens of a commercial pig farm, such as feeding facilities, present challenges to detection accuracy for pig monitoring. To implement such detection in practice, these differences should be analyzed using video recorded from a static camera. To accurately detect individual pigs that may differ in size or be occluded by complex structures, we present a deep-learning-based object detection method utilizing background and facility information generated from image sequences (i.e., video) recorded from a static camera. First, all images are preprocessed to reduce differences in pig sizes. We then use the extracted background and facility information to create different combinations of gray images. Finally, these images are combined into different combinations of three-channel composite images, which are used as training datasets to improve detection accuracy. Using the proposed method as a component of image processing improved overall accuracy from 84% to 94%. The study showed that accurate facility and background images could be generated through long-term updating, which helped detection accuracy. Future work could also address detection accuracy for overlapping pigs.
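The composite-input idea above, stacking the current gray frame, a generated background image, and a facility-information image as the three channels of one input, can be sketched with toy 2x2 "images". A real pipeline would use NumPy/OpenCV arrays; the values and the `compose` helper here are purely illustrative.

```python
# Sketch of building a three-channel composite from single-channel inputs.
# Toy 2x2 values; not the paper's preprocessing code.

def compose(frame, background, facility):
    """Zip three single-channel images (lists of rows) into one 3-channel image."""
    return [
        [(f, b, s) for f, b, s in zip(fr, br, sr)]
        for fr, br, sr in zip(frame, background, facility)
    ]

frame      = [[120, 130], [125, 128]]   # current gray frame
background = [[118, 129], [124, 127]]   # long-term background estimate
facility   = [[255,   0], [  0, 255]]   # mask of fixed pen structures
composite = compose(frame, background, facility)
print(composite[0][0])
```

The detector then trains on these composites, so each pixel carries both appearance and scene-context information.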
Affiliation(s)
- Seungwook Son
  - Department of Computer Convergence Software, Korea University, Sejong 30019, Korea
- Hanse Ahn
  - Department of Computer Convergence Software, Korea University, Sejong 30019, Korea
- Hwapyeong Baek
  - Department of Computer Convergence Software, Korea University, Sejong 30019, Korea
- Seunghyun Yu
  - Department of Computer Convergence Software, Korea University, Sejong 30019, Korea
- Yooil Suh
  - Info Valley Korea Co., Ltd., Anyang 14067, Korea
- Sungju Lee
  - Department of Software, Sangmyung University, Cheonan 31066, Korea
- Yongwha Chung
  - Department of Computer Convergence Software, Korea University, Sejong 30019, Korea
- Daihee Park
  - Department of Computer Convergence Software, Korea University, Sejong 30019, Korea
7
Lu J, Wang W, Zhao K, Wang H. Recognition and segmentation of individual pigs based on Swin Transformer. Anim Genet 2022; 53:794-802. [PMID: 36146894; DOI: 10.1111/age.13259]
Abstract
Recognition of individual pigs is critical to the monitoring of pig body size and physiological health status on large-scale pig farms. In this study, deep learning methods were introduced for the intelligent recognition and segmentation of individual replacement pigs, enabling non-contact surveillance of each pig. Swin Transformer was used for the recognition and segmentation of individual pigs based on the surveillance data, and different models were compared to find the model with the fastest training speed and most accurate results. Finally, a recognition accuracy of 93.0% and a segmentation accuracy of 86.9% for individual pigs were achieved with the surveillance video images of pigs based on Swin Transformer. Even in complex scenarios such as overlapping, occlusion, and deformation, the method still exhibited excellent recognition performance for replacement pigs. Importantly, this method can greatly reduce labor, support intelligent and unmanned pig production, and facilitate the modernization of the pig industry.
Affiliation(s)
- Jisheng Lu
  - Key Laboratory of Smart Farming for Agricultural Animals, Ministry of Agriculture and Rural Affairs, College of Informatics, Huazhong Agricultural University, Wuhan, China
  - Shenzhen Institute of Nutrition and Health, Huazhong Agricultural University, Wuhan, China
- Wei Wang
  - Key Laboratory of Smart Farming for Agricultural Animals, Ministry of Agriculture and Rural Affairs, College of Informatics, Huazhong Agricultural University, Wuhan, China
  - Shenzhen Institute of Nutrition and Health, Huazhong Agricultural University, Wuhan, China
- Kun Zhao
  - Information Technology Center, Huazhong Agricultural University, Wuhan, China
- Haiyan Wang
  - Key Laboratory of Smart Farming for Agricultural Animals, Ministry of Agriculture and Rural Affairs, College of Informatics, Huazhong Agricultural University, Wuhan, China
  - Shenzhen Institute of Nutrition and Health, Huazhong Agricultural University, Wuhan, China
8
Gorssen W, Winters C, Meyermans R, D’Hooge R, Janssens S, Buys N. Estimating genetics of body dimensions and activity levels in pigs using automated pose estimation. Sci Rep 2022; 12:15384. [PMID: 36100692; PMCID: PMC9470733; DOI: 10.1038/s41598-022-19721-4]
Abstract
Pig breeding is changing rapidly due to technological progress and socio-ecological factors. New precision livestock farming technologies such as computer vision systems are crucial for automated phenotyping on a large scale for novel traits, as pigs’ robustness and behavior are gaining importance in breeding goals. However, individual identification, data processing and the availability of adequate (open source) software currently pose the main hurdles. The overall goal of this study was to expand pig weighing with automated measurements of body dimensions and activity levels using an automated video-analytic system: DeepLabCut. Furthermore, these data were coupled with pedigree information to estimate genetic parameters for breeding programs. We analyzed 7428 recordings over the fattening period of 1556 finishing pigs (Piétrain sire × crossbred dam) with two-week intervals between recordings on the same pig. We were able to accurately estimate relevant body parts with an average tracking error of 3.3 cm. Body metrics extracted from video images were highly heritable (61–74%) and significantly genetically correlated with average daily gain (rg = 0.81–0.92). Activity traits were low to moderately heritable (22–35%) and showed low genetic correlations with production traits and physical abnormalities. We demonstrated a simple and cost-efficient method to extract body dimension parameters and activity traits. These traits were estimated to be heritable, and hence can be selected on. These findings are valuable for (pig) breeding organizations, as they offer a method to automatically phenotype new production and behavioral traits on an individual level.
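Turning pose-estimation output into a body-dimension phenotype, the first step of the pipeline above, reduces to geometry on keypoints. The keypoint names (`shoulder`, `ham`) and the pixel-to-cm calibration factor below are assumptions for illustration, not values from the study.

```python
# Sketch: body length as the distance between two estimated keypoints,
# scaled by an (assumed) camera calibration factor.
import math

CM_PER_PX = 0.5  # hypothetical pixel-to-cm conversion from camera calibration

def body_length_cm(keypoints):
    """keypoints: {"shoulder": (x, y), "ham": (x, y)} in pixels."""
    return math.dist(keypoints["shoulder"], keypoints["ham"]) * CM_PER_PX

kp = {"shoulder": (100.0, 200.0), "ham": (260.0, 200.0)}
print(body_length_cm(kp))
```

Repeating this per recording gives the longitudinal body metrics that were then coupled with pedigree data for heritability estimation.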
9
Schmidt TB, Lancaster JM, Psota E, Mote BE, Hulbert LE, Holliday A, Woiwode R, Pérez LC. Evaluation of a novel computer vision-based livestock monitoring system to identify and track specific behaviors of individual nursery pigs within a group-housed environment. Transl Anim Sci 2022; 6:txac082. [PMID: 35875422; PMCID: PMC9298813; DOI: 10.1093/tas/txac082]
Abstract
Animal behavior is indicative of health status, and changes in behavior can indicate health issues (i.e., illness, stress, or injury). Currently, human observation (HO) is the only method for detecting behavior changes that may indicate problems in group-housed pigs. While HO is effective, it has limitations: it is time-consuming, it obfuscates natural behaviors, and it cannot be maintained continuously. To address these limitations, a computer vision platform (NUtrack) was developed to identify (ID) and continuously monitor specific behaviors of group-housed pigs on an individual basis. The objectives of this study were to evaluate the capabilities of the NUtrack system and evaluate changes in behavior patterns over time of group-housed nursery pigs. The NUtrack system was installed above four nursery pens to monitor the behavior of 28 newly weaned pigs during a 42-d nursery period. Pigs were stratified by sex and litter and randomly assigned to one of two pens (14 pigs/pen) for the first 22 d. On day 23, pigs were split into four pens (7 pigs/pen). To evaluate the NUtrack system’s capabilities, 800 video frames containing 11,200 individual observations were randomly selected across the nursery period. Each frame was visually evaluated to verify the NUtrack system’s accuracy for ID and classification of behavior. The NUtrack system achieved an overall accuracy for ID of 95.6%. This accuracy was 93.5% during the first 22 d and increased (P < 0.001) to 98.2% for the final 20 d. Of the ID errors, 72.2% were due to mislabeled ID and 27.8% were due to loss of ID. The NUtrack system classified lying, standing, walking, at the feeder (ATF), and at the waterer (ATW) behaviors accurately at rates of 98.7%, 89.7%, 88.5%, 95.6%, and 79.9%, respectively. Behavior data indicated that the time budget for lying, standing, and walking in nursery pigs was 77.7% ± 1.6%, 8.5% ± 1.1%, and 2.9% ± 0.4%, respectively. In addition, behavior data indicated that nursery pigs spent 9.9% ± 1.7% and 1.0% ± 0.3% of their time ATF and ATW, respectively. Results suggest that the NUtrack system can detect, identify, maintain ID, and classify specific behavior of group-housed nursery pigs for the duration of the 42-d nursery period. Overall, results suggest that, with continued research, the NUtrack system may provide a viable real-time precision livestock tool with the ability to assist producers in monitoring behaviors and potential changes in the behavior of group-housed pigs.
Affiliation(s)
- Ty B Schmidt
  - Department of Animal Science, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Jessica M Lancaster
  - Department of Animal Science, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Eric Psota
  - Electrical and Computer Engineering, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Benny E Mote
  - Department of Animal Science, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Lindsey E Hulbert
  - Animal Science and Industry, Kansas State University, Manhattan, KS 66506, USA
- Aaron Holliday
  - Department of Animal Science, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Ruth Woiwode
  - Department of Animal Science, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Lance C Pérez
  - College of Engineering, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
10
Handa D, Peschel JM. A Review of Monitoring Techniques for Livestock Respiration and Sounds. Front Anim Sci 2022. [DOI: 10.3389/fanim.2022.904834]
Abstract
This article reviews the different techniques used to monitor the respiration and sounds of livestock. Livestock respiration is commonly assessed visually by observing abdomen fluctuation; however, such traditional methods are time-consuming and subjective, making them impractical for large-scale operations, which must therefore rely on automation. Contact and non-contact technologies are used to automatically monitor respiration rate; contact technologies (e.g., accelerometers, pressure sensors, and thermistors) utilize sensors that are physically mounted on livestock, while non-contact technologies (e.g., computer vision, thermography, and sound analysis) enable non-invasive monitoring of respiration. This work summarizes the advantages and disadvantages of contact and non-contact technologies and discusses the emerging role of non-contact sensors in automating monitoring for large-scale farming operations. This work is the first in-depth examination of automated monitoring technologies for livestock respiratory diseases; the findings and recommendations are important for livestock researchers and practitioners, who can gain a better understanding of these different technologies, especially emerging non-contact sensing.
11
A Large-Scale Mouse Pose Dataset for Mouse Pose Estimation. Symmetry (Basel) 2022. [DOI: 10.3390/sym14050875]
Abstract
Mouse pose estimation has important applications in animal behavior research, biomedicine, and animal conservation studies. Accurate and efficient mouse pose estimation using computer vision is therefore necessary. Although methods for mouse pose estimation have been developed, bottlenecks still exist. One of the most prominent problems is the lack of uniform and standardized training datasets. Here, we resolve this difficulty by introducing a mouse pose dataset. Our mouse pose dataset contains 40,000 frames of RGB images and large-scale 2D ground-truth motion images. All the images were captured from interacting lab mice through a stable single viewpoint, including 5 distinct species and 20 mice in total. Moreover, to improve annotation efficiency, five mouse keypoints are proposed: one at the center and two symmetric pairs. We then created simple yet effective software for annotating images. As a further contribution, we establish a benchmark model for 2D mouse pose estimation, employing modified object detection and pose estimation algorithms to achieve precise, effective, and robust performance. As the first large, standardized mouse pose dataset, our proposed dataset will help advance research on animal pose estimation and assist application areas related to animal experiments.
12
Kim J, Suh Y, Lee J, Chae H, Ahn H, Chung Y, Park D. EmbeddedPigCount: Pig Counting with Video Object Detection and Tracking on an Embedded Board. Sensors (Basel) 2022; 22:2689. [PMID: 35408302; PMCID: PMC9002707; DOI: 10.3390/s22072689]
Abstract
Knowing the number of pigs on a large-scale pig farm is an important issue for efficient farm management. However, counting the number of pigs accurately is difficult for humans because pigs do not obediently stop or slow down for counting. In this study, we propose a camera-based automatic method to count the number of pigs passing through a counting zone. That is, using a camera in a hallway, our deep-learning-based video object detection and tracking method analyzes video streams and counts the number of pigs passing through the counting zone. Furthermore, to execute the counting method in real time on a low-cost embedded board, we consider the tradeoff between accuracy and execution time, which has not yet been reported for pig counting. Our experimental results on an NVIDIA Jetson Nano embedded board show that this “light-weight” method is effective for counting the passing-through pigs, in terms of both accuracy (i.e., 99.44%) and execution time (i.e., real-time execution), even when some pigs pass through the counting zone back and forth.
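The counting-zone logic described above, counting tracked pigs as they pass a virtual line while letting back-and-forth passes cancel out, can be sketched as follows. The track IDs, x-positions, and line coordinate are invented for illustration; this is the idea, not the paper's implementation.

```python
# Sketch of directional line-crossing counting from per-track positions.
# Toy tracks; not the EmbeddedPigCount code.

LINE_X = 100  # x-coordinate of the virtual counting line (assumed)

def count_passes(tracks):
    """tracks: {track_id: [x0, x1, ...]} -> net count of left-to-right passes."""
    net = 0
    for xs in tracks.values():
        for prev, cur in zip(xs, xs[1:]):
            if prev < LINE_X <= cur:
                net += 1          # crossed left-to-right
            elif cur < LINE_X <= prev:
                net -= 1          # came back: cancel the pass
    return net

tracks = {
    1: [80, 95, 110, 130],        # crosses once
    2: [90, 105, 98, 120],        # crosses, returns, crosses again: net +1
}
print(count_passes(tracks))
```

Because the count is keyed to tracked identities, a pig lingering on the line or reversing direction is not double-counted, which is the situation the abstract highlights.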
Affiliation(s)
- Jonggwan Kim
  - Info Valley Korea Co., Ltd., Anyang-si 14067, Korea
- Yooil Suh
  - Info Valley Korea Co., Ltd., Anyang-si 14067, Korea
- Junhee Lee
  - Info Valley Korea Co., Ltd., Anyang-si 14067, Korea
- Heechan Chae
  - Info Valley Korea Co., Ltd., Anyang-si 14067, Korea
- Hanse Ahn
  - Department of Computer Convergence Software, Korea University, Sejong 30019, Korea
- Yongwha Chung
  - Department of Computer Convergence Software, Korea University, Sejong 30019, Korea
  - Correspondence: ; Tel.: +82-44-860-1343
- Daihee Park
  - Department of Computer Convergence Software, Korea University, Sejong 30019, Korea
13
Qiao Y, Clark C, Lomax S, Kong H, Su D, Sukkarieh S. Automated Individual Cattle Identification Using Video Data: A Unified Deep Learning Architecture Approach. Front Anim Sci 2021. [DOI: 10.3389/fanim.2021.759147]
Abstract
Individual cattle identification is a prerequisite and foundation for precision livestock farming. Existing methods for cattle identification require radio frequency or visual ear tags, all of which are prone to loss or damage. Here, we propose and implement a new unified deep learning approach to cattle identification using video analysis. The proposed deep learning framework is composed of a Convolutional Neural Network (CNN) and Bidirectional Long Short-Term Memory (BiLSTM) with a self-attention mechanism. More specifically, the Inception-V3 CNN was used to extract features from a cattle video dataset captured from the rear view in a feedlot. Extracted features were then fed to a BiLSTM layer to capture spatio-temporal information. Then, self-attention was employed to provide a different focus on the features captured by BiLSTM for the final step of cattle identification. We used a total of 363 rear-view videos from 50 cattle at three different times with an interval of 1 month between data collection periods. The proposed method achieved 93.3% identification accuracy using a 30-frame video length, which outperformed current state-of-the-art methods (Inception-V3, MLP, SimpleRNN, LSTM, and BiLSTM). Furthermore, two different attention schemes, namely additive and multiplicative attention mechanisms, were compared. Our results show that the additive attention mechanism achieved 93.3% accuracy and 91.0% recall, compared with 90.7% accuracy and 87.0% recall for the multiplicative attention mechanism. Video length also impacted accuracy, with video sequence lengths of up to 30 frames enhancing identification performance. Overall, our approach can capture key spatio-temporal features to improve cattle identification accuracy, enabling automated cattle identification for precision livestock farming.
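The two attention scores the study compares differ only in how a query is matched against the keys: multiplicative attention uses a dot product, while additive attention passes the combined query and key through a small learned network. The sketch below contrasts the two with fixed toy weights standing in for the learned parameters; it is illustrative, not the authors' model.

```python
# Sketch contrasting multiplicative (dot-product) and additive attention scores.
# Toy vectors; w_q/w_k stand in for learned additive-attention parameters.
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def multiplicative_scores(query, keys):
    """Dot-product scoring of query against each key, normalized by softmax."""
    return softmax([sum(q * k for q, k in zip(query, key)) for key in keys])

def additive_scores(query, keys, w_q=0.5, w_k=0.5):
    """tanh(w_q*q + w_k*k) summed over dimensions stands in for the learned MLP."""
    return softmax([
        sum(math.tanh(w_q * q + w_k * k) for q, k in zip(query, key))
        for key in keys
    ])

query = [1.0, 0.0]
keys = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
print(multiplicative_scores(query, keys))
print(additive_scores(query, keys))
```

In the paper's pipeline these weights would reweight BiLSTM frame features before the final identification layer.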
14
Detecting Animal Contacts - A Deep Learning-Based Pig Detection and Tracking Approach for the Quantification of Social Contacts. Sensors (Basel) 2021; 21:7512. [PMID: 34833588; PMCID: PMC8619108; DOI: 10.3390/s21227512]
Abstract
The identification of social interactions is of fundamental importance for animal behavioral studies, addressing numerous problems such as investigating the influence of social hierarchical structures or the drivers of agonistic behavioral disorders. However, most previous studies rely on manual determination of the number and types of social encounters by direct observation, which requires substantial personnel and financial effort. To overcome this limitation and increase research efficiency and, thus, contribute to animal welfare in the long term, we propose in this study a framework for the automated identification of social contacts. In this framework, we apply a convolutional neural network (CNN) to detect the location and orientation of pigs within a video and track their movement trajectories over a period of time using a Kalman filter (KF) algorithm. Based on the tracking information, we automatically identify social contacts in the form of head–head and head–tail contacts. Moreover, by using the individual animal IDs, we construct a network of social contacts as the final output. We evaluated the performance of our framework on two distinct test sets for pig detection and tracking, achieving a sensitivity, precision, and F1-score of 94.2%, 95.4%, and 95.1%, respectively, and a MOTA score of 94.4%. The findings of this study demonstrate the effectiveness of our keypoint-based tracking-by-detection strategy, which can be applied to enhance animal monitoring systems.
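Once per-pig head and tail keypoints are tracked, the two contact types described above reduce to pairwise distance checks. A minimal sketch, with hypothetical coordinates in metres and an illustrative contact threshold rather than the authors' calibrated values:

```python
import numpy as np

def social_contacts(heads, tails, thresh=0.3):
    """Classify pairwise contacts from tracked keypoints.

    heads, tails: (N, 2) arrays of head/tail positions for N pigs.
    A head-head contact is two heads closer than `thresh`; a head-tail
    contact is pig i's head near pig j's tail.
    """
    n = len(heads)
    contacts = []
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # head-head is symmetric, so count each pair once (i < j)
            if i < j and np.linalg.norm(heads[i] - heads[j]) < thresh:
                contacts.append(("head-head", i, j))
            # head-tail is directed: i's head touching j's tail
            if np.linalg.norm(heads[i] - tails[j]) < thresh:
                contacts.append(("head-tail", i, j))
    return contacts

heads = np.array([[0.0, 0.0], [0.2, 0.0], [5.0, 5.0]])
tails = np.array([[1.0, 0.0], [1.2, 0.0], [6.0, 5.0]])
out = social_contacts(heads, tails)
```

Aggregating such contact tuples per animal ID over time yields the social-contact network the framework produces as its final output.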
|
15
|
Borges Oliveira DA, Ribeiro Pereira LG, Bresolin T, Pontes Ferreira RE, Reboucas Dorea JR. A review of deep learning algorithms for computer vision systems in livestock. Livest Sci 2021. [DOI: 10.1016/j.livsci.2021.104700] [Citation(s) in RCA: 14] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/14/2022]
|
16
|
Tzanidakis C, Simitzis P, Arvanitis K, Panagakis P. An overview of the current trends in precision pig farming technologies. Livest Sci 2021. [DOI: 10.1016/j.livsci.2021.104530] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/01/2023]
|
17
|
Racewicz P, Ludwiczak A, Skrzypczak E, Składanowska-Baryza J, Biesiada H, Nowak T, Nowaczewski S, Zaborowicz M, Stanisz M, Ślósarz P. Welfare Health and Productivity in Commercial Pig Herds. Animals (Basel) 2021; 11:1176. [PMID: 33924224 PMCID: PMC8074599 DOI: 10.3390/ani11041176] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/15/2021] [Revised: 04/16/2021] [Accepted: 04/17/2021] [Indexed: 12/02/2022] Open
Abstract
In recent years, there have been very dynamic changes in both pork production and pig breeding technology around the world. The general trend of increasing the efficiency of pig production with reduced labour requires optimisation and a comprehensive approach to herd management. One of the most important elements on the way to achieving this goal is maintaining animal welfare and health. The health of the pigs on the farm is also a key aspect of production economics. The need to maintain a high health status of pig herds, by reducing the incidence of various diseases and the need for antimicrobial substances, is part of a broader strategy for managing high-potential herds. Thanks to the use of sensors (cameras, microphones, accelerometers, or radio-frequency identification transponders), the images, sounds, movements, and vital signs of animals are combined through algorithms and analysed for non-invasive monitoring of animals, which allows for early detection of diseases, improves their welfare, and increases the productivity of breeding. Automated, innovative early warning systems based on continuous monitoring of specific physiological (e.g., body temperature) and behavioural parameters can provide an alternative to direct diagnosis and visual assessment by the veterinarian or the herd keeper.
Affiliation(s)
- Przemysław Racewicz
- Laboratory of Veterinary Public Health Protection, Department of Animal Breeding and Product Quality Assessment, Poznan University of Life Sciences, Słoneczna 1, 62-002 Suchy Las, Poland
- Agnieszka Ludwiczak
- Department of Animal Breeding and Product Quality Assessment, Poznan University of Life Sciences, Słoneczna 1, 62-002 Suchy Las, Poland
- Ewa Skrzypczak
- Department of Animal Breeding and Product Quality Assessment, Poznan University of Life Sciences, Słoneczna 1, 62-002 Suchy Las, Poland
- Joanna Składanowska-Baryza
- Department of Animal Breeding and Product Quality Assessment, Poznan University of Life Sciences, Słoneczna 1, 62-002 Suchy Las, Poland
- Hanna Biesiada
- Laboratory of Veterinary Public Health Protection, Department of Animal Breeding and Product Quality Assessment, Poznan University of Life Sciences, Słoneczna 1, 62-002 Suchy Las, Poland
- Tomasz Nowak
- Department of Genetics and Animal Breeding, Animal Reproduction Laboratory, Poznan University of Life Sciences, 60-637 Poznan, Poland
- Sebastian Nowaczewski
- Department of Animal Breeding and Product Quality Assessment, Poznan University of Life Sciences, Słoneczna 1, 62-002 Suchy Las, Poland
- Maciej Zaborowicz
- Institute of Biosystems Engineering, Poznan University of Life Sciences, 60-637 Poznan, Poland
- Marek Stanisz
- Department of Animal Breeding and Product Quality Assessment, Poznan University of Life Sciences, Słoneczna 1, 62-002 Suchy Las, Poland
- Piotr Ślósarz
- Department of Animal Breeding and Product Quality Assessment, Poznan University of Life Sciences, Słoneczna 1, 62-002 Suchy Las, Poland
|
18
|
Pérez-Enciso M, Steibel JP. Phenomes: the current frontier in animal breeding. Genet Sel Evol 2021; 53:22. [PMID: 33673800 PMCID: PMC7934239 DOI: 10.1186/s12711-021-00618-1] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/08/2020] [Accepted: 02/22/2021] [Indexed: 12/13/2022] Open
Abstract
Improvements in genomic technologies have outpaced the most optimistic predictions, allowing industry-scale application of genomic selection. However, only marginal gains in genetic prediction accuracy can now be expected by increasing marker density up to sequence level, unless causative mutations are identified. We argue that some of the most scientifically disruptive and industry-relevant challenges relate to ‘phenomics’ rather than ‘genomics’. Thanks to developments in sensor technology and artificial intelligence, a wide range of analytical tools is already available and many more will be developed. We can now address some of the pressing societal demands on the industry, such as animal welfare concerns or efficiency in the use of resources. From the statistical and computational point of view, phenomics raises two important issues that require further work: penalization and dimension reduction. This will be complicated by the inherent heterogeneity and ‘missingness’ of the data. Overall, we can expect that precision livestock technologies will make it possible to collect hundreds of traits on a continuous basis from large numbers of animals. Perhaps the main revolution will come from redesigning animal breeding schemes to explicitly allow for high-dimensional phenomics. In the meantime, phenomics data will certainly deepen our knowledge of the biological basis of phenotypes.
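The dimension-reduction issue raised above can be illustrated with a principal-component projection of a wide phenome matrix, where traits far outnumber animals. A minimal SVD-based sketch with simulated data, not tied to any specific breeding dataset:

```python
import numpy as np

def pca_reduce(X, k):
    """Project records X (n samples x p traits) onto the top-k principal
    components via SVD; suits the p >> n shape of phenomic data."""
    Xc = X - X.mean(axis=0)                     # centre each trait
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                        # (n, k) component scores

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 500))                  # 20 animals, 500 sensor-derived traits
Z = pca_reduce(X, 3)
```

Component variances come out in decreasing order, so the first few columns of `Z` summarize most of the variation; handling the heterogeneity and missingness mentioned above would require additional imputation or penalization steps.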
Affiliation(s)
- Miguel Pérez-Enciso
- ICREA, Passeig de Lluís Companys 23, 08010 Barcelona, Spain
- Centre for Research in Agricultural Genomics (CRAG), CSIC-IRTA-UAB-UB, Bellaterra, 08193 Barcelona, Spain
- Juan P. Steibel
- Department of Animal Science, Michigan State University, East Lansing, MI 48824, USA
- Department of Fisheries and Wildlife, Michigan State University, East Lansing, MI 48824, USA
|
19
|
Li G, Huang Y, Chen Z, Chesser GD, Purswell JL, Linhoss J, Zhao Y. Practices and Applications of Convolutional Neural Network-Based Computer Vision Systems in Animal Farming: A Review. SENSORS (BASEL, SWITZERLAND) 2021; 21:1492. [PMID: 33670030 PMCID: PMC7926480 DOI: 10.3390/s21041492] [Citation(s) in RCA: 23] [Impact Index Per Article: 7.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 01/07/2021] [Revised: 02/03/2021] [Accepted: 02/19/2021] [Indexed: 01/28/2023]
Abstract
Convolutional neural network (CNN)-based computer vision systems have been increasingly applied in animal farming to improve animal management, but current knowledge, practices, limitations, and solutions of these applications remain to be expanded and explored. The objective of this study is to systematically review applications of CNN-based computer vision systems in animal farming in terms of the five deep learning computer vision tasks: image classification, object detection, semantic/instance segmentation, pose estimation, and tracking. Cattle, sheep/goats, pigs, and poultry were the major farm animal species of concern. In this research, preparations for system development, including camera settings, inclusion of variations for data recordings, choices of graphics processing units, image preprocessing, and data labeling, were summarized. CNN architectures were reviewed based on the computer vision tasks in animal farming. Strategies of algorithm development included distribution of development data, data augmentation, hyperparameter tuning, and selection of evaluation metrics. Criteria for judging model performance, and performance differences across architectures, were discussed. Besides practices for optimizing CNN-based computer vision systems, system applications were also organized by year, country, animal species, and purpose. Finally, recommendations for future research were provided to develop and improve CNN-based computer vision systems for improved welfare, environment, engineering, genetics, and management of farm animals.
Affiliation(s)
- Guoming Li
- Department of Agricultural and Biological Engineering, Mississippi State University, Starkville, MS 39762, USA
- Yanbo Huang
- Agricultural Research Service, Genetics and Sustainable Agriculture Unit, United States Department of Agriculture, Starkville, MS 39762, USA
- Zhiqian Chen
- Department of Computer Science and Engineering, Mississippi State University, Starkville, MS 39762, USA
- Gary D. Chesser
- Department of Agricultural and Biological Engineering, Mississippi State University, Starkville, MS 39762, USA
- Joseph L. Purswell
- Agricultural Research Service, Poultry Research Unit, United States Department of Agriculture, Starkville, MS 39762, USA
- John Linhoss
- Department of Agricultural and Biological Engineering, Mississippi State University, Starkville, MS 39762, USA
- Yang Zhao
- Department of Animal Science, The University of Tennessee, Knoxville, TN 37996, USA
|
20
|
Investigation of Pig Activity Based on Video Data and Semi-Supervised Neural Networks. AGRIENGINEERING 2020. [DOI: 10.3390/agriengineering2040039] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
The activity level of pigs is an important stress indicator that can be associated with tail-biting, a major animal welfare issue for domestic pigs in conventional housing systems. Although considering animal activity could be essential to detect tail-biting before an outbreak occurs, it is often assessed manually and is therefore labor-intensive, costly, and impracticable on a commercial scale. Recent advances in semi- and unsupervised convolutional neural networks (CNNs) have made them the state-of-the-art technology for detecting anomalous behavior patterns in a variety of complex scene environments. In this study we apply such a CNN for anomaly detection to identify varying levels of activity in a multi-pen problem setup. Applying a two-stage approach, we first trained the CNN to detect anomalies in the form of extreme activity behavior. Second, we trained a classifier to categorize the detected anomaly scores by learning the potential activity range of each pen. We evaluated our framework by analyzing 82 manually rated videos and achieved a success rate of 91%. Furthermore, we compared our model with a motion history image (MHI) approach and a binary image approach using two benchmark data sets, i.e., the well-established pedestrian data sets published by the University of California, San Diego (UCSD) and our pig data set. The results show the effectiveness of our framework, which can be applied without the need for a labor-intensive manual annotation process and can be utilized for the assessment of pig activity in a variety of applications, such as early warning systems to detect changes in the state of health.
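The second stage described above, mapping raw anomaly scores to activity categories calibrated per pen, can be sketched with simple per-pen quantile cutoffs. The quantiles below are illustrative assumptions, not the study's learned classifier:

```python
import numpy as np

def activity_levels(scores, low_q=0.33, high_q=0.66):
    """Map one pen's raw anomaly scores to low/medium/high activity.

    Each pen's own score distribution sets the cutoffs, so pens with
    different camera views or pig counts are calibrated independently.
    """
    lo, hi = np.quantile(scores, [low_q, high_q])
    return ["low" if s <= lo else "high" if s >= hi else "medium" for s in scores]

levels = activity_levels(list(range(1, 10)))   # toy scores for one pen
```

In a deployed early-warning system, a sustained shift of a pen's labels toward "high" (or "low") would be the trigger for closer inspection.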
|
21
|
|
22
|
Fuentes S, Gonzalez Viejo C, Chauhan SS, Joy A, Tongson E, Dunshea FR. Non-Invasive Sheep Biometrics Obtained by Computer Vision Algorithms and Machine Learning Modeling Using Integrated Visible/Infrared Thermal Cameras. SENSORS (BASEL, SWITZERLAND) 2020; 20:E6334. [PMID: 33171995 PMCID: PMC7664231 DOI: 10.3390/s20216334] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 09/26/2020] [Revised: 11/02/2020] [Accepted: 11/04/2020] [Indexed: 01/05/2023]
Abstract
Live sheep export has become a public concern. This study aimed to test a non-contact biometric system based on artificial intelligence to assess heat stress in sheep, for potential use in automated animal welfare assessment on farms and during transport. Skin temperatures (°C) of head features were extracted from infrared thermal videos (IRTV) using automated tracking algorithms. Two parameter engineering procedures were performed on RGB videos to assess heart rate (HR) in beats per minute (BPM) and respiration rate (RR) in breaths per minute (BrPM): (i) using changes in luminosity of the green (G) channel and (ii) using changes in the green-to-red (a) channel of the CIELAB color scale. A supervised machine learning (ML) classification model was developed using raw RR parameters as inputs to classify cutoff frequencies for low, medium, and high respiration rates (Model 1). A supervised ML regression model was then developed using raw HR and RR parameters from Model 1 (Model 2). Results showed that Models 1 and 2 were highly accurate in the estimation of RR frequency level, with 96% overall accuracy (Model 1), and of HR and RR, with R = 0.94 and slope = 0.76 (Model 2), without statistical signs of overfitting.
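The rate-extraction idea above (a periodic luminosity signal recovered from RGB video) can be sketched as a spectral-peak search. The sampling rate, duration, and frequency band below are illustrative assumptions, not the study's values:

```python
import numpy as np

def rate_per_minute(sig, fps, band=(0.1, 2.0)):
    """Dominant frequency (in cycles per minute) of a 1-D luminosity signal.

    `band` restricts the search to a plausible physiological range in Hz;
    a respiration-like signal peaks low, a heart-rate-like one higher.
    """
    sig = sig - np.mean(sig)                        # remove DC offset
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fps)
    power = np.abs(np.fft.rfft(sig)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return 60.0 * freqs[mask][power[mask].argmax()]

fps = 30
t = np.arange(10 * fps) / fps                       # 10 s of video frames
sig = np.sin(2 * np.pi * 0.5 * t)                   # synthetic 0.5 Hz breathing signal
rr = rate_per_minute(sig, fps)
```

Real video signals are far noisier than this synthetic sine, which is why the study layers ML models on top of the raw frequency parameters rather than reading the peak directly.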
Affiliation(s)
- Sigfredo Fuentes
- Digital Agriculture, Food and Wine Sciences Group, School of Agriculture and Food, Faculty of Veterinary and Agricultural Sciences, The University of Melbourne, Parkville, VIC 3010, Australia
- Claudia Gonzalez Viejo
- Digital Agriculture, Food and Wine Sciences Group, School of Agriculture and Food, Faculty of Veterinary and Agricultural Sciences, The University of Melbourne, Parkville, VIC 3010, Australia
- Surinder S. Chauhan
- Animal Nutrition and Physiology, Faculty of Veterinary and Agricultural Sciences, The University of Melbourne, Parkville, VIC 3010, Australia
- Aleena Joy
- Animal Nutrition and Physiology, Faculty of Veterinary and Agricultural Sciences, The University of Melbourne, Parkville, VIC 3010, Australia
- Eden Tongson
- Digital Agriculture, Food and Wine Sciences Group, School of Agriculture and Food, Faculty of Veterinary and Agricultural Sciences, The University of Melbourne, Parkville, VIC 3010, Australia
- Frank R. Dunshea
- Animal Nutrition and Physiology, Faculty of Veterinary and Agricultural Sciences, The University of Melbourne, Parkville, VIC 3010, Australia
- Faculty of Biological Sciences, The University of Leeds, Leeds LS2 9JT, UK
|
23
|
Fernandes AFA, Dórea JRR, Rosa GJDM. Image Analysis and Computer Vision Applications in Animal Sciences: An Overview. Front Vet Sci 2020; 7:551269. [PMID: 33195522 PMCID: PMC7609414 DOI: 10.3389/fvets.2020.551269] [Citation(s) in RCA: 29] [Impact Index Per Article: 7.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/28/2020] [Accepted: 09/15/2020] [Indexed: 11/13/2022] Open
Abstract
Computer Vision, Digital Image Processing, and Digital Image Analysis can be viewed as an amalgam of terms that very often are used to describe similar processes. Most of this confusion arises because these are interconnected fields that emerged with the development of digital image acquisition. Thus, there is a need to understand the connection between these fields, how a digital image is formed, and the differences regarding the many sensors available, each best suited for different applications. From the advent of the charge-coupled device, demarking the birth of digital imaging, the field has advanced quite fast. Sensors have evolved from grayscale to color, with increasingly higher resolution and better performance. Many other sensors have also appeared, such as infrared cameras, stereo imaging, time-of-flight sensors, satellite, and hyperspectral imaging. There are also images generated by other signals, such as sound (ultrasound scanners and sonars) and radiation (standard x-ray and computed tomography), which are widely used to produce medical images. In animal and veterinary sciences, these sensors have been used in many applications, mostly under experimental conditions, with only a few applications yet deployed on commercial farms. Such applications can range from the assessment of beef cut composition to live animal identification, tracking, behavior monitoring, and measurement of phenotypes of interest, such as body weight, condition score, and gait. Computer vision systems (CVS) have the potential to be used in precision livestock farming and high-throughput phenotyping applications. We believe that the constant measurement of traits through CVS can reduce management costs and optimize decision-making in livestock operations, in addition to opening new possibilities in selective breeding. Applications of CVS are currently a growing research area and there are already commercial products available. However, there are still challenges that demand research for the successful development of autonomous solutions capable of delivering critical information. This review intends to present significant developments that have been made in CVS applications in animal and veterinary sciences and to highlight areas in which further research is still needed before full deployment of CVS in breeding programs and commercial farms.
Affiliation(s)
- Guilherme Jordão de Magalhães Rosa
- Department of Animal and Dairy Sciences, University of Wisconsin-Madison, Madison, WI, United States
- Department of Biostatistics and Medical Informatics, University of Wisconsin-Madison, Madison, WI, United States
|
24
|
Alameer A, Kyriazakis I, Bacardit J. Automated recognition of postures and drinking behaviour for the detection of compromised health in pigs. Sci Rep 2020; 10:13665. [PMID: 32788633 PMCID: PMC7423952 DOI: 10.1038/s41598-020-70688-6] [Citation(s) in RCA: 19] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/04/2020] [Accepted: 07/30/2020] [Indexed: 11/12/2022] Open
Abstract
Changes in pig behaviours are a useful aid in detecting early signs of compromised health and welfare. In commercial settings, automatic detection of pig behaviours through visual imaging remains a challenge due to demanding farm conditions, e.g., occlusion of one pig by another. Here, two deep learning-based detector methods were developed to identify pig postures and drinking behaviours of group-housed pigs. We first tested the system's ability to detect changes in these measures at group level during routine management. We then demonstrated the ability of our automated methods to identify behaviours of individual animals with a mean average precision of 0.989 ± 0.009, under a variety of settings. When the pig feeding regime was disrupted, we automatically detected the expected deviations from the daily feeding routine in standing, lateral lying and drinking behaviours. These experiments demonstrate that the method is capable of robustly and accurately monitoring individual pig behaviours under commercial conditions, without the need for additional sensors or individual pig identification, hence providing a scalable technology to improve the health and well-being of farm animals. The method has the potential to transform how livestock are monitored and to address issues in livestock farming, such as targeted treatment of individuals with medication.
Affiliation(s)
- Ali Alameer
- School of Natural and Environmental Sciences, Newcastle University, Newcastle Upon Tyne, NE1 7RU, UK
- School of Computing, Newcastle University, Newcastle Upon Tyne, NE4 5TG, UK
- Ilias Kyriazakis
- Institute for Global Food Security, Queen's University, Belfast, BT9 5DL, UK
- Jaume Bacardit
- School of Computing, Newcastle University, Newcastle Upon Tyne, NE4 5TG, UK
|
25
|
Editorial: Special Issue "Emerging Sensor Technology in Agriculture". SENSORS 2020; 20:s20143827. [PMID: 32659990 PMCID: PMC7411579 DOI: 10.3390/s20143827] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Subscribe] [Scholar Register] [Received: 07/03/2020] [Accepted: 07/06/2020] [Indexed: 11/16/2022]
Abstract
Research and innovation activities in the area of sensor technology can accelerate the adoption of new and emerging digital tools in the agricultural sector by the implementation of precision farming practices such as remote sensing, operations, and real-time monitoring [...].
|
26
|
Brünger J, Gentz M, Traulsen I, Koch R. Panoptic Segmentation of Individual Pigs for Posture Recognition. SENSORS (BASEL, SWITZERLAND) 2020; 20:E3710. [PMID: 32630794 PMCID: PMC7374502 DOI: 10.3390/s20133710] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 05/21/2020] [Revised: 06/23/2020] [Accepted: 06/29/2020] [Indexed: 11/30/2022]
Abstract
Behavioural research on pigs can be greatly simplified if automatic recognition systems are used. Systems based on computer vision in particular have the advantage that they allow an evaluation without affecting the normal behaviour of the animals. In recent years, methods based on deep learning have been introduced and have shown excellent results. Object and keypoint detectors have frequently been used to detect individual animals. Despite promising results, bounding boxes and sparse keypoints do not trace the contours of the animals, resulting in much information being lost. Therefore, this paper follows the relatively new approach of panoptic segmentation and aims at the pixel-accurate segmentation of individual pigs. A framework consisting of a neural network for semantic segmentation as well as different network heads and postprocessing methods is discussed. The method was tested on a data set of 1000 hand-labeled images created specifically for this experiment and achieves detection rates of around 95% (F1 score) despite disturbances such as occlusions and dirty lenses.
Affiliation(s)
- Johannes Brünger
- Department of Computer Science, Kiel University, 24118 Kiel, Germany
- Maria Gentz
- Department of Animal Sciences, Livestock Systems, Georg-August-University Göttingen, 37075 Göttingen, Germany
- Imke Traulsen
- Department of Animal Sciences, Livestock Systems, Georg-August-University Göttingen, 37075 Göttingen, Germany
- Reinhard Koch
- Department of Computer Science, Kiel University, 24118 Kiel, Germany
|
27
|
Psota ET, Schmidt T, Mote B, Pérez LC. Long-Term Tracking of Group-Housed Livestock Using Keypoint Detection and MAP Estimation for Individual Animal Identification. SENSORS 2020; 20:s20133670. [PMID: 32630011 PMCID: PMC7374513 DOI: 10.3390/s20133670] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 05/07/2020] [Revised: 06/08/2020] [Accepted: 06/16/2020] [Indexed: 02/05/2023]
Abstract
Tracking individual animals in a group setting is an exigent task for computer vision and animal science researchers. When the objective is months of uninterrupted tracking and the targeted animals lack discernible differences in their physical characteristics, this task introduces significant challenges. To address these challenges, a probabilistic tracking-by-detection method is proposed. The tracking method uses, as input, visible keypoints of individual animals provided by a fully-convolutional detector. Individual animals are also equipped with ear tags that are used by a classification network to assign unique identities to instances. The fixed cardinality of the targets is leveraged to create a continuous set of tracks, and the forward-backward algorithm is used to assign ear-tag identification probabilities to each detected instance. Tracking achieves real-time performance on consumer-grade hardware, in part because it does not rely on complex, costly, graph-based optimizations. A publicly available, human-annotated dataset is introduced to evaluate tracking performance. This dataset contains 15 half-hour-long videos of pigs with various ages/sizes, facility environments, and activity levels. Results demonstrate that the proposed method achieves an average precision and recall greater than 95% across the entire dataset. Analysis of the error events reveals the environmental conditions and social interactions that are most likely to cause errors in real-world deployments.
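The forward-backward step described above, which smooths per-frame ear-tag classifier scores into track-level identity probabilities, is the standard HMM posterior computation. A minimal sketch; the transition matrix and classifier scores below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def forward_backward(obs_lik, trans, prior):
    """Per-frame posteriors over K identities for one track.

    obs_lik: (T, K) ear-tag classifier likelihoods per frame
    trans:   (K, K) row-stochastic transitions (near-identity, since a
             physical track rarely changes identity)
    prior:   (K,) initial identity distribution
    """
    T, K = obs_lik.shape
    alpha = np.zeros((T, K))
    beta = np.ones((T, K))
    alpha[0] = prior * obs_lik[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):                       # forward pass (filtering)
        alpha[t] = (alpha[t - 1] @ trans) * obs_lik[t]
        alpha[t] /= alpha[t].sum()
    for t in range(T - 2, -1, -1):              # backward pass
        beta[t] = trans @ (obs_lik[t + 1] * beta[t + 1])
        beta[t] /= beta[t].sum()
    post = alpha * beta
    return post / post.sum(axis=1, keepdims=True)

K = 3
trans = np.full((K, K), 0.01) + np.eye(K) * 0.97    # rows sum to 1
prior = np.full(K, 1.0 / K)
obs = np.tile([0.8, 0.1, 0.1], (6, 1))              # classifier mostly votes ID 0
obs[2] = [0.3, 0.4, 0.3]                            # one ambiguous frame
post = forward_backward(obs, trans, prior)
```

Smoothing lets confident neighbouring frames override the ambiguous one, which is the behaviour that makes a per-frame tag classifier usable for months-long tracks.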
Affiliation(s)
- Eric T. Psota
- Department of Electrical and Computer Engineering, University of Nebraska–Lincoln, Lincoln, NE 68505, USA
- Correspondence:
- Ty Schmidt
- Department of Animal Science, University of Nebraska–Lincoln, Lincoln, NE 68588, USA
- Benny Mote
- Department of Animal Science, University of Nebraska–Lincoln, Lincoln, NE 68588, USA
- Lance C. Pérez
- Department of Electrical and Computer Engineering, University of Nebraska–Lincoln, Lincoln, NE 68505, USA
|
28
|
EmbeddedPigDet—Fast and Accurate Pig Detection for Embedded Board Implementations. APPLIED SCIENCES-BASEL 2020. [DOI: 10.3390/app10082878] [Citation(s) in RCA: 18] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Automated pig monitoring is an important issue in the surveillance environment of a pig farm. For large-scale pig farms in particular, practical issues such as monitoring cost should be considered, but solutions based on low-cost embedded boards have not yet been reported. Since low-cost embedded boards have more limited computing power than typical PCs and involve tradeoffs between execution speed and accuracy, achieving fast and accurate detection of individual pigs for “on-device” pig monitoring applications is very challenging. Therefore, in this paper, we propose a method for the fast detection of individual pigs by reducing the computational workload of the 3 × 3 convolutions in widely used, deep learning-based object detectors. Then, in order to recover the accuracy of the “light-weight” deep learning-based object detector, we generate a three-channel composite image as its input image through simple image preprocessing techniques. Our experimental results on an NVIDIA Jetson Nano embedded board show that the proposed method can improve the integrated performance, in both execution speed and accuracy, of widely used, deep learning-based object detectors by a factor of up to 8.7.
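To make the 3 × 3 workload argument concrete, the cost of a standard 3 × 3 convolution can be compared with one common light-weight substitute, a depthwise-separable factorization, by counting multiply-accumulates. This is an illustrative comparison of the general technique, not necessarily the exact factorization the paper uses:

```python
def conv_macs(h, w, cin, cout, k=3):
    """Multiply-accumulates of a standard k x k convolution layer."""
    return h * w * cin * cout * k * k

def depthwise_separable_macs(h, w, cin, cout, k=3):
    """Depthwise k x k convolution followed by a pointwise 1 x 1."""
    return h * w * cin * k * k + h * w * cin * cout

# A typical mid-network layer: 56 x 56 feature map, 64 -> 64 channels.
std = conv_macs(56, 56, 64, 64)
sep = depthwise_separable_macs(56, 56, 64, 64)
speedup = std / sep                          # theoretical workload ratio
```

On this layer shape the factorization cuts the arithmetic roughly eightfold, which is the kind of headroom an embedded board like the Jetson Nano needs; actual wall-clock gains depend on memory traffic and the board's kernel implementations.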
|
29
|
El Moataz A, Mammass D, Mansouri A, Nouboud F. A Bottom-Up Approach for Pig Skeleton Extraction Using RGB Data. LECTURE NOTES IN COMPUTER SCIENCE 2020. [PMCID: PMC7340904 DOI: 10.1007/978-3-030-51935-3_6] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Indexed: 11/24/2022]
Affiliation(s)
- Driss Mammass
- IRF-SIC, Faculty of Sciences, Ibn Zohr University, Agadir, Morocco
|
30
|
An Automatic Head Surface Temperature Extraction Method for Top-View Thermal Image with Individual Broiler. SENSORS 2019; 19:s19235286. [PMID: 31801282 PMCID: PMC6929031 DOI: 10.3390/s19235286] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 09/20/2019] [Revised: 11/05/2019] [Accepted: 11/27/2019] [Indexed: 12/18/2022]
Abstract
Surface temperature variation in a broiler’s head can be used as an indicator of its health status. Surface temperatures in existing thermography-based animal health assessment studies have mostly been obtained manually. In this study, 2185 thermal images, each containing an individual broiler, were captured from 20 broilers: 15 broilers served as the experimental group and were injected with 0.1 mL of Pasteurella inoculum, while the remaining 5 served as the control group. An algorithm was developed to extract head surface temperature automatically from top-view broiler thermal images. Adaptive K-means clustering and ellipse fitting were applied to locate the broiler’s head region, and the maximum temperature inside the head region was extracted as the head surface temperature. The developed algorithm was tested in Matlab® (R2016a), and the testing results indicated that the head region in 92.77% of the broiler thermal images could be located correctly. The maximum error of the extracted head surface temperatures was no greater than 0.1 °C. Different trend features were observed in the smoothed head surface temperature time series of the broilers in the experimental and control groups. Head surface temperature extracted by the presented algorithm lays a foundation for the development of an automatic system for febrile broiler identification.
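The pipeline above (cluster warm pixels, locate the head, take the maximum) can be reduced to a toy sketch. Here a plain two-cluster K-means on pixel temperatures stands in for the adaptive K-means and ellipse-fitting stages, with made-up temperature values:

```python
import numpy as np

def head_surface_temp(img, iters=10):
    """Maximum temperature of the warm pixel cluster in a thermal image.

    Two-means clustering separates warm (bird) pixels from the cooler
    background; the warm cluster's peak approximates the head surface
    temperature described above.
    """
    t = img.ravel().astype(float)
    c = np.array([t.min(), t.max()])        # initial centroids: coolest, hottest
    for _ in range(iters):
        lab = np.abs(t[:, None] - c[None, :]).argmin(axis=1)
        c = np.array([t[lab == k].mean() for k in (0, 1)])
    return t[lab == 1].max()                # peak of the warm cluster

img = np.array([[25.0, 25.1, 24.9],
                [25.2, 40.1, 41.0],
                [25.0, 39.5, 25.1]])        # simulated top-view frame (degrees C)
temp = head_surface_temp(img)
```

On real frames the warm cluster would include the whole bird, which is why the published method adds ellipse fitting to isolate the head before taking the maximum.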
|