1. Wutke M, Lensches C, Hartmann U, Traulsen I. Towards automatic farrowing monitoring - A Noisy Student approach for improving detection performance of newborn piglets. PLoS One 2024; 19:e0310818. PMID: 39356687; PMCID: PMC11446433; DOI: 10.1371/journal.pone.0310818.
Abstract
Video monitoring of farrowing and automatic video evaluation using deep learning have become increasingly important in farm animal science research and open up new possibilities for addressing specific research questions, such as the determination of husbandry-relevant indicators. Robust detection of newborn piglets is essential for reliably monitoring the farrowing process and for assessing important information about the welfare status of the sow and piglets. Although object detection algorithms are increasingly used in various livestock farming scenarios, their usability for detecting newborn piglets has so far been limited. Challenges such as frequent animal occlusions, high overlap rates, and strongly heterogeneous animal postures increase the complexity and place new demands on the detection model. Typically, new data is manually annotated to improve model performance, but the annotation effort is expensive and time-consuming. To address this problem, we propose a Noisy Student approach to automatically generate annotation information and train an improved piglet detection model. Using a teacher-student model relationship, we transform the image structure and generate pseudo-labels for the object classes piglet and tail. As a result, we improve the initial detection performance of the teacher model from 0.561, 0.838, and 0.672 to 0.901, 0.944, and 0.922 for the performance metrics recall, precision, and F1-score, respectively. The results of this study can be used in two ways. First, they contribute directly to the improvement of piglet detection in the context of birth monitoring systems and the evaluation of farrowing progress. Second, the approach presented can be transferred to other research questions and species, thereby reducing the problem of cost-intensive annotation processes and increasing training efficiency. In addition, we provide a unique dataset for the detection and evaluation of newborn piglets and sow body parts to support researchers in the task of monitoring the farrowing process.
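The teacher-student pseudo-labelling loop the abstract describes can be sketched in a few lines. This is a minimal illustration, not the authors' implementation; `teacher_predict` and the `Detection` record are hypothetical stand-ins for a trained detector.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str    # object class, e.g. "piglet" or "tail"
    score: float  # teacher confidence in [0, 1]

def pseudo_label(teacher_predict, unlabeled_images, min_score=0.8):
    """Run the teacher on unlabeled frames; keep only confident detections.

    Frames with no confident detection are discarded so the student is
    not trained on empty or uncertain pseudo-labels.
    """
    pseudo_set = []
    for image in unlabeled_images:
        kept = [d for d in teacher_predict(image) if d.score >= min_score]
        if kept:
            pseudo_set.append((image, kept))
    return pseudo_set
```

The student would then be trained on the union of the hand-labelled set and `pseudo_set`, typically under strong augmentation (the "noise"), and can itself serve as the next teacher.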
Affiliation(s)
- Martin Wutke
- Institute of Animal Breeding and Husbandry, Faculty of Agricultural and Nutritional Sciences, University of Kiel, Kiel, Germany
- Faculty of Agriculture, South Westphalia University of Applied Sciences, Soest, Germany
- Clara Lensches
- Department of Animal Sciences, Georg-August University, Göttingen, Germany
- Ulrich Hartmann
- Chamber of Agriculture Lower Saxony, Division Agriculture, Oldenburg, Germany
- Imke Traulsen
- Institute of Animal Breeding and Husbandry, Faculty of Agricultural and Nutritional Sciences, University of Kiel, Kiel, Germany
- Department of Animal Sciences, Georg-August University, Göttingen, Germany
2. Yang Q, Hui X, Huang Y, Chen M, Huang S, Xiao D. A Long-Term Video Tracking Method for Group-Housed Pigs. Animals (Basel) 2024; 14:1505. PMID: 38791722; PMCID: PMC11117257; DOI: 10.3390/ani14101505.
Abstract
Pig tracking provides strong support for refined management in pig farms. However, long, continuous multi-pig tracking is still extremely challenging due to occlusion, distortion, and motion blur in real farming scenarios. This study proposes a long-term video tracking method for group-housed pigs based on improved StrongSORT, which can significantly improve the performance of pig tracking in production scenarios. In addition, this research constructs a 24 h pig tracking video dataset, providing a basis for exploring the effectiveness of long-term tracking algorithms. For object detection, a lightweight pig detection network, YOLO v7-tiny_Pig, improved from YOLO v7-tiny, is proposed to reduce model parameters and improve detection speed. To address the target association problem, the trajectory management method of StrongSORT is optimized according to the characteristics of the pig tracking task to reduce tracking identity (ID) switching and improve the stability of the algorithm. The experimental results show that YOLO v7-tiny_Pig retains detection applicability while reducing parameters by 36.7% compared to YOLO v7-tiny and achieving an average video detection speed of 435 frames per second. In terms of pig tracking, Higher-Order Tracking Accuracy (HOTA), Multiple Object Tracking Precision (MOTP), and Identification F1 (IDF1) scores reach 83.16%, 97.6%, and 91.42%, respectively. Compared with the original StrongSORT algorithm, HOTA and IDF1 are improved by 6.19% and 10.89%, respectively, and the number of identity switches (IDSW) is reduced by 69%. Our algorithm can achieve continuous tracking of pigs in real scenarios for up to 24 h. This method provides technical support for non-contact automatic pig monitoring.
Affiliation(s)
- Qiumei Yang, Xiangyang Hui, Yigui Huang, Miaobin Chen, Senpeng Huang, Deqin Xiao
- College of Mathematics and Informatics, South China Agricultural University, Guangzhou 510642, China
- Key Laboratory of Smart Agricultural Technology in Tropical South China, Ministry of Agriculture and Rural Affairs, Guangzhou 510642, China
3. Mora M, Piles M, David I, Rosa GJM. Integrating computer vision algorithms and RFID system for identification and tracking of group-housed animals: an example with pigs. J Anim Sci 2024; 102:skae174. PMID: 38908015; PMCID: PMC11245691; DOI: 10.1093/jas/skae174.
Abstract
Precision livestock farming aims to individually and automatically monitor animal activity to ensure animal health, well-being, and productivity. Computer vision has emerged as a promising tool for this purpose. However, accurately tracking individuals using imaging remains challenging, especially in group housing where animals may have similar appearances. Close interaction or crowding among animals can lead to the loss or swapping of animal IDs, compromising tracking accuracy. To address this challenge, we implemented a framework combining a tracking-by-detection method with a radio frequency identification (RFID) system. We tested this approach using twelve pigs in a single pen as an illustrative example. Three of the pigs had distinctive natural coat markings, enabling their visual identification within the group. The remaining pigs either shared similar coat color patterns or were entirely white, making them visually indistinguishable from each other. We employed the latest version of You Only Look Once (YOLOv8) and BoT-SORT for detection and tracking, respectively. YOLOv8 was fine-tuned with a dataset of 3,600 images to detect and classify different pig classes, achieving a mean average precision of 99% across all classes. The fine-tuned YOLOv8 model and the BoT-SORT tracker were then applied to a 166.7-min video comprising 100,018 frames. Results showed that pigs with distinguishable coat color markings could be tracked 91% of the time on average. For pigs with similar coat color, the RFID system was used to identify individual animals when they entered the feeding station, and this RFID identification was linked to the image trajectory of each pig, both backward and forward. The two pigs with similar markings could be tracked for an average of 48.6 min, while the seven white pigs could be tracked for an average of 59.1 min. In all cases, the tracking time assigned to each pig matched the ground truth 90% of the time or more. Thus, our proposed framework enabled reliable tracking of group-housed pigs for extended periods, offering a promising alternative to the independent use of image or RFID approaches alone. This approach represents a significant step forward in combining multiple devices for animal identification, tracking, and traceability, particularly when homogeneous animals are kept in groups.
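The linking step described above (assigning an RFID identity read at the feeding station to an image trajectory) can be sketched as a nearest-trajectory match. This is a minimal illustration with hypothetical data structures, not the authors' code.

```python
import math

def link_rfid_to_tracks(tracklets, rfid_events, max_dist=50.0):
    """Assign RFID tags to tracker trajectories.

    tracklets: {track_id: {frame: (x, y)}} produced by the tracker.
    rfid_events: [(frame, tag, (x, y))] where (x, y) is the position of
    the feeding-station antenna at which the tag was read.

    Each RFID read is matched to the tracklet closest to the station at
    that frame; the identity then propagates backward and forward over
    the whole tracklet.
    """
    identities = {}
    for frame, tag, station in rfid_events:
        best_id, best_dist = None, max_dist
        for track_id, trajectory in tracklets.items():
            if frame in trajectory:
                d = math.dist(trajectory[frame], station)
                if d < best_dist:
                    best_id, best_dist = track_id, d
        if best_id is not None:
            identities[best_id] = tag
    return identities
```

The `max_dist` gate keeps a read from being attached to a pig that merely passed near the feeder; in practice it would be tuned to the pen geometry and camera calibration.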
Affiliation(s)
- Mónica Mora
- Institute of Agrifood Research and Technology (IRTA), Animal Breeding and Genetics, Barcelona 08140, Spain
- Department of Animal and Dairy Sciences, University of Wisconsin-Madison, Madison, WI 53706, USA
- Miriam Piles
- Institute of Agrifood Research and Technology (IRTA), Animal Breeding and Genetics, Barcelona 08140, Spain
- Ingrid David
- GenPhySE, Université de Toulouse, INRAE, ENVT, Castanet Tolosan 31326, France
- Guilherme J M Rosa
- Department of Animal and Dairy Sciences, University of Wisconsin-Madison, Madison, WI 53706, USA
4. Doornweerd JE, Veerkamp RF, de Klerk B, van der Sluis M, Bouwman AC, Ellen ED, Kootstra G. Tracking individual broilers on video in terms of time and distance. Poult Sci 2024; 103:103185. PMID: 37980741; PMCID: PMC10663953; DOI: 10.1016/j.psj.2023.103185.
Abstract
Tracking group-housed individual broilers using video can provide valuable information on their health, welfare, and performance, allowing breeders to identify novel or indicator traits that aid genetic improvement. However, their similar appearances make tracking individual broilers in a group-housed setting challenging. This study aimed to analyze broiler tracking on video (number of ID switches, tracking time and distance) and examined potential tracking errors (ID losses - location, proximity, kinematics) in an experimental pen to enable broiler locomotion phenotyping. This comprehensive analysis provided insights into the potential and challenges of tracking group-housed broilers on video with regard to phenotyping broiler locomotion. Thirty-nine broilers, 35 of which were not color-marked, were housed in an experimental pen (1.80 × 2.61 m), and only data at 18 d of age were used. A YOLOv7-tiny model was trained (n = 140), validated (n = 30), and tested (n = 30) on 200 annotated frames to detect the broilers. On the test set, YOLOv7-tiny had a precision, recall, and average precision (at an Intersection over Union threshold of 0.5) of 0.99. A multi-object tracker (SORT) was implemented and evaluated on ground-truth trajectories of thirteen white broilers based on 136 min of video data (1-min intervals). The number of ID switches varied from 5 to 20 (mean: 9.92) per ground-truth trajectory, tracking times ranged from 1 (by definition) to 51 min (mean: 12.36), and tracking distances ranged from 0.01 to 17.07 meters (mean: 1.89) per tracklet. Tracking errors primarily occurred when broilers were occluded by the drinker, and relatively frequently when broilers were in close proximity (within 10 cm); velocity and acceleration appeared to have a lesser impact on tracking errors. The study establishes a baseline for future research and identified the potential and challenges of tracking group-housed individual broilers. The results highlighted the importance of addressing ID switches, identified potential tracking algorithm improvements, and emphasized the need for an external animal identification system to enable objective, simultaneous, and semi-continuous locomotion phenotyping of group-housed individual broilers.
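The SORT-style frame-to-frame association that produces these tracklets comes down to matching detections to predicted track boxes by Intersection over Union (IoU). The sketch below uses greedy matching for brevity; SORT itself matches against Kalman-predicted boxes with the Hungarian algorithm, so treat this as an illustrative simplification, not the paper's implementation.

```python
def iou(a, b):
    """IoU of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def associate(tracks, detections, min_iou=0.3):
    """Greedily match track boxes {track_id: box} to new detection boxes.

    Returns (matches {track_id: det_index}, unmatched detection indices).
    Unmatched detections would start new tracks; unmatched tracks age out.
    """
    pairs = sorted(((iou(box, det), tid, i)
                    for tid, box in tracks.items()
                    for i, det in enumerate(detections)), reverse=True)
    matches, used_tracks, used_dets = {}, set(), set()
    for score, tid, i in pairs:
        if score < min_iou:
            break
        if tid not in used_tracks and i not in used_dets:
            matches[tid] = i
            used_tracks.add(tid)
            used_dets.add(i)
    return matches, [i for i in range(len(detections)) if i not in used_dets]
```

An ID switch in this framing is simply a track being matched to the wrong bird's detection, which is why occlusion by the drinker and close proximity (overlapping boxes) dominate the error cases.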
Affiliation(s)
- J E Doornweerd
- Animal Breeding and Genomics, Wageningen University & Research, 6700 AH Wageningen, the Netherlands
- R F Veerkamp
- Animal Breeding and Genomics, Wageningen University & Research, 6700 AH Wageningen, the Netherlands
- B de Klerk
- Research & Development, Cobb Europe BV, 5831 GH Boxmeer, the Netherlands
- M van der Sluis
- Animal Breeding and Genomics, Wageningen University & Research, 6700 AH Wageningen, the Netherlands
- A C Bouwman
- Animal Breeding and Genomics, Wageningen University & Research, 6700 AH Wageningen, the Netherlands
- E D Ellen
- Animal Breeding and Genomics, Wageningen University & Research, 6700 AH Wageningen, the Netherlands
- G Kootstra
- Farm Technology, Wageningen University & Research, 6700 AA Wageningen, the Netherlands
5. Hao W, Zhang L, Han M, Zhang K, Li F, Yang G, Liu Z. YOLOv5-SA-FC: A Novel Pig Detection and Counting Method Based on Shuffle Attention and Focal Complete Intersection over Union. Animals (Basel) 2023; 13:3201. PMID: 37893925; PMCID: PMC10603737; DOI: 10.3390/ani13203201.
Abstract
The efficient detection and counting of pig populations is critical for the promotion of intelligent breeding. Traditional methods for pig detection and counting mainly rely on manual labor, which is time-consuming and inefficient or lacks sufficient detection accuracy. To address these issues, a novel model for pig detection and counting based on YOLOv5 enhanced with shuffle attention (SA) and Focal-CIoU (FC) is proposed in this paper, which we call YOLOv5-SA-FC. The SA attention module in this model enables multi-channel information fusion with almost no additional parameters, enhancing the richness and robustness of feature extraction. Furthermore, the Focal-CIoU localization loss helps to reduce the impact of sample imbalance on the detection results, improving the overall performance of the model. In the experiments, the proposed YOLOv5-SA-FC model achieved a mean average precision (mAP) and count accuracy of 93.8% and 95.6%, outperforming other methods in pig detection and counting by 10.2% and 15.8%, respectively. These findings verify the effectiveness of the proposed YOLOv5-SA-FC model for pig population detection and counting in the context of intelligent pig breeding.
Affiliation(s)
- Guoqiang Yang
- School of Software, Shanxi Agricultural University, Jingzhong 030801, China; (W.H.); (L.Z.); (M.H.); (K.Z.); (F.L.); (Z.L.)
6. Kapun A, Adrion F, Gallmann E. Evaluating the Activity of Pigs with Radio-Frequency Identification and Virtual Walking Distances. Animals (Basel) 2023; 13:3112. PMID: 37835719; PMCID: PMC10571748; DOI: 10.3390/ani13193112.
Abstract
Monitoring the activity of animals can help with assessing their health status. We monitored the walking activity of fattening pigs using an ultra-high-frequency (UHF) RFID system. Four hundred fattening pigs with UHF-RFID ear tags were recorded by RFID antennas at the troughs, play devices, and drinkers during the fattening period. A minimum walking distance, or virtual walking distance, was determined for each pig per day by calculating the distances between consecutive reading areas. This automatically calculated value was used as an activity measure and revealed differences not only between pigs but also between fattening stages: the longer the fattening period lasted, the less walking activity was detected. The virtual walking distance ranged from 281 m on average in the first fattening stage to about 141 m in the last fattening stage in a restricted environment. The findings are similar to those of other studies on walking distances of fattening pigs, but the method is far less labor-intensive and time-consuming than direct observation.
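The virtual walking distance described above reduces to summing straight-line distances between consecutive, distinct reading areas. A minimal sketch with hypothetical antenna coordinates (not the study's software):

```python
import math

def virtual_walking_distance(readings, antenna_positions):
    """Minimum daily walking distance for one pig from RFID reads.

    readings: chronological list of (timestamp, antenna_id) reads for one
    pig and day. antenna_positions: {antenna_id: (x, y)} in metres.
    Consecutive reads at the same antenna add no distance.
    """
    total = 0.0
    for (_, here), (_, there) in zip(readings, readings[1:]):
        if here != there:
            total += math.dist(antenna_positions[here],
                               antenna_positions[there])
    return total
```

Because the straight line between two antennas is a lower bound on the path actually walked, the result is a minimum distance, which is why the study calls it "virtual".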
Affiliation(s)
- Anita Kapun
- Institute of Agricultural Engineering, University of Hohenheim, Garbenstraße 9, 70599 Stuttgart, Germany; (F.A.); (E.G.)
7. Implementation of Computer-Vision-Based Farrowing Prediction in Pens with Temporary Sow Confinement. Vet Sci 2023; 10:109. PMID: 36851413; PMCID: PMC9966211; DOI: 10.3390/vetsci10020109.
Abstract
The adoption of temporary sow confinement could improve animal welfare during farrowing for both the sow and the piglets. An important challenge related to the implementation of temporary sow confinement is the optimal timing of confinement in crates, considering sow welfare and piglet survival. The objective of this study was to predict farrowing with computer vision techniques in order to optimize the timing of sow confinement. In total, 71 Austrian Large White and Landrace × Large White crossbred sows and four types of farrowing pens were included in the observational study. We applied the computer vision model You Only Look Once X (YOLOX) to detect sow locations, calculated the activity level of sows based on the detected locations, and detected changes in sow activity trends with Kalman filtering and a fixed-interval smoothing algorithm. With the YOLOX-large object detection model, nest-building behavior was found to begin a median of 12 h 51 min, and to end a median of 2 h 38 min, before the onset of farrowing. It was possible to predict farrowing for 29 out of 44 sows. The developed method could reduce the labor costs otherwise required for the regular checking of sows in farrowing compartments.
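The trend-extraction step (Kalman filtering followed by fixed-interval smoothing) can be illustrated for a scalar random-walk model of hourly activity. This is a sketch under simple assumptions (random-walk state, Gaussian noise, known variances `q` and `r`), not the study's exact implementation:

```python
def kalman_smooth(z, q=0.01, r=1.0):
    """Filter, then fixed-interval (RTS) smooth, a 1-D activity series z.

    State model: x_t = x_{t-1} + w, Var(w) = q (slowly drifting trend).
    Observation:  z_t = x_t + v,    Var(v) = r (frame-to-frame noise).
    Returns the smoothed trend estimates.
    """
    n = len(z)
    m_f, p_f = [z[0]], [r]        # filtered mean/variance, init on first obs
    m_pred, p_pred = [z[0]], [r]  # one-step predictions, index-aligned
    for t in range(1, n):
        mp, pp = m_f[-1], p_f[-1] + q        # predict
        k = pp / (pp + r)                    # Kalman gain
        m_f.append(mp + k * (z[t] - mp))     # update with observation
        p_f.append((1.0 - k) * pp)
        m_pred.append(mp)
        p_pred.append(pp)
    m_s, p_s = m_f[:], p_f[:]                # RTS backward pass
    for t in range(n - 2, -1, -1):
        c = p_f[t] / p_pred[t + 1]
        m_s[t] = m_f[t] + c * (m_s[t + 1] - m_pred[t + 1])
        p_s[t] = p_f[t] + c * c * (p_s[t + 1] - p_pred[t + 1])
    return m_s
```

A sustained rise of the smoothed trend above its recent baseline would then flag the onset of nest-building, and the later fall its end.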
8. WATB: Wild Animal Tracking Benchmark. Int J Comput Vis 2022. DOI: 10.1007/s11263-022-01732-3.
9. Instance segmentation and tracking of animals in wildlife videos: SWIFT - segmentation with filtering of tracklets. Ecol Inform 2022. DOI: 10.1016/j.ecoinf.2022.101794.
10. Wang S, Jiang H, Qiao Y, Jiang S, Lin H, Sun Q. The Research Progress of Vision-Based Artificial Intelligence in Smart Pig Farming. Sensors (Basel) 2022; 22:6541. PMID: 36080994; PMCID: PMC9460267; DOI: 10.3390/s22176541.
Abstract
Pork accounts for an important proportion of livestock products. Pig farming requires a great deal of manpower, material resources, and time to monitor pig health and welfare. As the number of pigs per farm increases, the continued use of traditional monitoring methods may cause stress and harm to pigs and farmers and affect pig health and welfare as well as the economic output of farming. Against this background, the application of artificial intelligence has become a core part of smart pig farming. Precision pig farming systems use sensors such as cameras and radio frequency identification to monitor biometric information such as pig sounds and pig behavior in real time and convert it into key indicators of pig health and welfare. By analyzing these key indicators, problems in pig health and welfare can be detected early, and timely intervention and treatment can be provided, which helps to improve the production and economic efficiency of pig farming. This paper reviews more than 150 papers on precision pig farming and summarizes and evaluates the application of artificial intelligence technologies to pig detection, tracking, behavior recognition, and sound recognition. Finally, we summarize and discuss the opportunities and challenges of precision pig farming.
Affiliation(s)
- Shunli Wang
- College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Honghua Jiang
- College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Yongliang Qiao
- Australian Centre for Field Robotics (ACFR), Faculty of Engineering, The University of Sydney, Sydney, NSW 2006, Australia
- Correspondence: Yongliang Qiao
- Shuzhen Jiang
- College of Animal Science and Veterinary Medicine, Shandong Agricultural University, Tai’an 271018, China
- Huaiqin Lin
- College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Qian Sun
- College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
11. van der Zande LE, Guzhva O, Parois S, van de Leemput IA, Bolhuis JE, Rodenburg TB. Estimation of Resilience Parameters Following LPS Injection Based on Activity Measured With Computer Vision. Front Anim Sci 2022. DOI: 10.3389/fanim.2022.883940.
Abstract
Resilience can be defined as an animal’s ability to successfully adapt to a challenge, typically displayed by a quick return to initial metabolic or activity levels and behaviors. Pigs have distinct diurnal activity patterns, and deviations from these patterns could potentially be utilized to quantify resilience. However, human observation of activity is labor-intensive and not feasible in practice on a large scale. In this study, we show the use of a computer vision tracking algorithm to quantify resilience based on individual activity patterns following a lipopolysaccharide (LPS) challenge, which induced a sickness response. We followed 121 individual pigs, housed in barren or enriched housing systems, for eight days, as previous work suggests an impact of housing on resilience. The enriched housing consisted of delayed weaning in a group farrowing system, extra space compared with the barren pens, and environmental enrichment. Enriched-housed pigs were more active pre-injection of LPS, especially during peak activity times, than barren-housed pigs (49.4 ± 9.9 vs. 39.1 ± 5.0 meters/hour). Four pigs per pen received an LPS injection and two pigs a saline injection. LPS-injected animals were more likely to show a dip in activity than controls (86% vs. 17%). The duration and area under the curve (AUC) of the dip were not affected by housing. However, pigs with the same AUC could have a long, shallow dip or a steep, short dip; therefore, the AUC:duration ratio was calculated, and enriched-housed pigs had a higher AUC:duration ratio than barren-housed pigs (9244.1 ± 5429.8 vs. 5919.6 ± 4566.1). Enriched-housed pigs might therefore have a different strategy for coping with an LPS sickness challenge. More research on this strategy, and on the use of activity to quantify resilience and its relationship to physiological parameters, is needed.
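The dip statistics described above (AUC, duration, and their ratio) can be computed directly from hourly activity and a pre-challenge baseline. A minimal sketch under the assumption that the dip is measured as the hourly deficit below baseline (not the authors' code):

```python
def dip_metrics(baseline, activity):
    """Quantify an activity dip after a challenge.

    baseline, activity: per-hour distance moved (meters/hour), same length,
    where baseline is the expected pre-challenge diurnal level.
    Returns (auc, duration, auc_per_hour): AUC accumulates the activity
    deficit, duration counts the hours spent below baseline.
    """
    deficits = [b - a for b, a in zip(baseline, activity) if a < b]
    auc = sum(deficits)
    duration = len(deficits)
    ratio = auc / duration if duration else 0.0
    return auc, duration, ratio
```

The ratio separates dips that the raw AUC cannot: a steep, short dip scores higher than a long, shallow one of equal AUC, which is exactly the distinction the AUC:duration ratio was introduced to capture.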
12. Giersberg MF, Meijboom FLB. Caught on Camera: On the Need of Responsible Use of Video Observation for Animal Behavior and Welfare Research. Front Vet Sci 2022; 9:864677. PMID: 35548048; PMCID: PMC9082409; DOI: 10.3389/fvets.2022.864677.
Abstract
Video analysis is a popular and frequently used tool in animal behavior and welfare research. In addition to the actual object of research, video recordings often provide unforeseen information about the progress of the study, the animals, or the people involved. Conflicts can arise when this information is weighed against the original intention of the recordings and broader social expectations. Uncertainty may prevent video observers, often less experienced researchers, from properly addressing these conflicts, which can pose a threat to animal welfare and to research quality and integrity. In this article, we aim to raise awareness of the interrelationship of variables characteristic of video-based animal studies and the potential conflicts emerging from it. We propose stepping stones for a framework that enables a culture of openness in dealing with unexpected and unintended events observed during video analysis. As a basis, a frame of reference regarding privacy and the duty of care toward animals should be created and shared with all persons involved. At this stage, expectations and responsibilities need to be made explicit. During the running and reporting of the study, the risk of animal welfare and research integrity issues can be mitigated by making conflicts discussible and offering realistic opportunities for dealing with them. A practice that is outlined and guided by conversation will prevent a mere compliance-based approach centered on checklists and decision trees. Based on these stepping stones, educational material can be produced to foster reflection, co-creation, and the application of ethical practice.
13. Bonneau M, Godard X, Bambou JC. Assessing Goats' Fecal Avoidance Using Image Analysis-Based Monitoring. Front Anim Sci 2022. DOI: 10.3389/fanim.2022.835516.
Abstract
Recent advances in sensor technologies and data analysis could improve our capacity to acquire long-term, individual datasets on animal behavior. In livestock management, this is particularly interesting when behavioral data can be linked to production performance and physiological or genetic information, with the objective of improving animal health and welfare management. In this study, we propose a framework, based on computer vision and deep learning, to automatically estimate animal location within pasture, and we discuss the relationship with the risk of gastrointestinal nematode (GIN) infection. We illustrate our framework with the monitoring of goats allowed to graze an experimental plot where feces containing GIN infective larvae had previously been dropped in delimited areas. Four animals were monitored during two grazing weeks on the same pasture (week 1 from April 12 to 19, 2021, and week 2 from June 28 to July 5, 2021). Using the monitoring framework, different components of animal behavior were analyzed and the relationship with the risk of GIN infection was explored. First, on average, 87.95% of the goats were detected; the detected individuals were identified with an average sensitivity of 94.9% and an average precision of 94.8%. Second, monitoring of the animals' ability to avoid infected feces on pasture showed important temporal and individual variability. Interestingly, the avoidance behavior of three animals increased during the second grazing week (Wilcoxon rank sum, p-value < 0.05), and the level of increase was correlated with the level of infection during week 1 (Pearson's correlation coefficient = 0.9). The relationship between the time spent on GIN-infested areas and the level of infection was also studied, but no clear relationship was found. In conclusion, given the low number of animals studied, the biological results should be interpreted with caution; nevertheless, the framework provided here is a relevant new tool for exploring the relationship between ruminant behavior and GIN parasitism in experimental studies.
14. Hansen MF, Oparaeke A, Gallagher R, Karimi A, Tariq F, Smith ML. Towards Machine Vision for Insect Welfare Monitoring and Behavioural Insights. Front Vet Sci 2022; 9:835529. PMID: 35242842; PMCID: PMC8886630; DOI: 10.3389/fvets.2022.835529.
Abstract
Machine vision has demonstrated its usefulness in the livestock industry in terms of improving welfare in such areas as lameness detection and body condition scoring in dairy cattle. In this article, we present some promising results of applying state-of-the-art object detection and classification techniques to insects, specifically the Black Soldier Fly (BSF) and the domestic cricket, with a view to enabling automated processing for insect farming. We also present the low-cost "Insecto" Internet of Things (IoT) device, which provides environmental condition monitoring for temperature, humidity, CO2, air pressure, and volatile organic compound levels together with high-resolution image capture. We show that we are able to accurately count and measure the size of BSF larvae and to classify the sex of domestic crickets by detecting the presence of the ovipositor. These early results point to future work on enabling automation in the selection of desirable phenotypes for subsequent generations and on providing early alerts should environmental conditions deviate from desired values.
Collapse
Affiliation(s)
- Mark F. Hansen
- The Centre for Machine Vision, Bristol Robotics Laboratory, UWE Bristol, Bristol, United Kingdom
- *Correspondence: Mark F. Hansen
- Ryan Gallagher
- The Centre for Machine Vision, Bristol Robotics Laboratory, UWE Bristol, Bristol, United Kingdom
- Amir Karimi
- The Centre for Machine Vision, Bristol Robotics Laboratory, UWE Bristol, Bristol, United Kingdom
- Melvyn L. Smith
- The Centre for Machine Vision, Bristol Robotics Laboratory, UWE Bristol, Bristol, United Kingdom
15
Bonneau M, Poullet N, Beramice D, Dantec L, Canario L, Gourdine JL. Behavior Comparison During Chronic Heat Stress in Large White and Creole Pigs Using Image-Analysis. Front Anim Sci 2021. [DOI: 10.3389/fanim.2021.784376] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Indexed: 11/13/2022] Open
Abstract
Behavior is a good indicator of animal welfare, especially in challenging environments. However, few studies have investigated how pig behavior changes during heat stress. The current study is a proof of concept using Convolutional Neural Network (CNN) models to monitor pig behavior in order to investigate differences in the behavioral response to heat stress of two contrasting breeds: Large White (LW), selected for high performance, and Creole (CR), adapted to tropical conditions. A total of 6 slaughter pigs (3 CR and 3 LW; 22 weeks of age) were monitored from 8:30 to 17:30 over 54 days. Two CNN architectures were used to detect the animals (Yolo v2) and to estimate their posture (GoogleNet). Postures estimated by the neural network showed that pigs spent more time lying on their side as temperature increased. Comparing the two breeds, CR pigs spent more time lying on their side than LW pigs as temperature increased, suggesting that they use this posture to enhance thermoregulation and dissipate heat more efficiently. This study demonstrates that neural network models are an efficient tool for monitoring animal behavior in an automated way, which could be particularly relevant for characterizing breed adaptation to challenging environments.
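The comparison described above ultimately rests on aggregating per-frame posture labels (detector output followed by the posture classifier) into time budgets that can be related to temperature and breed. A minimal sketch of that aggregation step follows; the posture label names and the one-label-per-frame format are illustrative assumptions, not the study's actual data schema.

```python
from collections import Counter

def posture_time_budget(frame_labels):
    """frame_labels: one posture string per analysed frame.
    Returns the fraction of observed time spent in each posture."""
    counts = Counter(frame_labels)
    total = sum(counts.values())
    return {posture: n / total for posture, n in counts.items()}

# Mock observation block: 30 frames on the side, 20 on the belly, 10 standing.
day = ["lying_side"] * 30 + ["lying_belly"] * 20 + ["standing"] * 10
budget = posture_time_budget(day)
print(budget["lying_side"])  # 0.5
```

Computing one such budget per animal per day, then regressing the lying-on-side fraction against ambient temperature with breed as a factor, reproduces the shape of the analysis the abstract describes.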
16
Qiao Y, Clark C, Lomax S, Kong H, Su D, Sukkarieh S. Automated Individual Cattle Identification Using Video Data: A Unified Deep Learning Architecture Approach. Front Anim Sci 2021. [DOI: 10.3389/fanim.2021.759147] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Indexed: 01/09/2023] Open
Abstract
Individual cattle identification is a prerequisite and foundation for precision livestock farming. Existing methods for cattle identification require radio-frequency or visual ear tags, all of which are prone to loss or damage. Here, we propose and implement a new unified deep learning approach to cattle identification using video analysis. The proposed deep learning framework is composed of a Convolutional Neural Network (CNN) and a Bidirectional Long Short-Term Memory (BiLSTM) network with a self-attention mechanism. More specifically, the Inception-V3 CNN was used to extract features from a dataset of cattle videos taken from the rear in a feedlot. Extracted features were then fed to a BiLSTM layer to capture spatio-temporal information, and self-attention was employed to weight the features captured by the BiLSTM for the final step of cattle identification. We used a total of 363 rear-view videos from 50 cattle, collected at three different times with an interval of 1 month between data collection periods. The proposed method achieved 93.3% identification accuracy using a 30-frame video length, outperforming current state-of-the-art methods (Inception-V3, MLP, SimpleRNN, LSTM, and BiLSTM). Furthermore, two different attention schemes, namely additive and multiplicative attention mechanisms, were compared. Our results show that the additive attention mechanism achieved 93.3% accuracy and 91.0% recall, exceeding the multiplicative attention mechanism's 90.7% accuracy and 87.0% recall. Video length also affected accuracy, with sequences of up to 30 frames enhancing identification performance. Overall, our approach can capture key spatio-temporal features to improve cattle identification accuracy, enabling automated cattle identification for precision livestock farming.
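The additive (Bahdanau-style) attention that the abstract reports as the stronger pooling scheme scores each timestep of the BiLSTM output, softmax-normalizes the scores, and takes a weighted sum of the states. A toy pure-Python sketch of that step is shown below; the tiny dimensions and per-dimension weight vectors are illustrative stand-ins, not the trained model's parameters.

```python
import math

def additive_attention(states, w, v):
    """states: list of per-timestep feature vectors (e.g. BiLSTM outputs).
    score_t = v . tanh(w * h_t), softmax over t, then a weighted sum."""
    scores = [sum(vi * math.tanh(wi * hi) for vi, wi, hi in zip(v, w, h))
              for h in states]
    m = max(scores)  # subtract the max for numerical stability
    exp_scores = [math.exp(s - m) for s in scores]
    total = sum(exp_scores)
    weights = [e / total for e in exp_scores]
    context = [sum(a * h[i] for a, h in zip(weights, states))
               for i in range(len(states[0]))]
    return weights, context

# Three toy 2-D "BiLSTM outputs"; the last one should attract the most weight.
states = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
weights, context = additive_attention(states, w=[1.0, 1.0], v=[0.5, 0.5])
```

The multiplicative variant the paper compares against would replace the `tanh`-based score with a dot product between the state and a learned query vector; everything after the scoring step is identical.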
17
Detecting Animal Contacts-A Deep Learning-Based Pig Detection and Tracking Approach for the Quantification of Social Contacts. Sensors (Basel) 2021; 21:7512. [PMID: 34833588 PMCID: PMC8619108 DOI: 10.3390/s21227512] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Received: 10/18/2021] [Revised: 11/02/2021] [Accepted: 11/10/2021] [Indexed: 11/25/2022]
Abstract
The identification of social interactions is of fundamental importance for animal behavioral studies, addressing problems such as investigating the influence of social hierarchical structures or the drivers of agonistic behavioral disorders. However, most previous studies rely on the manual determination of the number and types of social encounters by direct observation, which requires a large amount of personnel and economic effort. To overcome this limitation, increase research efficiency and, thus, contribute to animal welfare in the long term, we propose in this study a framework for the automated identification of social contacts. In this framework, we apply a convolutional neural network (CNN) to detect the location and orientation of pigs within a video and track their movement trajectories over a period of time using a Kalman filter (KF) algorithm. Based on the tracking information, we automatically identify social contacts in the form of head–head and head–tail contacts. Moreover, by using the individual animal IDs, we construct a network of social contacts as the final output. We evaluated the performance of our framework on two distinct test sets for pig detection and tracking, achieving a Sensitivity, Precision, and F1-score of 94.2%, 95.4%, and 95.1%, respectively, and a MOTA score of 94.4%. The findings of this study demonstrate the effectiveness of our keypoint-based tracking-by-detection strategy, which can be applied to enhance animal monitoring systems.
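The contact-identification step described above can be sketched as a per-frame distance check between tracked head and tail keypoints. The keypoint format, pixel threshold, and ID scheme below are assumptions for illustration; the paper's actual geometry (e.g. orientation-aware contact rules) may differ.

```python
import math

CONTACT_DIST = 30.0  # pixels; assumed threshold, would be calibrated per camera

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def find_contacts(pigs):
    """pigs: dict id -> {"head": (x, y), "tail": (x, y)} for one frame.
    Returns (id_a, id_b, kind) tuples; for head-tail, id_a is the
    animal whose head makes the contact."""
    contacts = []
    ids = sorted(pigs)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            if dist(pigs[a]["head"], pigs[b]["head"]) <= CONTACT_DIST:
                contacts.append((a, b, "head-head"))
            if dist(pigs[a]["head"], pigs[b]["tail"]) <= CONTACT_DIST:
                contacts.append((a, b, "head-tail"))
            if dist(pigs[a]["tail"], pigs[b]["head"]) <= CONTACT_DIST:
                contacts.append((b, a, "head-tail"))
    return contacts

frame = {1: {"head": (100, 100), "tail": (160, 100)},
         2: {"head": (120, 110), "tail": (60, 200)}}
print(find_contacts(frame))  # [(1, 2, 'head-head')]
```

Accumulating these per-frame tuples over the tracked sequence, keyed by the persistent animal IDs from the Kalman-filter tracker, yields the edge list of the social contact network that the framework outputs.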
18
The Application of Cameras in Precision Pig Farming: An Overview for Swine-Keeping Professionals. Animals (Basel) 2021; 11:2343. [PMID: 34438800 PMCID: PMC8388688 DOI: 10.3390/ani11082343] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Received: 06/07/2021] [Revised: 07/19/2021] [Accepted: 08/06/2021] [Indexed: 01/06/2023] Open
Abstract
Simple Summary
The preeminent purpose of precision livestock farming (PLF) is to provide affordable and straightforward solutions to serious problems with certainty. Some data collection techniques in PLF, such as RFID, are accurate but not affordable for small- and medium-sized farms. Camera sensors, on the other hand, are cheap, commonly available, and easy to use for collecting information compared with other sensor systems in precision pig farming. Cameras offer ample opportunity to monitor pigs with high precision at an affordable cost. However, the lack of targeted information about the application of cameras in the pig industry is a shortcoming for swine farmers and researchers. This review describes the state of the art in 3D imaging systems (i.e., depth sensors and time-of-flight cameras), along with 2D cameras, for effectively identifying pig behaviors, and presents automated approaches for monitoring and investigating pigs’ feeding, drinking, lying, locomotion, aggressive, and reproductive behaviors. In addition, the review summarizes the related literature and points out limitations to open up new dimensions for future researchers to explore.
Abstract
Pork is the meat with the second-largest overall consumption, and chicken, pork, and beef together account for 92% of global meat production. Therefore, it is necessary to adopt more progressive methodologies, such as precision livestock farming (PLF), rather than conventional methods to improve production. In recent years, image-based studies have become an efficient solution in various fields such as navigation for unmanned vehicles, human–machine-based systems, agricultural surveying, and livestock. So far, several studies have been conducted to identify, track, and classify the behaviors of pigs and achieve early detection of disease using 2D/3D cameras.
This review describes the state of the art in 3D imaging systems (i.e., depth sensors and time-of-flight cameras), along with 2D cameras, for effectively identifying pig behaviors and presents automated approaches for the monitoring and investigation of pigs’ feeding, drinking, lying, locomotion, aggressive, and reproductive behaviors.