1. García-Vázquez FA. Artificial intelligence and porcine breeding. Anim Reprod Sci 2024;269:107538. PMID: 38926001. DOI: 10.1016/j.anireprosci.2024.107538.
Abstract
Livestock management is evolving into a new era characterized by the analysis of vast quantities of data (Big Data) collected from both traditional breeding methods and new technologies such as sensors, automated monitoring systems, and advanced analytics. Artificial intelligence (A-In), the capability of machines to mimic human intelligence, including subfields such as machine learning and deep learning, is playing a pivotal role in this transformation. A wide array of A-In techniques, already employed successfully in industrial and scientific contexts, is now being integrated into mainstream livestock management practices. In swine breeding, traditional methods have yielded considerable success, but the growing volume of information calls for new technologies such as A-In to drive productivity, enhance animal welfare, and reduce environmental impact. Current findings suggest that these techniques can match or exceed the performance of traditional methods, often with greater scalability in terms of efficiency and sustainability within the breeding industry. This review provides insights into the application of A-In in porcine breeding from the perspectives of both sows (including welfare and reproductive management) and boars (including semen quality and health), and explores new approaches already being applied in other species.
Affiliation(s)
- Francisco A García-Vázquez
- Departamento de Fisiología, Facultad de Veterinaria, Campus de Excelencia Mare Nostrum, Universidad de Murcia, Murcia 30100, Spain; Instituto Murciano de Investigación Biosanitaria (IMIB-Arrixaca), Murcia, Spain.
2. Sonalio K, Boyen F, Devriendt B, Chantziaras I, Beuckelaere L, Biebaut E, Haesebrouck F, Santamarta I, de Oliveira LG, Maes D. Rationalizing the use of common parameters and technological tools to follow up Mycoplasma hyopneumoniae infections in pigs. Porcine Health Manag 2024;10:31. PMID: 39180129. PMCID: PMC11342468. DOI: 10.1186/s40813-024-00381-x.
Abstract
BACKGROUND: Mycoplasma (M.) hyopneumoniae is associated with respiratory disease in pigs and is the primary agent of enzootic pneumonia. Quantification of M. hyopneumoniae-related outcome parameters can be difficult, expensive, and time-consuming in both research and field settings. In addition to well-established methods, technological tools are becoming available to monitor relevant animal- and environment-related features, often in real time. This study therefore assessed whether parameters such as animal movement and body temperature measured using microchips (IMT) correlate with established parameters, and whether the currently used parameters can be rationalized.
RESULTS: The percentage of movement was significantly reduced by M. hyopneumoniae infection (p < 0.05): the infected group moved 1.9% of the time versus 6.9% in the negative control group. In contrast, macroscopic (MLCL) and microscopic (MLL) lung lesions, respiratory disease score (RDS), M. hyopneumoniae DNA load, and anti-M. hyopneumoniae antibody levels increased significantly in the infected group 28 days post-inoculation (p < 0.05). Moderate (r > 0.30) to very strong (r > 0.80) correlations were observed among these parameters (p < 0.05), except for IMT. A significant, moderate correlation was found between IMT and rectal temperature (r = 0.49; p < 0.05). Lastly, average daily weight gain and the percentage of air in the lung were not affected by infection (p > 0.05).
CONCLUSIONS: M. hyopneumoniae infection significantly reduced piglet movement and increased lung lesions, M. hyopneumoniae DNA load, and anti-M. hyopneumoniae antibody levels, and good correlations were observed between most parameters, indicating a direct relationship between them. Changes in movement might therefore be a reliable indicator of M. hyopneumoniae infection in pigs, and a selected group of parameters (RDS, MLCL, MLL, M. hyopneumoniae DNA load, anti-M. hyopneumoniae antibody levels, and movement) is optimal to assess M. hyopneumoniae infection under experimental conditions.
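The correlation analysis behind these results can be reproduced with a plain Pearson coefficient. The sketch below uses hypothetical microchip-vs-rectal temperature readings, and the verbal strength labels simply mirror the r-ranges quoted in the abstract; none of the numbers are from the study.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def strength(r):
    """Rough verbal labels mirroring the r-ranges quoted in the abstract."""
    r = abs(r)
    if r > 0.80:
        return "very strong"
    if r > 0.30:
        return "moderate"
    return "weak"

# Hypothetical paired readings: microchip (IMT) vs rectal temperature (degrees C)
imt = [39.1, 39.4, 40.0, 39.2, 40.3, 39.8]
rectal = [39.3, 39.5, 40.2, 39.1, 40.6, 39.7]
r = pearson_r(imt, rectal)
label = strength(r)
```

With paired temperature series like these, a single scalar r plus a verbal label is enough to reproduce the kind of statement the abstract makes (e.g. "a moderate correlation of r = 0.49").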
Affiliation(s)
- Karina Sonalio
- Department of Internal Medicine, Reproduction and Population Medicine, Faculty of Veterinary Medicine, Ghent University, Merelbeke, Belgium.
- Department of Veterinary Clinic and Surgery, School of Agricultural and Veterinarian Sciences, São Paulo State University (Unesp), Jaboticabal, Brazil.
- Filip Boyen
- Department of Pathobiology, Pharmacology and Zoological Medicine, Faculty of Veterinary Medicine, Ghent University, Merelbeke, Belgium.
- Bert Devriendt
- Department of Translational Physiology, Infectiology and Public Health, Faculty of Veterinary Medicine, Ghent University, Merelbeke, Belgium.
- Ilias Chantziaras
- Department of Internal Medicine, Reproduction and Population Medicine, Faculty of Veterinary Medicine, Ghent University, Merelbeke, Belgium.
- Lisa Beuckelaere
- Department of Internal Medicine, Reproduction and Population Medicine, Faculty of Veterinary Medicine, Ghent University, Merelbeke, Belgium.
- Evelien Biebaut
- Department of Internal Medicine, Reproduction and Population Medicine, Faculty of Veterinary Medicine, Ghent University, Merelbeke, Belgium.
- Freddy Haesebrouck
- Department of Pathobiology, Pharmacology and Zoological Medicine, Faculty of Veterinary Medicine, Ghent University, Merelbeke, Belgium.
- Luís Guilherme de Oliveira
- Department of Veterinary Clinic and Surgery, School of Agricultural and Veterinarian Sciences, São Paulo State University (Unesp), Jaboticabal, Brazil.
- Dominiek Maes
- Department of Internal Medicine, Reproduction and Population Medicine, Faculty of Veterinary Medicine, Ghent University, Merelbeke, Belgium.
3. Knoll M, Gygax L, Hillmann E. Pinpointing pigs: performance and challenges of an ultra-wideband real-time location system for tracking growing-finishing pigs under practical conditions. Animal 2024;18:101163. PMID: 38744229. DOI: 10.1016/j.animal.2024.101163.
Abstract
Real-time location systems (RTLSs) are promising precision livestock farming tools and have been employed in behavioural studies across various farm animal species. However, their application in research with fattening pigs is so far unexplored. These systems have great potential to give insight into pigs' spatial behaviour, such as the use of functional areas and pigs' proximity to each other as an indicator of social relationships. The aim of this study was therefore to validate the accuracy, precision, and data quality of the commercial TrackLab system (Noldus Information Technology BV). We conducted different measurement sets: first, we performed static measurements in 12 pens, at four locations per pen and three heights per location, using a single ultra-wideband (UWB) tag, recording unfiltered x- and y-coordinates at 1 Hz. We repeated these measurements with six tags aligned in a 2 × 3 grid with varied spacing to test interference between tags. We also tested dynamic performance by moving the tags along the centre line of the pens. Finally, we measured data quality with 55 growing pigs in six pens, including the identification of location 'jumps' from the inside to the outside of the pen. Each pen housed ten animals fitted with a UWB tag attached to their farm ear tag. We collected data for 10 days and analysed seven 24-h periods of raw and filtered data. The mean accuracy of the RTLS measurements was 0.53 m (precision: 0.14 m) for single tags and 0.46 m (precision: 0.07 m) for grouped tags. Accuracy improved with increasing measurement height for single tags but less clearly for grouped tags (P [height single] = 0.01; P [height grouped] = 0.22). When tags were fitted to animals, 63.3% of the filtered data was lost and 21.8% of the filtered location estimates fell outside the pens. Altogether, the TrackLab system was capable of fairly accurate and precise assignment of the functional areas where individual animals were located, but was insufficient for the analysis of social relationships. Furthermore, frequent gaps in signal transmission and the overall high data loss presented significant limitations. The challenging hardware requirements for attaching sensors to the animals underline the need for further technological advances before RTLSs can be applied to growing-finishing pigs.
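Accuracy and precision of static RTLS fixes, as validated in this study, are commonly computed as the mean Euclidean distance to the surveyed tag position and the spread of the fixes around their own centroid. The sketch below uses those common definitions and made-up fixes; the paper itself may define precision differently.

```python
import numpy as np

def accuracy_precision(estimates, true_xy):
    """
    estimates: (n, 2) array of x/y fixes from the RTLS for one static tag.
    true_xy:   the surveyed ground-truth position of that tag.
    Accuracy  = mean Euclidean distance of the fixes from the true point.
    Precision = mean distance of the fixes from their own centroid
                (spread of the estimates, independent of systematic bias).
    """
    est = np.asarray(estimates, dtype=float)
    errors = np.linalg.norm(est - np.asarray(true_xy, dtype=float), axis=1)
    spread = np.linalg.norm(est - est.mean(axis=0), axis=1)
    return float(errors.mean()), float(spread.mean())

# Hypothetical 1 Hz fixes for a tag surveyed at (2.0, 3.0) m
fixes = np.array([[2.4, 3.1], [2.5, 2.9], [2.6, 3.2], [2.5, 3.0]])
acc, prec = accuracy_precision(fixes, (2.0, 3.0))
```

A biased but consistent system (as here, where all fixes sit to one side of the true point) shows up as accuracy much worse than precision, matching the pattern reported for the 0.53 m / 0.14 m single-tag figures.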
Affiliation(s)
- M Knoll
- Humboldt-Universität zu Berlin, Department of Life Sciences, Albrecht Daniel Thaer Institute of Agricultural and Horticultural Sciences, Animal Husbandry and Ethology, Unter den Linden 6, 10099 Berlin, Germany.
- L Gygax
- Humboldt-Universität zu Berlin, Department of Life Sciences, Albrecht Daniel Thaer Institute of Agricultural and Horticultural Sciences, Animal Husbandry and Ethology, Unter den Linden 6, 10099 Berlin, Germany.
- E Hillmann
- Humboldt-Universität zu Berlin, Department of Life Sciences, Albrecht Daniel Thaer Institute of Agricultural and Horticultural Sciences, Animal Husbandry and Ethology, Unter den Linden 6, 10099 Berlin, Germany.
4. Sharifuzzaman M, Mun HS, Ampode KMB, Lagua EB, Park HR, Kim YH, Hasan MK, Yang CJ. Technological Tools and Artificial Intelligence in Estrus Detection of Sows: A Comprehensive Review. Animals (Basel) 2024;14:471. PMID: 38338113. PMCID: PMC10854728. DOI: 10.3390/ani14030471.
Abstract
In animal farming, timely estrus detection and prediction of the best moment for insemination are crucial. Traditional sow estrus detection depends on the expertise of a farm attendant, which can be inconsistent, time-consuming, and labor-intensive. Researchers have explored developing and implementing technological tools to detect estrus. The objective of this review is to assess the automatic estrus-recognition methods in operation for sows and to point out their strengths and weaknesses to assist in developing new and improved detection systems. Real-time methods using body and vulvar temperature, posture recognition, and activity measurements show higher precision. Incorporating artificial intelligence with multiple estrus-related parameters is expected to enhance accuracy further. Development of new systems relies mostly on improved algorithms and accurate input data. Future systems should be designed to minimize the misclassification rate so that better detection is achieved.
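As a toy illustration of fusing multiple estrus-related parameters, the sketch below combines three hypothetical signals (vulvar temperature rise, activity increase, and a detected standing reflex) into one score. All thresholds and weights are invented for illustration; this is not a validated detector from any of the reviewed systems.

```python
def estrus_score(vulva_temp_delta, activity_ratio, standing_reflex,
                 w_temp=0.4, w_act=0.4, w_reflex=0.2):
    """
    Toy fusion of three estrus-related signals into one [0, 1] score.
    vulva_temp_delta: rise of vulvar skin temperature over the sow's own
                      baseline, in degrees C (thermal camera or sensor).
    activity_ratio:   current activity divided by baseline activity.
    standing_reflex:  True if posture recognition detected standing heat.
    Thresholds and weights are illustrative, not validated values.
    """
    temp_signal = min(max(vulva_temp_delta / 1.0, 0.0), 1.0)       # saturates at +1 C
    act_signal = min(max((activity_ratio - 1.0) / 0.5, 0.0), 1.0)  # saturates at +50 %
    reflex_signal = 1.0 if standing_reflex else 0.0
    return w_temp * temp_signal + w_act * act_signal + w_reflex * reflex_signal

# A sow showing a 0.8 C vulvar rise, 40 % more activity, and a standing reflex
score = estrus_score(0.8, 1.4, True)
alert = score >= 0.6   # flag for insemination-timing follow-up
```

The point of the sketch is the structure, not the numbers: each sensor contributes a normalized signal, and a weighted sum trades them off, which is where a learned model would replace the hand-set weights.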
Affiliation(s)
- Md Sharifuzzaman
- Animal Nutrition and Feed Science Laboratory, Department of Animal Science and Technology, Sunchon National University, Suncheon 57922, Republic of Korea
- Department of Animal Science and Veterinary Medicine, Bangabandhu Sheikh Mujibur Rahman Science and Technology University, Gopalganj 8100, Bangladesh
- Hong-Seok Mun
- Animal Nutrition and Feed Science Laboratory, Department of Animal Science and Technology, Sunchon National University, Suncheon 57922, Republic of Korea
- Department of Multimedia Engineering, Sunchon National University, Suncheon 57922, Republic of Korea
- Keiven Mark B. Ampode
- Animal Nutrition and Feed Science Laboratory, Department of Animal Science and Technology, Sunchon National University, Suncheon 57922, Republic of Korea
- Department of Animal Science, College of Agriculture, Sultan Kudarat State University, Tacurong 9800, Philippines
- Eddiemar B. Lagua
- Animal Nutrition and Feed Science Laboratory, Department of Animal Science and Technology, Sunchon National University, Suncheon 57922, Republic of Korea
- Interdisciplinary Program in IT-Bio Convergence System (BK21 Plus), Sunchon National University, Suncheon 57922, Republic of Korea
- Hae-Rang Park
- Animal Nutrition and Feed Science Laboratory, Department of Animal Science and Technology, Sunchon National University, Suncheon 57922, Republic of Korea
- Interdisciplinary Program in IT-Bio Convergence System (BK21 Plus), Sunchon National University, Suncheon 57922, Republic of Korea
- Young-Hwa Kim
- Interdisciplinary Program in IT-Bio Convergence System (BK21 Plus), Chonnam National University, Gwangju 61186, Republic of Korea
- Md Kamrul Hasan
- Animal Nutrition and Feed Science Laboratory, Department of Animal Science and Technology, Sunchon National University, Suncheon 57922, Republic of Korea
- Department of Poultry Science, Sylhet Agricultural University, Sylhet 3100, Bangladesh
- Chul-Ju Yang
- Animal Nutrition and Feed Science Laboratory, Department of Animal Science and Technology, Sunchon National University, Suncheon 57922, Republic of Korea
- Interdisciplinary Program in IT-Bio Convergence System (BK21 Plus), Sunchon National University, Suncheon 57922, Republic of Korea
5. Mora M, Piles M, David I, Rosa GJM. Integrating computer vision algorithms and RFID system for identification and tracking of group-housed animals: an example with pigs. J Anim Sci 2024;102:skae174. PMID: 38908015. PMCID: PMC11245691. DOI: 10.1093/jas/skae174.
Abstract
Precision livestock farming aims to monitor animal activity individually and automatically to ensure health, well-being, and productivity. Computer vision has emerged as a promising tool for this purpose. However, accurately tracking individuals from images remains challenging, especially in group housing where animals may look alike: close interaction or crowding can lead to lost or swapped animal IDs, compromising tracking accuracy. To address this challenge, we implemented a framework combining a tracking-by-detection method with a radio frequency identification (RFID) system. We tested this approach with twelve pigs in a single pen as an illustrative example. Three of the pigs had distinctive natural coat markings, enabling their visual identification within the group. The remaining pigs either shared similar coat color patterns or were entirely white, making them visually indistinguishable from each other. We employed the latest version of You Only Look Once (YOLOv8) for detection and BoT-SORT for tracking. YOLOv8 was fine-tuned on a dataset of 3,600 images to detect and classify the different pig classes, achieving a mean average precision of 99% across all classes. The fine-tuned YOLOv8 model and the BoT-SORT tracker were then applied to a 166.7-min video comprising 100,018 frames. Results showed that pigs with distinguishable coat markings could be tracked 91% of the time on average. For pigs with similar coat color, the RFID system identified individual animals when they entered the feeding station, and this RFID identification was linked to each pig's image trajectory, both backward and forward in time. The two pigs with similar markings could be tracked for an average of 48.6 min, and the seven white pigs for an average of 59.1 min. In all cases, the tracking time assigned to each pig matched the ground truth at least 90% of the time. Thus, the proposed framework enabled reliable tracking of group-housed pigs for extended periods, offering a promising alternative to using image or RFID approaches independently. This approach represents a significant step forward in combining multiple devices for animal identification, tracking, and traceability, particularly when homogeneous animals are kept in groups.
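The key linking step, propagating an RFID read at the feeding station backward and forward along an anonymous image trajectory, can be sketched as below. The feeder zone geometry, time tolerance, and one-segment-per-track representation are simplifying assumptions; real BoT-SORT output fragments into multiple track IDs per animal and needs stitching.

```python
def link_rfid_to_tracks(tracks, rfid_reads, feeder_zone, tol=1.0):
    """
    tracks:      {track_id: [(t, x, y), ...]} anonymous image trajectories.
    rfid_reads:  [(t, pig_id), ...] antenna reads at the feeding station.
    feeder_zone: (xmin, ymin, xmax, ymax) of the station in image coords.
    For each read, the track located inside the feeder zone at (about)
    that time inherits the pig's RFID identity for its whole trajectory,
    i.e. both backward and forward in time.
    """
    xmin, ymin, xmax, ymax = feeder_zone
    identity = {}
    for t_read, pig_id in rfid_reads:
        for track_id, points in tracks.items():
            for t, x, y in points:
                if abs(t - t_read) <= tol and xmin <= x <= xmax and ymin <= y <= ymax:
                    identity[track_id] = pig_id
                    break
    return identity

tracks = {
    7: [(0, 5.0, 5.0), (10, 2.1, 0.5), (20, 4.0, 4.0)],   # visits feeder at t=10
    8: [(0, 8.0, 8.0), (10, 8.5, 7.9), (20, 8.0, 8.2)],   # never near the feeder
}
ids = link_rfid_to_tracks(tracks, [(10, "pig_042")], (1.5, 0.0, 3.0, 1.0))
```

Once a track ID is mapped to an RFID identity, every frame of that track, before and after the feeder visit, is attributed to the same pig, which is what makes visually identical white pigs trackable.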
Affiliation(s)
- Mónica Mora
- Institute of Agrifood Research and Technology (IRTA) – Animal Breeding and Genetics, Barcelona 08140, Spain
- Department of Animal and Dairy Sciences, University of Wisconsin-Madison, Madison, WI 53706, USA
- Miriam Piles
- Institute of Agrifood Research and Technology (IRTA) – Animal Breeding and Genetics, Barcelona 08140, Spain
- Ingrid David
- GenPhySE, Université de Toulouse, INRAE, ENVT, Castanet Tolosan 31326, France
- Guilherme J M Rosa
- Department of Animal and Dairy Sciences, University of Wisconsin-Madison, Madison, WI 53706, USA
6. Reza MN, Ali MR, Samsuzzaman, Kabir MSN, Karim MR, Ahmed S, Kyoung H, Kim G, Chung SO. Thermal imaging and computer vision technologies for the enhancement of pig husbandry: a review. J Anim Sci Technol 2024;66:31-56. PMID: 38618025. PMCID: PMC11007457. DOI: 10.5187/jast.2024.e4.
Abstract
Pig farming, a vital industry, requires proactive measures for early disease detection and crush-symptom monitoring to ensure pig health and safety. This review explores advanced thermal sensing technologies and computer vision-based thermal imaging techniques employed for monitoring pig disease and piglet crushing on pig farms. Infrared thermography (IRT) is a non-invasive, efficient technology for measuring pig body temperature, offering non-destructive, long-distance, high-sensitivity measurements. Unlike traditional methods, IRT provides a quick, labor-saving way to acquire physiological data, which is influenced by environmental temperature and crucial for understanding pig physiology and metabolism. IRT aids early disease detection, respiratory health monitoring, and evaluation of vaccination effectiveness. A key challenge is that variation in body-surface emissivity affects measurement accuracy. Thermal imaging and deep learning algorithms are used for pig behavior recognition, with the dorsal plane effective for stress detection. Remote health monitoring through thermal imaging, deep learning, and wearable devices enables non-invasive assessment of pig health and minimizes medication use. The integration of advanced sensors, thermal imaging, and deep learning shows potential for disease detection and improvement in pig farming, but challenges and ethical considerations must be addressed for successful implementation. This review summarizes the state-of-the-art technologies used in the pig farming industry, including computer vision algorithms such as object detection, image segmentation, and deep learning techniques, and discusses the benefits and limitations of IRT, providing an overview of the current research field and valuable insights for researchers and farmers regarding IRT application in pig production.
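A typical first step in IRT screening is to take region-of-interest statistics from a radiometric frame. The sketch below assumes per-pixel temperatures already corrected for emissivity and distance upstream (the hard part, per the review), and the fever cut-off is illustrative only, not a clinical value.

```python
import numpy as np

FEVER_THRESHOLD_C = 40.0  # illustrative cut-off, not a validated clinical value

def roi_temperature(frame_c, roi):
    """
    frame_c: 2D array of per-pixel temperatures in degrees C from a
             radiometric IR camera (emissivity/distance correction assumed
             to have been applied upstream).
    roi:     (row0, col0, row1, col1) region covering e.g. the ear base.
    Returns (max, mean) temperature inside the ROI; the maximum is a
    common statistic for fever screening because it is robust to cool
    background pixels that fall inside the box.
    """
    r0, c0, r1, c1 = roi
    patch = np.asarray(frame_c, dtype=float)[r0:r1, c0:c1]
    return float(patch.max()), float(patch.mean())

# Hypothetical 4x4 thermal patch with one warm region
frame = np.array([
    [36.0, 36.5, 36.2, 35.9],
    [36.1, 40.4, 39.8, 36.0],
    [36.3, 39.9, 40.1, 36.2],
    [35.8, 36.0, 36.1, 35.9],
])
t_max, t_mean = roi_temperature(frame, (1, 1, 3, 3))
fever = t_max >= FEVER_THRESHOLD_C
```

In practice the ROI would come from a detector (e.g. an ear or eye region found by a vision model) rather than fixed coordinates, which is where the deep learning components discussed in the review come in.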
Affiliation(s)
- Md Nasim Reza
- Department of Smart Agricultural Systems, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Md Razob Ali
- Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Samsuzzaman
- Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Md Shaha Nur Kabir
- Department of Agricultural Industrial Engineering, Faculty of Engineering, Hajee Mohammad Danesh Science and Technology University, Dinajpur 5200, Bangladesh
- Md Rejaul Karim
- Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Farm Machinery and Post-harvest Processing Engineering Division, Bangladesh Agricultural Research Institute, Gazipur 1701, Bangladesh
- Shahriar Ahmed
- Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Hyunjin Kyoung
- Division of Animal and Dairy Science, Chungnam National University, Daejeon 34134, Korea
- Gookhwan Kim
- National Institute of Agricultural Sciences, Rural Development Administration, Jeonju 54875, Korea
- Sun-Ok Chung
- Department of Smart Agricultural Systems, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
7. Zhou H, Chung S, Kakar JK, Kim SC, Kim H. Pig Movement Estimation by Integrating Optical Flow with a Multi-Object Tracking Model. Sensors (Basel) 2023;23:9499. PMID: 38067875. PMCID: PMC10708576. DOI: 10.3390/s23239499.
Abstract
Pig husbandry is a significant segment of livestock farming, and porcine well-being is a paramount concern because of its direct implications for pig breeding and production. An easily observable proxy for pig health is the daily pattern of movement: more active pigs are usually healthier than inactive ones, so movement can alert farmers to a pig's deteriorating condition before it becomes life-threatening. However, conventional estimation of pig mobility relies largely on manual observation by farmers, which is impractical in contemporary centralized, large-scale pig farming operations. In response, multi-object tracking and pig behavior recognition methods have been adopted to monitor pig health and welfare closely. Unfortunately, these existing methods frequently fail to provide precise, quantified measurements of movement distance, yielding only a rudimentary metric of pig health. This paper proposes a novel approach that integrates optical flow with a multi-object tracking algorithm to gauge pig movement more accurately, based on both qualitative and quantitative analyses of the shortcomings of relying on tracking algorithms alone. Optical flow records precise motion between two consecutive frames, and the multi-object tracking algorithm provides an individual track for each pig; combining the two allows each pig's movement to be estimated accurately. Moreover, optical flow can discern partial movements, such as instances where only the pig's head moves while the rest of its body remains stationary. Experimental results show that the proposed method is superior to using tracking results (i.e., bounding boxes) alone: movement calculated from bounding boxes is easily distorted by box-size fluctuation, whereas optical flow avoids this drawback and provides more fine-grained motion information. The proposed method thus yields more accurate and comprehensive information, improving decision-making and management in pig farming.
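The core idea of combining per-pig tracks with dense optical flow can be sketched as the mean flow magnitude inside each tracked bounding box. The flow field and boxes below are toy inputs, not the paper's actual pipeline (which would compute flow with an algorithm such as Farnebäck or RAFT and take boxes from the tracker).

```python
import numpy as np

def per_pig_movement(flow, boxes):
    """
    flow:  (H, W, 2) dense optical-flow field between two consecutive
           frames, in pixels per frame (dx in channel 0, dy in channel 1).
    boxes: {pig_id: (x0, y0, x1, y1)} bounding boxes from the tracker.
    Returns the mean flow magnitude inside each pig's box: a per-pig
    movement estimate that also registers partial motion (e.g. head only),
    unlike the displacement of the box centre alone.
    """
    mag = np.linalg.norm(flow, axis=2)  # per-pixel motion magnitude
    return {
        pig: float(mag[y0:y1, x0:x1].mean())
        for pig, (x0, y0, x1, y1) in boxes.items()
    }

# Toy 8x8 field: pig 1's region shifts 3 px to the right, pig 2's is static
flow = np.zeros((8, 8, 2))
flow[0:4, 0:4, 0] = 3.0
movement = per_pig_movement(flow, {1: (0, 0, 4, 4), 2: (4, 4, 8, 8)})
```

Accumulating these per-frame values over a day gives a movement total that is insensitive to the box-size fluctuations that distort centre-displacement estimates.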
Affiliation(s)
- Heng Zhou
- Department of Electronics and Information Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Seyeon Chung
- Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Junaid Khan Kakar
- Department of Electronics and Information Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Sang Cheol Kim
- Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Hyongsuk Kim
- Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Department of Electronics Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea
8. An L, Ren J, Yu T, Hai T, Jia Y, Liu Y. Three-dimensional surface motion capture of multiple freely moving pigs using MAMMAL. Nat Commun 2023;14:7727. PMID: 38001106. PMCID: PMC10673844. DOI: 10.1038/s41467-023-43483-w.
Abstract
Understanding the three-dimensional social behaviors of freely moving large mammals is valuable for both agriculture and life science, yet challenging because of occlusions during close interactions. Existing animal pose estimation methods capture keypoint trajectories but ignore the deformable body surface, which contains geometric information essential for predicting social interactions and for handling occlusions. In this study, we develop a Multi-Animal Mesh Model Alignment (MAMMAL) system based on an articulated surface mesh model. The MAMMAL algorithms automatically align multi-view images to the mesh model and capture the 3D surface motions of multiple animals, performing better under severe occlusion than traditional triangulation and enabling complex social analysis. Using MAMMAL, we quantitatively analyze the locomotion, postures, animal-scene interactions, social interactions, and detailed tail motions of pigs. Experiments on mice and Beagle dogs further demonstrate the generalizability of MAMMAL across environments and mammal species.
Affiliation(s)
- Liang An
- Department of Automation, Tsinghua University, Beijing, China
- Jilong Ren
- State Key Laboratory of Stem Cell and Reproductive Biology, Institute of Zoology, Chinese Academy of Sciences, Beijing, China
- Beijing Farm Animal Research Center, Institute of Zoology, Chinese Academy of Sciences, Beijing, China
- Tao Yu
- Department of Automation, Tsinghua University, Beijing, China
- Beijing National Research Center for Information Science and Technology (BNRist), Tsinghua University, Beijing, China
- Tang Hai
- State Key Laboratory of Stem Cell and Reproductive Biology, Institute of Zoology, Chinese Academy of Sciences, Beijing, China
- Beijing Farm Animal Research Center, Institute of Zoology, Chinese Academy of Sciences, Beijing, China
- Yichang Jia
- School of Medicine, Tsinghua University, Beijing, China
- IDG/McGovern Institute for Brain Research at Tsinghua, Beijing, China
- Tsinghua Laboratory of Brain and Intelligence, Beijing, China
- Yebin Liu
- Department of Automation, Tsinghua University, Beijing, China
- Institute for Brain and Cognitive Sciences, Tsinghua University, Beijing, China
9. Voogt AM, Schrijver RS, Temürhan M, Bongers JH, Sijm DTHM. Opportunities for Regulatory Authorities to Assess Animal-Based Measures at the Slaughterhouse Using Sensor Technology and Artificial Intelligence: A Review. Animals (Basel) 2023;13:3028. PMID: 37835634. PMCID: PMC10571985. DOI: 10.3390/ani13193028.
Abstract
Animal-based measures (ABMs) are the preferred way to assess animal welfare, but manual scoring of ABMs during meat inspection is very time-consuming. Automatic scoring using sensor technology and artificial intelligence (AI) may offer a solution. Based on review papers, an overview was made of ABMs recorded at the slaughterhouse for poultry, pigs, and cattle, and of applications of sensor technology to measure the identified ABMs. Relevant legislation and work instructions of the Dutch Regulatory Authority (RA) were also screened for applied ABMs. Applications of sensor technology in a research setting, on farm, or at the slaughterhouse were reported for 10 of the 37 ABMs identified for poultry, 4 of 32 for cattle, and 13 of 41 for pigs. Several applications relate to aspects of meat inspection; however, under European law meat inspection must be performed by an official veterinarian, with exceptions for the post-mortem inspection of poultry. The examples in this study show that the RA has opportunities to use sensor technology to support inspection and to gain more insight into animal welfare risks. The lack of external validation for multiple commercially available systems remains a point of attention.
Affiliation(s)
- Annika M. Voogt
- Office for Risk Assessment & Research (BuRO), Netherlands Food and Consumer Product Safety Authority (NVWA), P.O. Box 43006, 3540 AA Utrecht, The Netherlands
10. Wang S, Jiang H, Qiao Y, Jiang S. A Method for Obtaining 3D Point Cloud Data by Combining 2D Image Segmentation and Depth Information of Pigs. Animals (Basel) 2023;13:2472. PMID: 37570282. PMCID: PMC10417003. DOI: 10.3390/ani13152472.
Abstract
This paper proposes a method for automatic pig detection and segmentation using RGB-D data for precision livestock farming. The proposed method combines an enhanced YOLOv5s model with the Res2Net bottleneck structure, improving fine-grained feature extraction and ultimately the precision of pig detection and segmentation in 2D images. The method also enables simpler and more efficient acquisition of 3D point cloud data of pigs, by combining the pig mask obtained in 2D detection and segmentation with depth information. To evaluate the effectiveness of the proposed method, two datasets were constructed: the first consists of 5400 images captured in various pig pens under diverse lighting conditions, while the second was the Edinburgh pig behaviour dataset from the UK. The experimental results demonstrated that the improved YOLOv5s_Res2Net achieved a mAP@0.5:0.95 of 89.6% and 84.8% for the pig detection and segmentation tasks, respectively, on our dataset, and a mAP@0.5:0.95 of 93.4% and 89.4% on the Edinburgh pig behaviour dataset. This approach provides valuable insights for improving pig management, conducting welfare assessments, and estimating weight accurately.
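The mask-plus-depth step described in this abstract can be sketched as a standard pinhole back-projection; the intrinsics (fx, fy, cx, cy), the toy depth map, and the mask below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def mask_to_point_cloud(depth_m, mask, fx, fy, cx, cy):
    """Back-project masked depth pixels into a 3D point cloud.

    depth_m: (H, W) depth map in metres; mask: (H, W) boolean
    segmentation mask for one pig; fx, fy, cx, cy: assumed pinhole
    camera intrinsics. Returns an (N, 3) array of XYZ points.
    """
    v, u = np.nonzero(mask & (depth_m > 0))  # pixel rows/cols inside the mask
    z = depth_m[v, u]
    x = (u - cx) * z / fx  # pinhole back-projection
    y = (v - cy) * z / fy
    return np.column_stack((x, y, z))

# Toy example: a 4x4 depth map with a 2x2 "pig" mask at 2 m depth.
depth = np.full((4, 4), 2.0)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
cloud = mask_to_point_cloud(depth, mask, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(cloud.shape)  # (4, 3): one 3D point per masked pixel
```

Restricting the back-projection to the segmentation mask is what avoids a separate 3D segmentation step: only pixels the 2D detector labels as pig contribute points to the cloud.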
Affiliation(s)
- Shunli Wang
- College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China; (S.W.); (H.J.)
- Honghua Jiang
- College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China; (S.W.); (H.J.)
- Yongliang Qiao
- Australian Institute for Machine Learning (AIML), The University of Adelaide, Adelaide, SA 5005, Australia
- Shuzhen Jiang
- Key Laboratory of Efficient Utilisation of Non-Grain Feed Resources (Co-Construction by Ministry and Province), Ministry of Agriculture and Rural Affairs, Department of Animal Science and Technology, Shandong Agricultural University, Tai’an 271018, China
|
11
|
Kühnemund A, Götz S, Recke G. Automatic Detection of Group Recumbency in Pigs via AI-Supported Camera Systems. Animals (Basel) 2023; 13:2205. [PMID: 37444003 DOI: 10.3390/ani13132205] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/14/2023] [Revised: 06/23/2023] [Accepted: 06/29/2023] [Indexed: 07/15/2023] Open
Abstract
The resting behavior of rearing pigs provides information about their perception of the current temperature. A pen that is too cold or too warm can impact the well-being of the animals as well as their physical development. Previous studies that automatically recorded animal behavior often relied on body posture. However, this method is error-prone because hidden animals (which produce so-called false positives) strongly influence the results. In the present study, a method was developed for the automated identification of time periods in which all pigs are lying down, using video recordings from an AI-supported camera system. We used velocity data (measured by the camera) of pigs in the pen to identify these periods. To determine the threshold value for images with the highest probability of containing only recumbent pigs, a dataset of 9634 images and velocity values was used. The resulting velocity threshold (0.0006020622 m/s) yielded an accuracy of 94.1%. Analysis of the testing dataset revealed that recumbent pigs were correctly identified based on velocity values derived from video recordings. This represents an advance from the previous manual detection method toward fully automated detection.
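The thresholding rule in this abstract amounts to a one-parameter frame classifier. A minimal sketch follows, using the threshold reported above; the per-frame velocity stream is an illustrative assumption, not data from the study.

```python
# Sketch of the velocity-threshold rule for group recumbency, assuming
# the camera system supplies a mean pig velocity (m/s) per frame.
VELOCITY_THRESHOLD = 0.0006020622  # m/s, threshold reported in the study

def all_pigs_recumbent(frame_velocity_m_s: float) -> bool:
    """Classify a frame as 'all pigs lying down' when the measured
    velocity falls below the threshold."""
    return frame_velocity_m_s < VELOCITY_THRESHOLD

# Illustrative stream of per-frame velocities:
velocities = [0.0031, 0.0005, 0.0004, 0.0009, 0.0002]
resting_frames = [i for i, v in enumerate(velocities) if all_pigs_recumbent(v)]
print(resting_frames)  # indices of frames classified as group recumbency
```

Because the rule operates on an aggregate velocity rather than per-animal posture, hidden animals cannot produce the false positives that posture-based methods suffer from.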
Affiliation(s)
- Alexander Kühnemund
- Hochschule Osnabrück, Fachbereich Landwirtschaftliche Betriebswirtschaftslehre, Oldenburger Landstraße 24, 49090 Osnabrück, Germany
- Sven Götz
- VetVise GmbH, Bünteweg 2, 30559 Hannover, Germany
- Guido Recke
- Hochschule Osnabrück, Fachbereich Landwirtschaftliche Betriebswirtschaftslehre, Oldenburger Landstraße 24, 49090 Osnabrück, Germany
|
12
|
Jiang B, Tang W, Cui L, Deng X. Precision Livestock Farming Research: A Global Scientometric Review. Animals (Basel) 2023; 13:2096. [PMID: 37443894 DOI: 10.3390/ani13132096] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/19/2023] [Revised: 06/16/2023] [Accepted: 06/21/2023] [Indexed: 07/15/2023] Open
Abstract
Precision livestock farming (PLF) utilises information technology to continuously monitor and manage livestock in real time, which can improve individual animal health, welfare and productivity and reduce the environmental impact of animal husbandry, contributing to the economic, social and environmental sustainability of livestock farming. PLF has emerged as a pivotal area of multidisciplinary interest. To clarify the knowledge evolution and hotspot replacement of PLF research, this study analyzed the main characteristics, research cores and hot topics of PLF research via CiteSpace, based on relevant data from the Web of Science database from 1973 to 2023. The results point to a significant increase in studies on PLF, with countries having advanced livestock farming systems in Europe and America publishing frequently and collaborating closely across borders. Universities in various countries have been leading the research, with Daniel Berckmans serving as the academic leader. Research primarily focuses on animal science, veterinary science, computer science, agricultural engineering, and environmental science. Current research hotspots center on precision dairy and cattle technology, intelligent systems, and animal behavior; deep learning, accelerometers, automatic milking systems, lameness, estrus detection, and electronic identification are the main research directions, with deep learning and machine learning representing the forefront of current research. Research hot topics mainly include social science in PLF, the environmental impact of PLF, information technology in PLF, and animal welfare in PLF. Future research in PLF should prioritize inter-institutional and inter-scholar communication and cooperation, integration of multidisciplinary and multimethod research approaches, and utilization of deep learning and machine learning. Furthermore, social science issues should be given due attention in PLF, and the integration of intelligent technologies in animal management should be strengthened, with a focus on animal welfare and the environmental impact of animal husbandry, to promote its sustainable development.
Affiliation(s)
- Bing Jiang
- College of Economics and Management, Northeast Agricultural University, Harbin 150030, China
- Development Research Center of Modern Agriculture, Northeast Agricultural University, Harbin 150030, China
- Wenjie Tang
- College of Economics and Management, Northeast Agricultural University, Harbin 150030, China
- Lihang Cui
- College of Economics and Management, Northeast Agricultural University, Harbin 150030, China
- Xiaoshang Deng
- College of Economics and Management, Northeast Agricultural University, Harbin 150030, China
|
13
|
Double-Camera Fusion System for Animal-Position Awareness in Farming Pens. Foods 2022; 12:foods12010084. [PMID: 36613301 PMCID: PMC9818956 DOI: 10.3390/foods12010084] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/17/2022] [Revised: 12/06/2022] [Accepted: 12/09/2022] [Indexed: 12/29/2022] Open
Abstract
In livestock breeding, continuous and objective monitoring of animals is manually infeasible due to the large scale of breeding and expensive labour. Computer vision technology can generate accurate, real-time information on individual animals or animal groups from video surveillance. However, frequent occlusion between animals and changes in appearance features caused by varying lighting conditions make single-camera systems less attractive. We propose a double-camera system and image registration algorithms that spatially fuse the information from different viewpoints to solve these issues. This paper presents a deformable learning-based registration framework in which the input image pairs are first linearly pre-registered. An unsupervised convolutional neural network is then employed to fit the mapping from one view to another, using a large number of unlabelled samples for training. The learned parameters are then used in a semi-supervised network and fine-tuned with a small number of manually annotated landmarks. The actual pixel displacement error is introduced as a complement to an image similarity measure. The performance of the proposed fine-tuned method is evaluated on real farming datasets and achieves significantly lower registration errors than commonly used feature-based and intensity-based methods. The approach also reduces the registration time for an unseen image pair to less than 0.5 s. The proposed method provides a high-quality preprocessing step for subsequent tasks such as multi-object tracking and behaviour recognition of animals.
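The linear pre-registration stage and the pixel displacement error mentioned in this abstract can be sketched with plain NumPy; the affine parameters and landmark coordinates below are illustrative assumptions, and this sketch does not attempt the learned deformable mapping itself.

```python
import numpy as np

def affine_warp_points(points_xy, A, t):
    """Linearly pre-register 2D landmarks from one camera view into the
    other: p' = A @ p + t (A: rotation/scale matrix, t: translation)."""
    return points_xy @ A.T + t

def mean_pixel_displacement_error(pred_xy, true_xy):
    """Mean Euclidean distance between warped landmarks and their manually
    annotated counterparts, used to complement an image-similarity measure."""
    return float(np.mean(np.linalg.norm(pred_xy - true_xy, axis=1)))

# Illustrative landmarks in camera 1 and their true positions in camera 2.
src = np.array([[10.0, 20.0], [40.0, 80.0], [70.0, 30.0]])
A = np.array([[1.0, 0.0], [0.0, 1.0]])   # identity rotation/scale (assumed)
t = np.array([5.0, -3.0])                # assumed translation between views
warped = affine_warp_points(src, A, t)
true = src + np.array([5.0, -3.0])       # ground truth matches the shift exactly
print(mean_pixel_displacement_error(warped, true))  # 0.0 for a perfect fit
```

In the paper's pipeline this linear step only roughly aligns the two views; the residual, locally varying misalignment is then absorbed by the deformable network, and the displacement error above is what the fine-tuning stage minimises on annotated landmarks.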
|
14
|
Grandin T. Practical Application of the Five Domains Animal Welfare Framework for Supply Food Animal Chain Managers. Animals (Basel) 2022; 12:2831. [PMID: 36290216 PMCID: PMC9597751 DOI: 10.3390/ani12202831] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/25/2022] [Revised: 10/06/2022] [Accepted: 10/12/2022] [Indexed: 11/16/2022] Open
Abstract
The author has worked as a consultant with global commercial supply managers for over 20 years. The focus of this commentary will be the practical application of the Five Domains Model in commercial systems. Commercial buyers of meat need simple, easy-to-use guidelines. They have to use auditors who can be trained in a workshop that lasts only a few days. Auditing of slaughter plants by major buyers has resulted in great improvements. Supply chain managers need clear guidance on conditions that would result in a failed audit. Animal-based outcome measures that can be easily assessed should be emphasized in commercial systems. Some examples of these key animal welfare indicators are: the percentage of animals stunned effectively with a single application of the stunner, the percentage of lame animals, foot pad lesions on poultry, and body condition scoring. A farm that supplies a buyer must also comply with housing specifications: the farm either has the specified housing or it does not, and it will be removed from the approved supplier list if its housing does not comply. These easy-to-assess indicators can be evaluated within the four domains of nutrition, environment, health and behavioral interactions. The Five Domains Framework can also be used in a program for continuous improvement of animal welfare.
Affiliation(s)
- Temple Grandin
- Department of Animal Science, Colorado State University, Fort Collins, CO 80526, USA
|