1
Hou G, Li R, Tian M, Ding J, Zhang X, Yang B, Chen C, Huang R, Yin Y. Improving Efficiency: Automatic Intelligent Weighing System as a Replacement for Manual Pig Weighing. Animals (Basel) 2024; 14:1614. PMID: 38891661; PMCID: PMC11171250; DOI: 10.3390/ani14111614.
Abstract
To verify the accuracy of the automatic intelligent weighing system (AIWS), we weighed 106 growing-finishing pigs housed in pens using both the manual and AIWS methods. Accuracy was evaluated using the MAE, MAPE, and RMSE. In the growth experiment, manual weighing was conducted every two weeks and AIWS-predicted weights were recorded daily, after which growth curves were fitted. The results showed that the MAE, MAPE, and RMSE for 60 to 120 kg pigs were 3.48 kg, 3.71%, and 4.43 kg, respectively. The correlation coefficient r between the AIWS and manual methods was 0.9410, and R2 was 0.8854; the correlation was highly significant (p < 0.001). In growth curve fitting, the AIWS method yielded lower AIC and BIC values than the manual method, and the Logistic model fitted to AIWS data was the best-fit model. The age and body weight at the inflection point of the best-fit model were 164.46 d and 93.45 kg, respectively, and the maximum growth rate was 831.66 g/d. In summary, AIWS can accurately predict pigs' body weights in actual production and fits the growth curves of growing-finishing pigs better than manual weighing. This study suggests that AIWS can feasibly replace manual weighing for 50 to 120 kg live pigs in large-scale farming.
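The evaluation metrics and the growth model named in this abstract are standard; a minimal sketch (with made-up weights, not the study's data) of the MAE/MAPE/RMSE formulas and the logistic curve, whose weight at the inflection point is half the asymptote A and whose maximum growth rate is kA/4:

```python
import math

def mae(actual, pred):
    # mean absolute error
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

def mape(actual, pred):
    # mean absolute percentage error, in percent
    return 100 * sum(abs(a - p) / a for a, p in zip(actual, pred)) / len(actual)

def rmse(actual, pred):
    # root mean squared error
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual))

def logistic_weight(t, A, k, t_infl):
    """Logistic growth curve: weight at the inflection point t_infl is A/2,
    and the maximum growth rate (slope at t_infl) is k*A/4."""
    return A / (1 + math.exp(-k * (t - t_infl)))

# illustrative paired weights (kg), not the paper's measurements
manual = [62.0, 75.5, 90.0, 104.2, 118.0]
aiws   = [65.1, 73.0, 93.5, 101.0, 121.4]
print(round(mae(manual, aiws), 2),
      round(mape(manual, aiws), 2),
      round(rmse(manual, aiws), 2))
```

With real data the same three functions would reproduce the kind of accuracy summary reported above.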
Affiliation(s)
- Gaifeng Hou, Rui Li, Mingzhou Tian, Jing Ding, Ruilin Huang, Yulong Yin: CAS Key Laboratory of Agro-Ecological Processes in Subtropical Region, Hunan Provincial Key Laboratory of Animal Nutritional Physiology and Metabolic Process, Hunan Research Center of Livestock and Poultry Sciences, South Central Experimental Station of Animal Nutrition and Feed Science in the Ministry of Agriculture, National Engineering Laboratory for Poultry Breeding Pollution Control and Resource Technology, Institute of Subtropical Agriculture, Chinese Academy of Sciences, Changsha 410125, China
- Xingfu Zhang: College of Computer Science and Technology, Heilongjiang Institute of Technology, Harbin 150050, China; Beijing Focused Loong Technology Co., Ltd., Beijing 100086, China
- Bin Yang: Key Laboratory of Visual Perception and Artificial Intelligence of Hunan Province, College of Electrical and Information Engineering, Hunan University, Changsha 410082, China
- Chunyu Chen: College of Information and Communication, Harbin Engineering University, Harbin 150001, China
2
Luo Y, Xia J, Lu H, Luo H, Lv E, Zeng Z, Li B, Meng F, Yang A. Automatic Recognition and Quantification Feeding Behaviors of Nursery Pigs Using Improved YOLOV5 and Feeding Functional Area Proposals. Animals (Basel) 2024; 14:569. PMID: 38396538; PMCID: PMC10886382; DOI: 10.3390/ani14040569.
Abstract
A novel method based on an improved YOLOv5 and feeding functional area proposals is proposed to identify the feeding behaviors of nursery piglets under complex lighting and varied postures. The method consists of three steps: first, the corner coordinates of the feeding functional area were set using the shape characteristics of the trough proposals and the ratio of each corner point to the image width and height, separating out the irregular feeding area; second, a transformer module was introduced into YOLOv5 for highly accurate head detection; and third, feeding behavior was recognized and counted by calculating the proportion of the head located inside the feeding area. A pig head dataset was constructed, comprising a training set of 5040 images with 54,670 piglet head boxes and a test set of 1200 images with 25,330 piglet head boxes. The improved model achieves a 5.8% increase in mAP and a 4.7% increase in F1 score compared with the YOLOv5s model. The model was also applied to analyze the feeding pattern of group-housed nursery pigs over 24 h of continuous monitoring, showing that nursery pigs have different feeding rhythms by day and night, with peak feeding periods at 7:00-9:00 and 15:00-17:00 and reduced feeding at 12:00-14:00 and 0:00-6:00. The model provides a solution for identifying and quantifying pig feeding behaviors and offers a data basis for adjusting farm feeding schemes.
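The third step, deciding whether a detected head lies inside the irregular feeding area, reduces to a point-in-polygon test over the head box. A hedged sketch (the trough corner coordinates and the 0.5 overlap fraction are illustrative assumptions, not the paper's values):

```python
def point_in_polygon(x, y, poly):
    """Ray-casting test: is (x, y) inside the polygon given as corner points?"""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray at y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def is_feeding(head_box, feed_area, min_overlap=0.5):
    """Classify a detected head as feeding when at least min_overlap of the
    box's sampled points (corners plus centre) fall inside the feeding area."""
    x1, y1, x2, y2 = head_box
    samples = [(x1, y1), (x2, y1), (x1, y2), (x2, y2),
               ((x1 + x2) / 2, (y1 + y2) / 2)]
    frac = sum(point_in_polygon(px, py, feed_area) for px, py in samples) / len(samples)
    return frac >= min_overlap

trough = [(0, 0), (10, 0), (10, 3), (0, 3)]  # hypothetical feeding-area corners
print(is_feeding((1, 1, 3, 2), trough), is_feeding((20, 20, 22, 22), trough))
```

Counting `True` results per pig per frame then yields the feeding-time statistics described above.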
Affiliation(s)
- Yizhi Luo, Haowen Luo: Institute of Facility Agriculture, Guangdong Academy of Agricultural Sciences, Guangzhou 510640, China; State Key Laboratory of Swine and Poultry Breeding Industry, Guangzhou 510645, China
- Jinjin Xia, Zhixiong Zeng: College of Engineering, South China Agricultural University, Guangzhou 510642, China
- Huazhong Lu, Bin Li: State Key Laboratory of Swine and Poultry Breeding Industry, Guangzhou 510645, China
- Enli Lv: State Key Laboratory of Swine and Poultry Breeding Industry, Guangzhou 510645, China; College of Engineering, South China Agricultural University, Guangzhou 510642, China
- Fanming Meng: State Key Laboratory of Swine and Poultry Breeding Industry, Guangzhou 510645, China; Institute of Animal Science, Guangdong Academy of Agricultural Sciences, Guangzhou 510645, China
- Aqing Yang: College of Computer Science, Guangdong Polytechnic Normal University, Guangzhou 510665, China
3
An L, Ren J, Yu T, Hai T, Jia Y, Liu Y. Three-dimensional surface motion capture of multiple freely moving pigs using MAMMAL. Nat Commun 2023; 14:7727. PMID: 38001106; PMCID: PMC10673844; DOI: 10.1038/s41467-023-43483-w.
Abstract
Understanding the three-dimensional social behaviors of freely moving large mammals is valuable for both agriculture and life science, yet challenging due to occlusions during close interactions. Although existing animal pose estimation methods capture keypoint trajectories, they ignore the deformable body surface, which contains geometric information essential for predicting social interactions and for dealing with occlusions. In this study, we develop a Multi-Animal Mesh Model Alignment (MAMMAL) system based on an articulated surface mesh model. Our MAMMAL algorithms automatically align multi-view images to the mesh model and capture the 3D surface motions of multiple animals, performing better under severe occlusions than traditional triangulation and enabling complex social analysis. Using MAMMAL, we quantitatively analyze the locomotion, postures, animal-scene interactions, social interactions, and detailed tail motions of pigs. Furthermore, experiments on mice and Beagle dogs demonstrate the generalizability of MAMMAL across environments and mammal species.
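The "traditional triangulation" baseline that MAMMAL is compared against is typically linear (DLT) triangulation of each keypoint from multiple calibrated views. A minimal two-view sketch with hypothetical cameras (this is the baseline, not the MAMMAL algorithm itself):

```python
import numpy as np

def triangulate_dlt(P1, P2, pt1, pt2):
    """Linear DLT triangulation of one keypoint seen in two calibrated views.
    P1, P2 are 3x4 projection matrices; pt1, pt2 are (x, y) image coordinates.
    The 3D point is the null vector of the stacked cross-product constraints."""
    A = np.stack([
        pt1[0] * P1[2] - P1[0],
        pt1[1] * P1[2] - P1[1],
        pt2[0] * P2[2] - P2[0],
        pt2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                 # homogeneous solution (smallest singular value)
    return X[:3] / X[3]        # de-homogenize

# two hypothetical normalized cameras: one at the origin, one shifted in x
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 4.0])
pt1 = X_true[:2] / X_true[2]                          # projection in view 1
pt2 = (X_true[:2] + np.array([-1.0, 0.0])) / X_true[2]  # projection in view 2
X_hat = triangulate_dlt(P1, P2, pt1, pt2)
print(np.round(X_hat, 3))
```

With occluded keypoints this per-point scheme fails, which is exactly the gap the surface-mesh alignment above is designed to close.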
Affiliation(s)
- Liang An: Department of Automation, Tsinghua University, Beijing, China
- Jilong Ren, Tang Hai: State Key Laboratory of Stem Cell and Reproductive Biology, Institute of Zoology, Chinese Academy of Sciences, Beijing, China; Beijing Farm Animal Research Center, Institute of Zoology, Chinese Academy of Sciences, Beijing, China
- Tao Yu: Department of Automation, Tsinghua University, Beijing, China; Beijing National Research Center for Information Science and Technology (BNRist), Tsinghua University, Beijing, China
- Yichang Jia: School of Medicine, Tsinghua University, Beijing, China; IDG/McGovern Institute for Brain Research at Tsinghua, Beijing, China; Tsinghua Laboratory of Brain and Intelligence, Beijing, China
- Yebin Liu: Department of Automation, Tsinghua University, Beijing, China; Institute for Brain and Cognitive Sciences, Tsinghua University, Beijing, China
4
Xu Y, Nie J, Cen H, Wen B, Liu S, Li J, Ge J, Yu L, Pu Y, Song K, Liu Z, Cai Q. Spatio-Temporal-Based Identification of Aggressive Behavior in Group Sheep. Animals (Basel) 2023; 13:2636. PMID: 37627427; PMCID: PMC10451720; DOI: 10.3390/ani13162636.
Abstract
To overcome the low efficiency and subjectivity of manual observation in detecting aggression in group-housed sheep, we propose a video-stream-based model for detecting aggressive behavior. In the experiment, we collected videos of the sheep's daily routine and of aggressive behavior in the sheep pen, and labeled the data with bounding boxes using the open-source software LabelImg. First, YOLOv5 detects all sheep in each frame of the video and outputs their coordinates. Second, we sort the sheep's coordinates using a sheep-tracking heuristic proposed in this paper. Finally, the sorted data are fed into an LSTM framework to predict the occurrence of aggression. To optimize the model's parameters, we analyzed the confidence threshold, batch size, and frame-skipping interval. The best-performing model from our experiments achieves 93.38% precision and 91.86% recall. Additionally, we compare our video-stream-based model with image-based models for detecting aggression in group sheep: the video-stream model avoids the false detections that arise in image-based models when the head-impact features of aggressive sheep are occluded.
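The tracking heuristic is not detailed in the abstract beyond "sorting the sheep's coordinates"; a plausible minimal version is greedy nearest-neighbour matching of detection centres between consecutive frames, so that index i keeps following the same animal before the sequence is handed to the LSTM (coordinates below are hypothetical):

```python
import math

def sort_to_previous(prev_centers, curr_centers):
    """Greedy nearest-neighbour heuristic: reorder the current frame's
    detections so that each list index keeps tracking the same animal."""
    remaining = list(range(len(curr_centers)))
    ordered = []
    for px, py in prev_centers:
        # pick the unmatched current detection closest to this previous one
        j = min(remaining, key=lambda k: math.dist((px, py), curr_centers[k]))
        ordered.append(curr_centers[j])
        remaining.remove(j)
    return ordered

prev = [(10, 10), (50, 50), (90, 10)]
curr = [(49, 52), (92, 11), (11, 9)]   # same sheep, detector order shuffled
print(sort_to_previous(prev, curr))
```

Greedy matching can fail when two animals cross closely; a Hungarian assignment would be the more robust variant of the same idea.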
Affiliation(s)
- Yalei Xu, Longhui Yu: College of Mechanical and Electrical Engineering, Shihezi University, Shihezi 832003, China; Xinjiang Production and Construction Corps Key Laboratory of Modern Agricultural Machinery, Shihezi 832003, China; Industrial Technology Research Institute of Xinjiang Production and Construction Corps, Shihezi 832000, China; College of Information Science and Technology, Zhongkai University of Agriculture and Engineering, Guangzhou 510225, China
- Jing Nie, Honglei Cen, Baoqin Wen, Jingbin Li, Jianbing Ge, Yuhai Pu, Kangle Song, Zichen Liu, Qiang Cai: College of Mechanical and Electrical Engineering, Shihezi University, Shihezi 832003, China; Xinjiang Production and Construction Corps Key Laboratory of Modern Agricultural Machinery, Shihezi 832003, China; Industrial Technology Research Institute of Xinjiang Production and Construction Corps, Shihezi 832000, China
- Shuangyin Liu: College of Mechanical and Electrical Engineering, Shihezi University, Shihezi 832003, China; College of Information Science and Technology, Zhongkai University of Agriculture and Engineering, Guangzhou 510225, China
5
Kühnemund A, Götz S, Recke G. Automatic Detection of Group Recumbency in Pigs via AI-Supported Camera Systems. Animals (Basel) 2023; 13:2205. PMID: 37444003; DOI: 10.3390/ani13132205.
Abstract
The resting behavior of rearing pigs provides information about their perception of the current temperature. A pen that is too cold or too warm can impair the well-being of the animals as well as their physical development. Previous studies that automatically recorded animal behavior often relied on body posture; however, this method is error-prone because hidden animals (so-called false positives) strongly influence the results. In the present study, a method was developed for the automated identification, from video recordings (an AI-supported camera system), of time periods in which all pigs are lying down. We used velocity data (measured by the camera) of pigs in the pen to identify these periods. A dataset of 9634 images with velocity values was used to determine the threshold below which images have the highest probability of containing only recumbent pigs. The resulting velocity threshold (0.0006020622 m/s) yielded an accuracy of 94.1%. Analysis of the test dataset confirmed that recumbent pigs were correctly identified from the video-derived velocity values, an advance from the previous manual detection method toward automated detection.
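The threshold selection amounts to picking the velocity cut-off that maximizes frame-level classification accuracy against ground-truth "all pigs lying" labels. A sketch with invented frame data (only the 0.0006020622 m/s threshold is taken from the paper):

```python
def accuracy_at_threshold(velocities, all_lying, threshold):
    """Fraction of frames where the prediction 'all pigs recumbent'
    (velocity below the threshold) matches the ground-truth label."""
    correct = sum((v < threshold) == label
                  for v, label in zip(velocities, all_lying))
    return correct / len(velocities)

# hypothetical per-frame mean velocities (m/s) and ground-truth labels
velocities = [0.0001, 0.0004, 0.0008, 0.01, 0.0005, 0.03]
all_lying  = [True,   True,   True,   False, True,  False]
acc = accuracy_at_threshold(velocities, all_lying, 0.0006020622)
print(round(acc, 3))
```

Sweeping `threshold` over the observed velocity range and keeping the argmax of this accuracy would reproduce the kind of threshold search described above.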
Affiliation(s)
- Alexander Kühnemund, Guido Recke: Hochschule Osnabrück, Fachbereich Landwirtschaftliche Betriebswirtschaftslehre, Oldenburger Landstraße 24, 49090 Osnabrück, Germany
- Sven Götz: VetVise GmbH, Bünteweg 2, 30559 Hannover, Germany
6
Ji H, Teng G, Yu J, Wen Y, Deng H, Zhuang Y. Efficient Aggressive Behavior Recognition of Pigs Based on Temporal Shift Module. Animals (Basel) 2023; 13:2078. PMID: 37443876; DOI: 10.3390/ani13132078.
Abstract
Aggressive behavior among pigs is a significant social issue with severe repercussions for both the profitability and welfare of pig farms. Because aggression is complex, recognizing it requires considering both spatial and temporal features. To address this problem, we proposed an efficient method that utilizes the temporal shift module (TSM) for automatic recognition of pig aggression. Specifically, the TSM is inserted into four 2D convolutional neural network models (ResNet50, ResNeXt50, DenseNet201, and ConvNeXt-t), enabling them to process both spatial and temporal features without increasing the model parameters or computational complexity. The proposed method was evaluated on the dataset established in this study, and the results indicate that the ResNeXt50-T model (TSM inserted into ResNeXt50) achieved the best balance between recognition accuracy and model size. On the test set, the ResNeXt50-T model achieved accuracy, recall, precision, F1 score, inference speed, and parameter count of 95.69%, 95.25%, 96.07%, 95.65%, 29 ms, and 22.98 M, respectively. These results show that the proposed method can effectively improve the accuracy of recognizing aggressive pig behavior and provide a reference for behavior recognition in real scenarios of smart livestock farming.
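The TSM itself is parameter-free: it shifts a fraction of the feature channels one step along the time axis so that a 2D convolution can mix information across neighbouring frames. A numpy sketch of the shift operation (`shift_div=8` is the common default in TSM implementations, assumed here rather than taken from the paper):

```python
import numpy as np

def temporal_shift(x, shift_div=8):
    """Temporal shift on a feature tensor x of shape (T, C, H, W):
    the first C//shift_div channels are shifted backward in time, the next
    C//shift_div forward, and the rest are left untouched. Vacated slots
    are zero-filled, and no learnable parameters are involved."""
    t, c, h, w = x.shape
    fold = c // shift_div
    out = np.zeros_like(x)
    out[:-1, :fold] = x[1:, :fold]                   # shift: future -> present
    out[1:, fold:2 * fold] = x[:-1, fold:2 * fold]   # shift: past -> present
    out[:, 2 * fold:] = x[:, 2 * fold:]              # untouched channels
    return out

# tiny example: 2 frames, 8 channels, 1x1 spatial map, values t*8 + c
x = np.arange(2 * 8 * 1 * 1, dtype=float).reshape(2, 8, 1, 1)
y = temporal_shift(x)
print(y[0, 0, 0, 0], y[1, 1, 0, 0])
```

Inserting this operation before the convolutions of a residual block is what turns the listed 2D backbones into their "-T" variants.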
Affiliation(s)
- Hengyi Ji, Guanghui Teng, Huixiang Deng, Yanrong Zhuang: College of Water Resources & Civil Engineering, China Agricultural University, Beijing 100083, China; Key Laboratory of Agricultural Engineering in Structure and Environment, Ministry of Agriculture and Rural Affairs, Beijing 100083, China
- Jionghua Yu: Key Laboratory of Agricultural Engineering in Structure and Environment, Ministry of Agriculture and Rural Affairs, Beijing 100083, China
- Yanbin Wen: Key Laboratory of Agricultural Engineering in Structure and Environment, Ministry of Agriculture and Rural Affairs, Beijing 100083, China; Bureau of Agricultural and Rural Affairs, Datong 037000, China
7
Hao W, Zhang K, Zhang L, Han M, Hao W, Li F, Yang G. TSML: A New Pig Behavior Recognition Method Based on Two-Stream Mutual Learning Network. Sensors (Basel) 2023; 23:5092. PMID: 37299818; DOI: 10.3390/s23115092.
Abstract
Changes in pig behavior provide crucial information in the livestock breeding process, and automatic pig behavior recognition is a vital means of improving pig welfare. Current approaches rely either on human observation, which is time-consuming and labor-intensive, or on deep learning models whose large number of parameters can result in slow training and low efficiency. To address these issues, this paper proposes a novel two-stream pig behavior recognition approach enhanced by deep mutual learning. The proposed model consists of two mutually learning branches: an RGB (red-green-blue color model) stream and an optical-flow stream. Each branch contains two student networks that learn collaboratively to extract robust and rich appearance or motion features, leading to improved recognition performance. Finally, the results of the RGB and flow branches are weighted and fused to further improve pig behavior recognition. Experimental results demonstrate the effectiveness of the proposed model, which achieves state-of-the-art performance with an accuracy of 96.52%, surpassing other models by 2.71%.
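The final step is late score fusion: per-class scores from the RGB and flow branches are combined with a weighted average before taking the argmax. A sketch with illustrative class names and weights (the abstract does not report the actual fusion weights):

```python
def fuse_streams(rgb_scores, flow_scores, w_rgb=0.6):
    """Late fusion: weighted average of per-class scores from the RGB and
    optical-flow branches. w_rgb is illustrative, not the paper's value."""
    return [w_rgb * r + (1 - w_rgb) * f
            for r, f in zip(rgb_scores, flow_scores)]

rgb  = [0.7, 0.2, 0.1]   # hypothetical class scores, e.g. lying/walking/feeding
flow = [0.3, 0.6, 0.1]
fused = fuse_streams(rgb, flow)
print([round(s, 2) for s in fused], fused.index(max(fused)))
```

Here the appearance stream dominates, so the fused prediction follows the RGB branch's top class; flow-heavy actions would pull the decision the other way.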
Affiliation(s)
- Wangli Hao, Kai Zhang, Li Zhang, Meng Han, Fuzhong Li, Guoqiang Yang: School of Software, Shanxi Agricultural University, Jinzhong 030801, China
- Wangbao Hao: Yuncheng National Jinnan Cattle Genetic Resources and Gene Protection Center, Yongji 044099, China
8
Czycholl I, Büttner K, Becker D, Schwennen C, Baumgärtner W, Otten W, Wendt M, Puff C, Krieter J. Are biters sick? Health status of tail biters in comparison to control pigs. Porcine Health Manag 2023; 9:19. PMID: 37161469; PMCID: PMC10170755; DOI: 10.1186/s40813-023-00314-0.
Abstract
BACKGROUND Tail biting is a multifactorial problem. As health status is one of the factors commonly linked to tail biting, this study focuses on the health of identified biters. Thirty (obsessive) biters were compared to 30 control animals by clinical and pathological examination as well as by blood and cerebrospinal fluid samples; in total, 174 variables were compared between the groups, and connections between the variables were analysed. RESULTS In the clinical examination, 6 biters but only 2 controls (P = 0.019) were noticeably agitated in the evaluation of general behaviour, while 8 controls were noticeably calmer (2 biters, P = 0.02). Biters had a lower body weight (P = 0.0007), and 13 biters had overlong bristles (4 controls, P = 0.008). In the pathological examination, 5 biters but none of the controls had hyperkeratosis or inflammation of the pars proventricularis of the stomach (P = 0.018), whereas 7 controls and only 3 biters were affected by gut inflammation (P = 0.03). In the blood samples, protein and albumin levels were below the normal range for biters (protein: 51.6 g/l, albumin: 25.4 g/l) but not for controls (protein: 53.7 g/l, albumin: 27.4 g/l) (protein: P = 0.05, albumin: P = 0.02). Moreover, 14 biters but only 8 controls had poikilocytosis (P = 0.05). Although not statistically different between groups, many animals (36/60) were affected by hypoproteinemia and hyponatremia as well as by hypokalemia (53/60), and almost all animals (58/60) had hypomagnesemia. For hypomagnesemia, significant connections with variables linked to tail damage and ear necrosis were detected (rs/V/ρ ≥ 0.4, P ≤ 0.05). CONCLUSION The results suggest that behavioural tests might help identify biters, and that cornification and inflammation of the pars proventricularis are linked to becoming a biter. Furthermore, the results highlight the need for an appropriate, adjusted nutrient and mineral supply, especially with regard to magnesium.
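Group comparisons of counts such as "6 biters vs. 2 controls" are commonly tested with Fisher's exact test on a 2x2 table. A stdlib sketch of the two-sided hypergeometric tail sum; note that the study's own statistical procedure may differ, so the printed value below is purely illustrative and is not expected to match the paper's reported p-values:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test for the table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables at least as
    extreme (no more probable) than the observed one."""
    n1, n2, k = a + b, c + d, a + c        # row totals and first-column total
    total = comb(n1 + n2, k)
    p_obs = comb(n1, a) * comb(n2, c)      # unnormalized observed probability
    tail = sum(comb(n1, x) * comb(n2, k - x)
               for x in range(max(0, k - n2), min(k, n1) + 1)
               if comb(n1, x) * comb(n2, k - x) <= p_obs)
    return tail / total

# illustrative 2x2: 6/30 in one group vs 2/30 in the other show a trait
p = fisher_exact_2x2(6, 24, 2, 28)
print(round(p, 4))
```

Because the comparison of tail probabilities is done on exact integers, ties between equally probable tables are handled without floating-point tolerance issues.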
Affiliation(s)
- I Czycholl: Institute of Animal Breeding and Husbandry, Kiel University, 24098 Kiel, Germany; Pig Improvement Company (PIC), Hendersonville, TN 37075, USA; Department for Animal Welfare and Disease Control, University of Copenhagen, 1870 Frederiksberg, Denmark
- K Büttner: Unit for Biomathematics and Data Processing, Faculty of Veterinary Medicine, Justus Liebig University, 35392 Giessen, Germany
- D Becker: Institute of Genome Biology, Research Institute for Farm Animal Biology (FBN), 18196 Dummerstorf, Germany
- C Schwennen, M Wendt: Clinic for Swine, Small Ruminants and Forensic Medicine and Ambulatory Service, University of Veterinary Medicine Hanover, Foundation, 30173 Hanover, Germany
- W Baumgärtner, C Puff: Department of Pathology, University of Veterinary Medicine Hanover, Foundation, 30559 Hanover, Germany
- W Otten: Institute of Behavioural Physiology, Research Institute for Farm Animal Biology (FBN), 18196 Dummerstorf, Germany
- J Krieter: Institute of Animal Breeding and Husbandry, Kiel University, 24098 Kiel, Germany
9
Girardie O, Bonneau M, Billon Y, Bailly J, David I, Canario L. Analysis of image-based sow activity patterns reveals several associations with piglet survival and early growth. Front Vet Sci 2023; 9:1051284. PMID: 36699323; PMCID: PMC9868430; DOI: 10.3389/fvets.2022.1051284.
Abstract
An activity pattern describes variations in activities over time. The objectives of this study are to automatically predict sow activity from computer vision over 11 days peripartum and to estimate how sow behavior influences piglet performance during early lactation. The analysis of video images used the convolutional neural network (CNN) YOLO for sow detection and posture classification of 21 Large White and 22 Meishan primiparous sows housed in individual farrowing pens. A longitudinal analysis and a clustering method were combined to identify groups of sows with a similar activity pattern. Traits under study are as follows: (i) the distribution of time spent daily in different postures and (ii) different activities while standing. Six postures were included, along with three classes of standing activity (eating, drinking, and other), each performed either in motion or not and with or without root-pawing. Together these constitute a postural budget and a standing-activity budget. Groups of sows with similar changes in their budgets over the periods (D-3 to D-1; D0; D1-D7) were identified with the k-means clustering method. Next, behavioral traits (time spent daily in each posture, frequency of postural changes) were used as explanatory variables in a Cox proportional hazards model for survival and in a linear model for growth. Piglet survival was influenced by sow behavior on D-1 and during the period D1-D7. Piglets born from sows that were standing and doing an activity other than drinking and eating on D-1 had a 26% lower risk of dying than other piglets. Those born from sows that changed posture more frequently on D1-D7 had a 44% lower risk of dying. The number of postural changes, which reflects sow restlessness, influenced piglet growth in all three periods. The average daily gain of piglets born from sows that were more restless on D1-D7 and that changed posture more frequently to hide their udder on D0 decreased by 22 and 45 g/d, respectively.
Conversely, those born from sows that changed posture more frequently to hide their udder during D1-D7 grew faster (+71 g/d) than the other piglets. Sow restlessness at different time periods influenced piglet performance.
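The sow-grouping step in this study combined a longitudinal analysis with k-means clustering of daily budgets. As an illustrative sketch of the clustering idea only — the synthetic budgets, the deterministic farthest-first initialisation, and k=2 below are invented for this example, not taken from the paper — a minimal k-means over daily postural budgets might look like:

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Minimal k-means with deterministic farthest-first initialisation."""
    idx = [0]
    while len(idx) < k:
        # distance of every point to its nearest already-chosen centre
        d = np.min(np.linalg.norm(X[:, None] - X[idx][None], axis=2), axis=1)
        idx.append(int(d.argmax()))
    centroids = X[idx].astype(float).copy()
    for _ in range(iters):
        # assign each daily budget to the nearest centroid
        labels = np.linalg.norm(X[:, None] - centroids[None], axis=2).argmin(axis=1)
        # recompute each centroid as the mean budget of its group
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# hypothetical daily postural budgets: fractions of the day spent
# [lying, sitting, standing] for six sows
budgets = np.array([
    [0.85, 0.10, 0.05],  # mostly lying
    [0.80, 0.12, 0.08],
    [0.82, 0.08, 0.10],
    [0.55, 0.15, 0.30],  # more standing ("restless")
    [0.50, 0.20, 0.30],
    [0.52, 0.18, 0.30],
])
labels, _ = kmeans(budgets, k=2)
```

Here the first three sows form a "mostly lying" group and the last three a "more standing" group; the study itself clustered changes in budgets over the peripartum periods rather than single-day snapshots.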
Affiliation(s)
- Océane Girardie
- UMR1388 GenPhySE, INRAE, Université de Toulouse, INPT, Castanet-Tolosan, France
- Ingrid David
- UMR1388 GenPhySE, INRAE, Université de Toulouse, INPT, Castanet-Tolosan, France
- Laurianne Canario
- UMR1388 GenPhySE, INRAE, Université de Toulouse, INPT, Castanet-Tolosan, France
10
Wang S, Jiang H, Qiao Y, Jiang S, Lin H, Sun Q. The Research Progress of Vision-Based Artificial Intelligence in Smart Pig Farming. Sensors (Basel) 2022; 22:6541. [PMID: 36080994] [PMCID: PMC9460267] [DOI: 10.3390/s22176541] [Received: 07/27/2022] [Revised: 08/22/2022] [Accepted: 08/27/2022]
Abstract
Pork accounts for an important proportion of livestock products. Pig farming requires substantial manpower, material resources, and time to monitor pig health and welfare. As the number of pigs being farmed increases, the continued use of traditional monitoring methods may cause stress and harm to pigs and farmers and affect pig health and welfare as well as the economic output of farming. The application of artificial intelligence has therefore become a core part of smart pig farming. A precision pig farming system uses sensors such as cameras and radio-frequency identification to monitor biometric information such as pig sounds and behavior in real time and converts it into key indicators of pig health and welfare. By analyzing these key indicators, problems in pig health and welfare can be detected early and timely intervention and treatment provided, which helps to improve the production and economic efficiency of pig farming. This paper reviews more than 150 papers on precision pig farming and summarizes and evaluates the application of artificial intelligence technologies to pig detection, tracking, behavior recognition, and sound recognition. Finally, we summarize and discuss the opportunities and challenges of precision pig farming.
Affiliation(s)
- Shunli Wang
- College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Honghua Jiang
- College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Yongliang Qiao
- Australian Centre for Field Robotics (ACFR), Faculty of Engineering, The University of Sydney, Sydney, NSW 2006, Australia
- Shuzhen Jiang
- College of Animal Science and Veterinary Medicine, Shandong Agricultural University, Tai’an 271018, China
- Huaiqin Lin
- College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Qian Sun
- College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
11
Postural behavior recognition of captive nocturnal animals based on deep learning: a case study of Bengal slow loris. Sci Rep 2022; 12:7738. [PMID: 35545645] [PMCID: PMC9095646] [DOI: 10.1038/s41598-022-11842-0] [Received: 12/15/2021] [Accepted: 04/29/2022]
Abstract
The precise identification of postural behavior plays a crucial role in the evaluation of animal welfare and captive management. Deep learning technology has been widely used in automatic behavior recognition of wild and domestic species. The Asian slow loris is a group of small, nocturnal primates with a distinctive locomotion mode, and a large number of individuals have been confiscated into captive settings due to illegal trade, making the species an ideal model for postural behavior monitoring. Captive animals may suffer from being housed in an inappropriate environment and may display abnormal behavior patterns. Traditional data collection methods are time-consuming and laborious, impeding efforts to improve lorises’ captive welfare and to develop effective reintroduction strategies. This study established the first human-labeled postural behavior dataset of slow lorises and used deep learning technology to recognize postural behavior based on object detection and semantic segmentation. The precision of the classification based on YOLOv5 reached 95.1%. The Dilated Residual Networks (DRN) feature extraction network showed the best performance in semantic segmentation, with a classification accuracy of 95.2%. The results imply that automatic computer identification of postural behavior may offer advantages in assessing animal activity and can be applied to other nocturnal taxa.
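The 95.1% figure reported above is a per-class precision. As a generic illustration of how that metric is computed from predicted versus true labels — the frame labels below are hypothetical and unrelated to the loris dataset — a minimal sketch is:

```python
from collections import Counter

def per_class_precision(y_true, y_pred):
    """precision[c] = correct predictions of class c / all predictions of class c."""
    tp = Counter()    # true positives per class
    pred = Counter()  # total predictions per class
    for t, p in zip(y_true, y_pred):
        pred[p] += 1
        if t == p:
            tp[p] += 1
    return {c: tp[c] / pred[c] for c in pred}

# hypothetical postural labels for eight video frames
truth = ["sit", "sit", "hang", "hang", "move", "move", "sit", "hang"]
preds = ["sit", "sit", "hang", "move", "move", "move", "sit", "hang"]
prec = per_class_precision(truth, preds)  # e.g. "move": 2 of 3 predictions correct
```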
12
Automated Tracking Systems for the Assessment of Farmed Poultry. Animals (Basel) 2022; 12:232. [PMID: 35158556] [PMCID: PMC8833357] [DOI: 10.3390/ani12030232] [Received: 12/08/2021] [Revised: 01/16/2022] [Accepted: 01/18/2022]
Abstract
Simple Summary
With the advent of artificial intelligence, the poultry sector is gearing up to adopt and embrace sensor technologies to enhance the production and welfare of birds. Automated tracking and tracing of poultry has several advantages on poultry farms: overcoming the subjectivity of human measurements, enhancing the ability to provide quality care for the birds during their life on the farm, providing the ability to predict events and thereby enabling timely interventions, and many more. However, the technologies behind automated tracking systems are not yet mature, owing to lags in algorithms and practical implementation issues. This mini review provides a brief critical assessment of current and recent advancements in automated tracking systems in the poultry industry and offers an outlook on future directions.
Abstract
The world’s growing population is highly dependent on animal agriculture. Animal products provide nutrient-packed meals that help to sustain individuals of all ages in communities across the globe. As the human demand for animal proteins grows, the agricultural industry must continue to advance its efficiency and quality of production. Poultry are among the most commonly farmed livestock, and their significance is felt on a global scale. Current poultry farming practices result in the premature death and rejection of billions of chickens annually before they are processed for meat. This loss of life is concerning with regard to animal welfare, agricultural efficiency, and economic impact. The best way to prevent these losses is through individual- and/or group-level assessment of animals on a continuous basis. On large-scale farms, such attention to detail was generally considered inaccurate and inefficient, but with the integration of artificial intelligence (AI)-assisted technology, individualised and per-herd assessments of livestock have become possible and accurate. Various studies have shown that cameras linked with specialised AI systems can properly analyse flocks for health concerns, thus improving the survival rate and product quality of farmed poultry. Building on recent advancements, this review explores aspects of AI in the detection, counting, and tracking of poultry in commercial and research-based applications.
13
Racewicz P, Ludwiczak A, Skrzypczak E, Składanowska-Baryza J, Biesiada H, Nowak T, Nowaczewski S, Zaborowicz M, Stanisz M, Ślósarz P. Welfare Health and Productivity in Commercial Pig Herds. Animals (Basel) 2021; 11:1176. [PMID: 33924224] [PMCID: PMC8074599] [DOI: 10.3390/ani11041176] [Received: 03/15/2021] [Revised: 04/16/2021] [Accepted: 04/17/2021]
Abstract
In recent years, there have been very dynamic changes around the world in both pork production and pig breeding technology. The general trend of increasing the efficiency of pig production with reduced employment requires optimisation and a comprehensive approach to herd management. One of the most important elements on the way to achieving this goal is maintaining animal welfare and health. The health of the pigs on the farm is also a key aspect of production economics. Maintaining a high health status in pig herds by reducing the incidence of various diseases and the need for antimicrobial substances is part of a broadly understood high-potential herd management strategy. Thanks to the use of sensors (cameras, microphones, accelerometers, or radio-frequency identification transponders), the images, sounds, movements, and vital signs of animals are combined through algorithms and analysed for non-invasive monitoring of animals, which allows for early detection of diseases, improves their welfare, and increases the productivity of breeding. Automated, innovative early warning systems based on continuous monitoring of specific physiological (e.g., body temperature) and behavioural parameters can provide an alternative to direct diagnosis and visual assessment by the veterinarian or the herd keeper.
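The early-warning idea described above — flagging deviations of a continuously monitored parameter, such as body temperature, from an animal's recent baseline — can be sketched minimally. The window size, threshold, and readings below are arbitrary assumptions for illustration, not values from the paper:

```python
def flag_anomalies(readings, window=5, threshold=0.5):
    """Flag indices where a reading deviates from the mean of the
    preceding `window` readings by more than `threshold` degrees C."""
    alerts = []
    for i in range(window, len(readings)):
        baseline = sum(readings[i - window:i]) / window
        if abs(readings[i] - baseline) > threshold:
            alerts.append(i)
    return alerts

# hypothetical hourly body temperatures for one pig (deg C);
# the last two readings simulate the onset of fever
temps = [38.6, 38.7, 38.5, 38.6, 38.7, 38.6, 38.6, 39.8, 39.9]
alerts = flag_anomalies(temps)  # indices of the two elevated readings
```

Production systems would combine several such signals (activity, feeding, sound) and use per-animal baselines, but the rolling-comparison principle is the same.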
Affiliation(s)
- Przemysław Racewicz
- Laboratory of Veterinary Public Health Protection, Department of Animal Breeding and Product Quality Assessment, Poznan University of Life Sciences, Słoneczna 1, 62-002 Suchy Las, Poland
- Agnieszka Ludwiczak
- Department of Animal Breeding and Product Quality Assessment, Poznan University of Life Sciences, Słoneczna 1, 62-002 Suchy Las, Poland
- Ewa Skrzypczak
- Department of Animal Breeding and Product Quality Assessment, Poznan University of Life Sciences, Słoneczna 1, 62-002 Suchy Las, Poland
- Joanna Składanowska-Baryza
- Department of Animal Breeding and Product Quality Assessment, Poznan University of Life Sciences, Słoneczna 1, 62-002 Suchy Las, Poland
- Hanna Biesiada
- Laboratory of Veterinary Public Health Protection, Department of Animal Breeding and Product Quality Assessment, Poznan University of Life Sciences, Słoneczna 1, 62-002 Suchy Las, Poland
- Tomasz Nowak
- Department of Genetics and Animal Breeding, Animal Reproduction Laboratory, Poznan University of Life Sciences, 60-637 Poznan, Poland
- Sebastian Nowaczewski
- Department of Animal Breeding and Product Quality Assessment, Poznan University of Life Sciences, Słoneczna 1, 62-002 Suchy Las, Poland
- Maciej Zaborowicz
- Institute of Biosystems Engineering, Poznan University of Life Sciences, 60-637 Poznan, Poland
- Marek Stanisz
- Department of Animal Breeding and Product Quality Assessment, Poznan University of Life Sciences, Słoneczna 1, 62-002 Suchy Las, Poland
- Piotr Ślósarz
- Department of Animal Breeding and Product Quality Assessment, Poznan University of Life Sciences, Słoneczna 1, 62-002 Suchy Las, Poland