1. Hayat K, Ye Z, Lin H, Pan J. Beyond the Spectrum: Unleashing the Potential of Infrared Radiation in Poultry Industry Advancements. Animals (Basel) 2024; 14:1431. [PMID: 38791649] [PMCID: PMC11117323] [DOI: 10.3390/ani14101431]
Abstract
The poultry industry is dynamically advancing production through nutrition, management practices, and technology, enhancing productivity via improved feed conversion ratios, disease control, lighting management, and antibiotic alternatives. Infrared (IR) radiation is used to improve the well-being of humans, animals, and poultry in a variety of operations. IR radiation consists of electromagnetic waves with wavelengths ranging from 760 to 10,000 nm. The biological applications of IR radiation are gaining significant attention, and its use is expanding rapidly across multiple sectors. Various IR applications, such as IR heating, IR spectroscopy, IR thermography, IR beak trimming, and IR-based computer vision, have proven beneficial to the well-being of humans, animals, and birds in mechanized systems. IR radiation offers a wide array of health benefits, including improved skin health, therapeutic effects, anticancer properties, wound healing, enhanced digestive and endothelial function, and improved mitochondrial function and gene expression. In poultry production, IR radiation has demonstrated numerous positive impacts, including enhanced growth performance, gut health, blood profiles, immunological response, food safety, economic advantages, mitigation of hazardous gases, and improved heating systems. Despite these exceptional benefits, applications of IR radiation in poultry production remain limited. This comprehensive review presents the evidence supporting the advantages of IR radiation and advocates its wider adoption in poultry production practices.
Affiliation(s)
- Khawar Hayat
- College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
- Key Laboratory of Intelligent Equipment and Robotics for Agriculture of Zhejiang Province, Ministry of Agriculture and Rural Affairs, Hangzhou 310058, China
- Zunzhong Ye
- College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
- Key Laboratory of Intelligent Equipment and Robotics for Agriculture of Zhejiang Province, Ministry of Agriculture and Rural Affairs, Hangzhou 310058, China
- Hongjian Lin
- College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
- Key Laboratory of Intelligent Equipment and Robotics for Agriculture of Zhejiang Province, Ministry of Agriculture and Rural Affairs, Hangzhou 310058, China
- Jinming Pan
- College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
- Key Laboratory of Intelligent Equipment and Robotics for Agriculture of Zhejiang Province, Ministry of Agriculture and Rural Affairs, Hangzhou 310058, China
2. Khanal R, Choi Y, Lee J. Transforming Poultry Farming: A Pyramid Vision Transformer Approach for Accurate Chicken Counting in Smart Farm Environments. Sensors (Basel) 2024; 24:2977. [PMID: 38793832] [PMCID: PMC11124838] [DOI: 10.3390/s24102977]
Abstract
Smart farm environments, equipped with cutting-edge technology, require proficient techniques for managing poultry. This research investigates automated chicken counting, an essential part of optimizing livestock conditions. By integrating artificial intelligence and computer vision, it introduces a transformer-based chicken-counting model to overcome challenges to precise counting, such as lighting changes, occlusions, cluttered backgrounds, continual chicken growth, and camera distortions. The model includes a pyramid vision transformer backbone and a multi-scale regression head to predict precise density maps of the crowded chicken enclosure. The customized loss function incorporates curriculum loss, allowing the model to learn progressively, and adapts to diverse challenges posed by varying densities, scales, and appearances. The proposed annotated dataset includes data on various lighting conditions, chicken sizes, densities, and placements. Augmentation strategies enhanced the dataset with brightness, contrast, shadow, blur, occlusion, cropping, and scaling variations. Evaluating the model on the proposed dataset indicated its robustness, with a validation mean absolute error of 27.8, a root mean squared error of 40.9, and a test average accuracy of 96.9%. A comparison with the few-shot object counting model SAFECount demonstrated the model's superior accuracy and resilience. The transformer-based approach was 7.7% more accurate than SAFECount. It demonstrated robustness in response to different challenges that may affect counting and offered a comprehensive and effective solution for automated chicken counting in smart farm environments.
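The mean absolute error and root mean squared error reported above are the standard metrics for density-map counting, where an image's predicted count is the sum of its predicted density map. A minimal sketch of that evaluation, using hypothetical per-image counts rather than the paper's data:

```python
import math

def count_errors(pred_counts, true_counts):
    """MAE and RMSE between per-image predicted and ground-truth counts.

    In density-map counting, the predicted count for an image is the
    sum of the model's per-pixel density map for that image.
    """
    diffs = [p - t for p, t in zip(pred_counts, true_counts)]
    mae = sum(abs(d) for d in diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return mae, rmse

# Hypothetical counts for three images, for illustration only.
preds = [102.4, 97.1, 88.0]
truth = [100, 95, 90]
mae, rmse = count_errors(preds, truth)
```

Because RMSE squares each error before averaging, it penalizes occasional large miscounts more heavily than MAE, which is why both are reported together.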
Affiliation(s)
- Ridip Khanal
- Division of Computer Science and Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Department of Computer Science and Applications, Tribhuvan University, Mechi Multiple Campus, Bhadrapur 57200, Nepal
- Yoochan Choi
- Division of Computer Science and Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Joonwhoan Lee
- Division of Computer Science and Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea
3. Qin X, Lai C, Pan Z, Pan M, Xiang Y, Wang Y. Recognition of Abnormal-Laying Hens Based on Fast Continuous Wavelet and Deep Learning Using Hyperspectral Images. Sensors (Basel) 2023; 23:3645. [PMID: 37050705] [PMCID: PMC10098863] [DOI: 10.3390/s23073645]
Abstract
The egg production of laying hens is crucial to breeding enterprises in the laying-hen industry. However, there is currently no systematic or accurate method for identifying low-egg-production laying hens on commercial farms; most are identified by breeders based on experience. To address this issue, we propose a widely applicable and highly precise method. First, breeders separate low-egg-production laying hens from normal-laying hens. Then, under a halogen lamp, hyperspectral images of the two types of hens are captured with hyperspectral imaging equipment. The vertex component analysis (VCA) algorithm is used to extract the cockscomb endmember spectrum, yielding the cockscomb spectral feature curves of low-egg-production and normal-laying hens. Next, the fast continuous wavelet transform (FCWT) is applied to the feature-curve data to obtain a two-dimensional spectral feature image dataset. Finally, using this two-dimensional spectral image dataset, we developed a deep learning model based on a convolutional neural network (CNN). When tested on the prepared dataset, the model's accuracy was 0.975 (97.5%). This outcome demonstrates that our identification method, which combines hyperspectral imaging technology, FCWT data analysis, and a CNN deep learning model, is highly effective and precise for laying-hen breeding plants. Furthermore, the use of FCWT for analyzing and processing hyperspectral data may have a significant impact on hyperspectral research and applications in other fields, given its high efficiency and resolution in data signal analysis.
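The key step in the pipeline above is turning a 1-D spectral feature curve into a 2-D scale-by-position image that a CNN can consume. The following is a generic continuous wavelet transform sketch using the Ricker wavelet, not the paper's FCWT implementation; the scales and the synthetic input curve are illustrative assumptions:

```python
import numpy as np

def ricker(points, a):
    # Ricker ("Mexican hat") wavelet of width parameter a.
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2 / (np.sqrt(3 * a) * np.pi ** 0.25)
    return amp * (1 - (t / a) ** 2) * np.exp(-(t / a) ** 2 / 2)

def cwt_image(signal, scales):
    """Convolve the curve with the wavelet at each scale.

    Returns a (len(scales), len(signal)) array: a 2-D "image" whose
    rows are scales and columns are spectral positions.
    """
    rows = []
    for a in scales:
        w = ricker(min(10 * a, len(signal)), a)
        rows.append(np.convolve(signal, w, mode="same"))
    return np.stack(rows)

# Stand-in for a comb reflectance curve sampled at 128 wavelengths.
spectrum = np.sin(np.linspace(0, 4 * np.pi, 128))
img = cwt_image(spectrum, scales=[1, 2, 4, 8])
```

Each row of `img` responds to features of a different spectral width, which is what lets a 2-D CNN pick up curve shapes that are hard to separate in the raw 1-D spectrum.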
Affiliation(s)
- Xing Qin
- Zhejiang Key Laboratory of Large-Scale Integrated Circuit Design, Hangzhou Dianzi University, Hangzhou 310018, China
- Chenxiao Lai
- Zhejiang Key Laboratory of Large-Scale Integrated Circuit Design, Hangzhou Dianzi University, Hangzhou 310018, China
- Zejun Pan
- Zhejiang Key Laboratory of Large-Scale Integrated Circuit Design, Hangzhou Dianzi University, Hangzhou 310018, China
- Mingzhong Pan
- Key Laboratory of Gravitational Wave Precision Measurement of Zhejiang Province, School of Physics and Photoelectric Engineering, Hangzhou Institute for Advanced Study, University of Chinese Academy of Sciences, Hangzhou 310024, China
- Yun Xiang
- Agriculture Science Research Institute, Jinhua 321000, China
- Yikun Wang
- Key Laboratory of Gravitational Wave Precision Measurement of Zhejiang Province, School of Physics and Photoelectric Engineering, Hangzhou Institute for Advanced Study, University of Chinese Academy of Sciences, Hangzhou 310024, China
4. Subedi S, Bist R, Yang X, Chai L. Tracking Floor Eggs with Machine Vision in Cage-free Hen Houses. Poult Sci 2023; 102:102637. [PMID: 37011469] [PMCID: PMC10090712] [DOI: 10.1016/j.psj.2023.102637]
Abstract
Some of the major restaurant and grocery chains in the United States have pledged to buy only cage-free (CF) eggs by 2025 or 2030. While CF houses allow hens to perform more natural behaviors (e.g., dust bathing, perching, and foraging on the litter floor), a particular challenge is floor eggs (i.e., eggs mislaid on the litter floor). Floor eggs have a high risk of contamination, and collecting them manually is laborious and time-consuming; precision poultry farming technology is therefore needed to detect them. In this study, 3 new deep learning models, YOLOv5s-egg, YOLOv5x-egg, and YOLOv7-egg, were developed, trained, and compared for tracking floor eggs in 4 research cage-free laying-hen facilities. The models were verified on images collected in 2 different commercial houses. Results indicate that YOLOv5s-egg detected floor eggs with a precision of 87.9%, recall of 86.8%, and mean average precision (mAP) of 90.9%; YOLOv5x-egg with a precision of 90%, recall of 87.9%, and mAP of 92.1%; and YOLOv7-egg with a precision of 89.5%, recall of 85.4%, and mAP of 88%. All models exceeded 85% detection precision; however, performance is affected by stocking density, varying light intensity, and occlusion by equipment such as drinking lines, perches, and feeders. YOLOv5x-egg detected floor eggs with higher accuracy, precision, mAP, and recall than YOLOv5s-egg and YOLOv7-egg. This study demonstrates to cage-free producers that floor eggs can be monitored automatically. Future studies are warranted to test the system in commercial houses.
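The precision and recall figures above come from matching predicted boxes to ground-truth boxes at an IoU threshold, as in standard object-detection evaluation. A minimal sketch of that matching, with made-up boxes rather than the study's annotations:

```python
def iou(a, b):
    # Boxes as (x1, y1, x2, y2); intersection over union.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def precision_recall(preds, truths, thr=0.5):
    """Greedy one-to-one matching of predictions to ground truth."""
    matched, tp = set(), 0
    for p in preds:
        for i, t in enumerate(truths):
            if i not in matched and iou(p, t) >= thr:
                matched.add(i)
                tp += 1
                break
    fp = len(preds) - tp   # unmatched predictions
    fn = len(truths) - tp  # missed ground-truth eggs
    return tp / (tp + fp), tp / (tp + fn)

# Hypothetical boxes: two correct detections, one false alarm.
preds = [(0, 0, 10, 10), (20, 20, 30, 30), (50, 50, 60, 60)]
truths = [(1, 1, 11, 11), (20, 20, 30, 30)]
prec, rec = precision_recall(preds, truths)
```

mAP additionally averages precision over recall levels (and often over IoU thresholds), which is why it can differ from the single-threshold precision shown here.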
Affiliation(s)
- Sachin Subedi
- Department of Poultry Science, University of Georgia, Athens, GA 30602, USA
- Ramesh Bist
- Department of Poultry Science, University of Georgia, Athens, GA 30602, USA
- Xiao Yang
- Department of Poultry Science, University of Georgia, Athens, GA 30602, USA
- Lilong Chai
- Department of Poultry Science, University of Georgia, Athens, GA 30602, USA
5. Keypoint Detection for Injury Identification during Turkey Husbandry Using Neural Networks. Sensors (Basel) 2022; 22:5188. [PMID: 35890870] [PMCID: PMC9319281] [DOI: 10.3390/s22145188]
Abstract
Injurious pecking against conspecifics is a serious problem in turkey husbandry. Bloody injuries act as a trigger that induces further pecking, so timely detection and intervention can prevent severe animal welfare impairments and costly losses. The overarching aim is therefore to develop a camera-based system that monitors the flock and detects injuries using neural networks. In a preliminary study, images of turkeys were annotated by labelling potential injuries, and these were used to train a network for injury detection. Here, we applied a keypoint detection model to provide more information on animal position and to indicate injury location. To this end, seven turkey keypoints were defined, and 244 images (showing 7660 birds) were manually annotated. Two state-of-the-art pose estimation approaches were adapted and their results compared. Subsequently, the better-performing keypoint detection model (HRNet-W48) was combined with the segmentation model for injury detection. For example, individual injuries were classified using "near tail" or "near head" labels. In summary, keypoint detection showed good results and could clearly differentiate between individual animals even in crowded scenes.
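Labelling an injury "near head" or "near tail" can be sketched as assigning each detected injury to its closest keypoint on the same bird. This is a generic nearest-neighbor sketch, not the paper's implementation; the coordinates and the two-keypoint subset are hypothetical (the paper defines seven turkey keypoints):

```python
import math

def nearest_keypoint(injury_xy, keypoints):
    """Label an injury by the closest detected keypoint.

    keypoints: dict mapping keypoint name -> (x, y) in image coordinates.
    """
    name = min(keypoints, key=lambda k: math.dist(injury_xy, keypoints[k]))
    return f"near {name}"

# Hypothetical detections for a single bird.
kps = {"head": (10, 10), "tail": (80, 40)}
label = nearest_keypoint((12, 14), kps)
```

In practice the keypoints would first be grouped per animal by the pose model, so an injury is only compared against keypoints of the bird it lies on.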