1
Yu H, Che M, Yu H, Ma Y. Research on weed identification in soybean fields based on the lightweight segmentation model DCSAnet. Front Plant Sci 2023; 14:1268218. PMID: 38116146; PMCID: PMC10728600; DOI: 10.3389/fpls.2023.1268218. Received 07/27/2023; accepted 11/08/2023.
Abstract
Weeds compete with crops for sunlight, water, space, and nutrients, which affects crop growth. In recent years, self-driving agricultural equipment and robots have been used for weeding, and drones have been used to identify weeds and spray them with herbicides; the effectiveness of these mobile weeding devices is largely limited by their weed detection capability. To improve the weed detection capability of mobile weed control devices, this paper proposes DCSAnet, a lightweight weed segmentation network suited to deployment on such devices. The network uses an encoder-decoder structure, with the DCA module as the main feature extraction module. The DCA module is built on the inverted residual structure of MobileNetV3, effectively combines asymmetric convolution with depthwise separable convolution, and uses a channel shuffle strategy to increase the randomness of feature extraction. In the decoding stage, feature fusion uses the high-dimensional feature map to guide the aggregation of low-dimensional feature maps, reducing feature loss during fusion and increasing model accuracy. To validate the network on the weed segmentation task, we collected a soybean field weed dataset containing a large number of weeds and crops and used it to conduct an experimental study of DCSAnet. The results show that DCSAnet achieves an MIoU of 85.95% with only 0.57 M parameters, the highest segmentation accuracy among the lightweight networks compared, demonstrating the model's effectiveness for weed segmentation.
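The channel shuffle strategy the abstract mentions can be sketched as a reshape-transpose-reshape over the channel axis, as in ShuffleNet-style blocks. This is an illustrative NumPy version, not the authors' implementation:

```python
import numpy as np

def channel_shuffle(x: np.ndarray, groups: int) -> np.ndarray:
    """Interleave channels across groups (ShuffleNet-style).

    x: feature map of shape (N, C, H, W); C must be divisible by `groups`.
    """
    n, c, h, w = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    # (N, C, H, W) -> (N, groups, C//groups, H, W) -> swap the two
    # channel axes -> flatten back to (N, C, H, W)
    x = x.reshape(n, groups, c // groups, h, w)
    x = x.transpose(0, 2, 1, 3, 4)
    return x.reshape(n, c, h, w)

# Example: 4 channels in 2 groups come out in the order 0, 2, 1, 3,
# so channels from different groups can mix in the next grouped conv.
feat = np.arange(4).reshape(1, 4, 1, 1) * np.ones((1, 4, 2, 2))
shuffled = channel_shuffle(feat, groups=2)
print(shuffled[0, :, 0, 0])  # [0. 2. 1. 3.]
```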
Affiliation(s)
- Helong Yu
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Minghang Che
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Han Yu
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Yuntao Ma
- College of Land Science and Technology, China Agricultural University, Beijing, China
3
Guo Z, Goh HH, Li X, Zhang M, Li Y. WeedNet-R: a sugar beet field weed detection algorithm based on enhanced RetinaNet and context semantic fusion. Front Plant Sci 2023; 14:1226329. PMID: 37560032; PMCID: PMC10408303; DOI: 10.3389/fpls.2023.1226329. Received 05/21/2023; accepted 06/26/2023.
Abstract
Accurate and dependable weed detection is a prerequisite for autonomous weeding by weed control robots. Because of the complexity of the farmland environment and the resemblance between crops and weeds, detecting weeds in the field under natural conditions is difficult. Compared with conventional methods, existing deep learning-based weed detection approaches often suffer from monotonous detection scenes, a lack of image samples and location information for detected objects, and low detection accuracy. To address these issues, we propose WeedNet-R, a vision-based network for weed identification and localization in sugar beet fields. WeedNet-R adds several context modules to RetinaNet's neck to combine context information from multiple feature maps and thereby expand the effective receptive field of the entire network. During training, a learning rate schedule combining an untuned exponential warmup with cosine annealing is used. As a result, the proposed detector is more accurate without a considerable increase in model parameters. WeedNet-R was trained and evaluated on the OD-SugarBeets dataset, created by manually adding bounding box labels to the publicly available SugarBeet2016 agricultural dataset. Compared with the original RetinaNet, the mAP of WeedNet-R on weed detection in sugar beet fields increased by 4.65% to 92.30%; its average precision for weed and sugar beet is 85.70% and 98.89%, respectively. WeedNet-R outperforms other sophisticated object detection algorithms in detection accuracy while matching other single-stage detectors in detection speed.
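The warmup-plus-annealing schedule the abstract describes could look like the sketch below. The exact warmup formula and all hyperparameter values here are assumptions for illustration, not taken from the paper:

```python
import math

def lr_at_step(step: int, total_steps: int, warmup_steps: int,
               base_lr: float = 1e-3, min_lr: float = 1e-5) -> float:
    """Exponential warmup followed by cosine annealing (illustrative).

    During warmup the rate approaches base_lr exponentially; afterwards
    it decays to min_lr along a half cosine.
    """
    if step < warmup_steps:
        # assumed exponential warmup: rises toward base_lr, no peak tuning
        return base_lr * (1.0 - math.exp(-step / warmup_steps))
    # cosine annealing over the remaining steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (base_lr - min_lr) * (1.0 + math.cos(math.pi * progress))

schedule = [lr_at_step(s, total_steps=1000, warmup_steps=100) for s in range(1000)]
print(f"start={schedule[0]:.2e} peak={max(schedule):.2e} end={schedule[-1]:.2e}")
```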
Affiliation(s)
- Zhiqiang Guo
- School of Electrical Engineering, Guangxi University, Nanning, China
- Hui Hwang Goh
- School of Electrical Engineering, Guangxi University, Nanning, China
- Xiuhua Li
- School of Electrical Engineering, Guangxi University, Nanning, China
- Guangxi Key Laboratory of Sugarcane Biology, Guangxi University, Nanning, China
- Muqing Zhang
- Guangxi Key Laboratory of Sugarcane Biology, Guangxi University, Nanning, China
- Yong Li
- School of Electrical Engineering, Guangxi University, Nanning, China
4
Mu Y, Ni R, Fu L, Luo T, Feng R, Li J, Pan H, Wang Y, Sun Y, Gong H, Guo Y, Hu T, Bao Y, Li S. DenseNet weed recognition model combining local variance preprocessing and attention mechanism. Front Plant Sci 2023; 13:1041510. PMID: 36714726; PMCID: PMC9877626; DOI: 10.3389/fpls.2022.1041510. Received 09/11/2022; accepted 12/13/2022.
Abstract
INTRODUCTION The purpose of this paper is to identify weed species effectively and accurately in crop fields in complex environments, where many kinds of weeds are densely distributed in the detection area. METHODS The paper proposes a local variance preprocessing method for background segmentation and data enhancement, which removes the complex background and redundant information from the data, prevents overfitting, and significantly improves accuracy. Then, building on an optimized DenseNet network, an Efficient Channel Attention (ECA) mechanism is introduced after the convolutional layers to increase the weight of important features, strengthening weed features and suppressing background features. RESULTS Trained on the processed images, the model reaches an accuracy of 97.98%, a large improvement, and its overall performance exceeds that of the DenseNet, VGGNet-16, VGGNet-19, ResNet-50, DANet, DNANet, and U-Net models. DISCUSSION The experimental data show that the proposed model and method are well suited to accurate identification of crop and weed species in complex environments, laying a solid technical foundation for the development of intelligent weeding robots.
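Local variance preprocessing of the kind described in METHODS can be sketched as follows: flat background has near-zero variance in a small neighbourhood, while textured plant regions do not, so thresholding the variance map masks out the background. This is a minimal NumPy illustration under assumed window size and threshold, not the paper's pipeline:

```python
import numpy as np

def local_variance(gray: np.ndarray, k: int = 3) -> np.ndarray:
    """Per-pixel variance over a k x k neighbourhood.

    Builds the k*k shifted copies explicitly and takes the variance along
    the window axis; simple and fine for small images, not optimized.
    """
    pad = k // 2
    padded = np.pad(gray.astype(float), pad, mode="edge")
    h, w = gray.shape
    windows = np.stack([padded[i:i + h, j:j + w]
                        for i in range(k) for j in range(k)])
    return windows.var(axis=0)

def mask_background(gray: np.ndarray, thresh: float) -> np.ndarray:
    """Keep only high-texture (plant-like) pixels; flat background is zeroed."""
    return np.where(local_variance(gray) > thresh, gray, 0)

# A flat background (variance 0) is removed; the textured patch survives.
img = np.zeros((8, 8))
img[3:6, 3:6] = [[10, 200, 10], [200, 10, 200], [10, 200, 10]]
out = mask_background(img, thresh=100.0)
```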
Affiliation(s)
- Ye Mu
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Jilin Province Agricultural Internet of Things Technology Collaborative Innovation Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Intelligent Environmental Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Information Technology and Intelligent Agriculture Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Ruiwen Ni
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Jilin Province Agricultural Internet of Things Technology Collaborative Innovation Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Intelligent Environmental Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Information Technology and Intelligent Agriculture Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Lili Fu
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Jilin Province Agricultural Internet of Things Technology Collaborative Innovation Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Intelligent Environmental Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Information Technology and Intelligent Agriculture Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Tianye Luo
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Jilin Province Agricultural Internet of Things Technology Collaborative Innovation Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Intelligent Environmental Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Information Technology and Intelligent Agriculture Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Ruilong Feng
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Jilin Province Agricultural Internet of Things Technology Collaborative Innovation Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Intelligent Environmental Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Information Technology and Intelligent Agriculture Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Ji Li
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Jilin Province Agricultural Internet of Things Technology Collaborative Innovation Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Intelligent Environmental Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Information Technology and Intelligent Agriculture Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Haohong Pan
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Jilin Province Agricultural Internet of Things Technology Collaborative Innovation Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Intelligent Environmental Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Information Technology and Intelligent Agriculture Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Yingkai Wang
- Faculty of Agronomy, Jilin Agricultural University, Changchun, China
- Yu Sun
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Jilin Province Agricultural Internet of Things Technology Collaborative Innovation Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Intelligent Environmental Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Information Technology and Intelligent Agriculture Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- He Gong
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Jilin Province Agricultural Internet of Things Technology Collaborative Innovation Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Intelligent Environmental Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Information Technology and Intelligent Agriculture Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Ying Guo
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Jilin Province Agricultural Internet of Things Technology Collaborative Innovation Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Intelligent Environmental Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Information Technology and Intelligent Agriculture Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Tianli Hu
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Jilin Province Agricultural Internet of Things Technology Collaborative Innovation Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Intelligent Environmental Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Jilin Province Information Technology and Intelligent Agriculture Engineering Research Center, Jilin Agricultural University, Changchun, Jilin, China
- Yu Bao
- School of Management, Changchun University, Changchun, China
- Shijun Li
- College of Information Technology, Wuzhou University, Wuzhou, China
- Guangxi Key Laboratory of Machine Vision and Intelligent Control, Wuzhou University, Wuzhou, Guangxi, China
5
Zhu H, Zhang Y, Mu D, Bai L, Zhuang H, Li H. YOLOX-based blue laser weeding robot in corn field. Front Plant Sci 2022; 13:1017803. PMID: 36407588; PMCID: PMC9674089; DOI: 10.3389/fpls.2022.1017803. Received 08/12/2022; accepted 09/05/2022.
Abstract
A YOLOX convolutional neural network-based weeding robot was designed for weed removal in corn seedling fields, and the feasibility of a blue laser as a non-contact weeding tool was verified. The robot comprises a tracked mobile platform module, a weed identification module, and a robotic arm laser emitter module. A five-degree-of-freedom robotic arm was designed according to the actual weeding operation requirements to achieve precise alignment of the laser. In operation, the robot uses the texture and shape of the plants to differentiate weeds from corn seedlings, calculates the coordinates of the weeds by monocular ranging based on the triangle similarity principle, and controls the end effector of the robotic arm to emit the laser and kill the weeds. At a driving speed of 0.2 m·s⁻¹ on flat ground, the robot's average detection rates for corn seedlings and weeds were 92.45% and 88.94%, respectively. The average weed dry-weight control efficacy was 85%, and the average seedling injury rate was 4.68%. The results show that the robot can accurately detect weeds in corn fields, that the robotic arm can precisely align with the weed position, and that the blue laser is effective in removing weeds.
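The triangle-similarity ranging step the abstract mentions reduces to distance = f · W / w for a pinhole camera, where f is the focal length in pixels, W the known real-world width of the object, and w its width in the image. A minimal sketch with made-up numbers (the paper's calibration values are not given here):

```python
def monocular_distance(focal_px: float, real_width_m: float,
                       pixel_width: float) -> float:
    """Triangle-similarity ranging: distance = f * W / w.

    focal_px: focal length in pixels; real_width_m: known object width (m);
    pixel_width: object width in the image (px). Values are illustrative.
    """
    return focal_px * real_width_m / pixel_width

# A weed 0.05 m wide spanning 50 px under an 800 px focal length
d = monocular_distance(focal_px=800, real_width_m=0.05, pixel_width=50)
print(d)  # 0.8 (metres)
```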
Affiliation(s)
- Huibin Zhu
- College of Modern Agricultural Engineering, Kunming University of Science and Technology, Kunming, China
- Yuanyuan Zhang
- College of Modern Agricultural Engineering, Kunming University of Science and Technology, Kunming, China
- Danlei Mu
- College of Modern Agricultural Engineering, Kunming University of Science and Technology, Kunming, China
- Lizhen Bai
- College of Modern Agricultural Engineering, Kunming University of Science and Technology, Kunming, China
- Hao Zhuang
- College of Modern Agricultural Engineering, Kunming University of Science and Technology, Kunming, China
- Hui Li
- Shandong Academy of Agricultural Machinery Science, Jinan, China
6
Yu H, Men Z, Bi C, Liu H. Research on Field Soybean Weed Identification Based on an Improved UNet Model Combined With a Channel Attention Mechanism. Front Plant Sci 2022; 13:890051. PMID: 35783959; PMCID: PMC9240479; DOI: 10.3389/fpls.2022.890051. Received 03/05/2022; accepted 05/16/2022.
Abstract
To address the difficulty of identifying two types of weeds, grass weeds and broadleaf weeds, in complex field environments, this paper proposes a semantic segmentation method based on an improved UNet structure with an embedded SE channel attention module. First, to eliminate the semantic gap between low-dimensional and high-dimensional semantic features, the UNet structure is modified according to the characteristics of the different weed types: the feature maps from the first five downsampling stages are restored to the original image size through deconvolution layers, and the final feature map used for prediction is obtained by fusing the upsampled feature map with the feature maps of the first five layers, which contain more low-dimensional semantic information. In addition, ResNet34 is used as the backbone, and the embedded SE channel attention module determines channel weights to enhance useful features and suppress noise. Soybean plants and grass weeds are identified by the network, broadleaf weeds are extracted through digital image morphological processing, and segmented images of soybean plants, grass weeds, and broadleaf weeds are generated. Compared with the standard semantic segmentation models FCN, UNet, and SegNet, the experimental results show that the proposed model performs best overall: its mean intersection over union and mean pixel recognition rate in a complex field environment are 0.9282 and 96.11%, respectively. On the basis of weed classification, the identified weeds are further refined into two types, providing technical support for intelligent precision variable-rate weed spraying.
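The SE (Squeeze-and-Excitation) channel attention the abstract describes squeezes each channel to a scalar by global average pooling, passes the result through two small fully connected layers, and rescales the channels by the resulting weights. A minimal NumPy sketch with random (untrained) weights for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def se_block(x: np.ndarray, w1: np.ndarray, w2: np.ndarray) -> np.ndarray:
    """Squeeze-and-Excitation on a single (C, H, W) feature map.

    w1: (C//r, C) reduction weights; w2: (C, C//r) expansion weights,
    where r is the reduction ratio. Biases omitted for brevity.
    """
    squeeze = x.mean(axis=(1, 2))             # global average pool -> (C,)
    hidden = np.maximum(0.0, w1 @ squeeze)    # FC + ReLU -> (C//r,)
    scale = sigmoid(w2 @ hidden)              # FC + sigmoid -> (C,) in (0, 1)
    return x * scale[:, None, None]           # rescale each channel

rng = rng = np.random.default_rng(0)
C, r = 8, 2
x = rng.standard_normal((C, 16, 16))
w1 = rng.standard_normal((C // r, C))
w2 = rng.standard_normal((C, C // r))
y = se_block(x, w1, w2)  # same shape; each channel scaled by a weight in (0, 1)
```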
Affiliation(s)
- Helong Yu
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Zhibo Men
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Chunguang Bi
- College of Information Technology, Jilin Agricultural University, Changchun, China
- Huanjun Liu
- Northeast Institute of Geography and Agroecology, Chinese Academy of Sciences, Changchun, China