1.
Fryčák T, Fürst T, Koprna R, Špíšek Z, Miřijovský J, Humplík JF. Crop growth dynamics: Fast automatic analysis of LiDAR images in field-plot experiments by specialized software ALFA. PLoS One 2024;19:e0297153. PMID: 38236942; PMCID: PMC10796001; DOI: 10.1371/journal.pone.0297153. Received 2023-09-05; accepted 2023-12-28. Open access.
Abstract
Repeated measurement of crop height to observe plant growth dynamics under real field conditions is a challenging task. Although sensors mounted on UAV systems can collect the data, proper data processing and analysis are the key to reliable results. As there is a need for specialized software for agricultural research and breeding, we present ALFA, a fast algorithm for processing UAV LiDAR-derived point clouds that extracts crop height at many individual cereal field plots at multiple time points. Seven scanning flights were performed over 3 blocks of experimental barley field plots between April and June 2021. The resulting point clouds were processed by the new ALFA algorithm. The software converts point-cloud data into a digital image and extracts the trait of interest: the median crop height at individual field plots. The entire analysis of 144 field plots of dimension 80 × 33 meters measured at 7 time points (approx. 100 million LiDAR points) takes about 3 minutes on a standard PC. The root mean square deviation of the software-computed crop height from manual measurement is 5.7 cm. A logistic growth model is fitted to the measured data by nonlinear regression. The software provides three different visualizations of the crop-height data to enable further analysis of the variability in growth parameters. We show that the presented software is a fast and reliable tool for automatic extraction of plant height from LiDAR images of individual field plots. We offer this tool freely to the scientific community for non-commercial use.
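The logistic fit described in this abstract can be sketched in a few lines of plain Python. The (day, height) pairs below are invented for illustration, and the coarse grid search merely stands in for the proper nonlinear regression the ALFA software performs; none of this code comes from ALFA itself.

```python
import math

def logistic(t, K, r, t0):
    """Logistic curve: K = final height, r = growth rate, t0 = inflection day."""
    return K / (1.0 + math.exp(-r * (t - t0)))

# Hypothetical measurements: 7 flight dates (days after sowing) and the
# median plot height in cm extracted at each date.
days = [20, 30, 40, 50, 60, 70, 80]
height = [5, 12, 30, 55, 75, 85, 88]

def sse(K, r, t0):
    """Sum of squared errors of the model against the measurements."""
    return sum((logistic(t, K, r, t0) - h) ** 2 for t, h in zip(days, height))

# Coarse grid search over plausible parameter ranges, standing in for a
# proper nonlinear least-squares fit.
K, r, t0 = min(
    [(K, r, t0)
     for K in range(80, 101)
     for r in [x / 100 for x in range(5, 31)]
     for t0 in range(35, 61)],
    key=lambda p: sse(*p),
)
print(f"final height ~{K} cm, growth rate {r}/day, inflection day {t0}")
```

Fitting such a curve per plot is what turns the repeated height measurements into comparable growth parameters (final height, growth rate, inflection time).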
Affiliation(s)
- Tadeáš Fryčák
- Department of Mathematical Analysis and Applications of Mathematics, Faculty of Science, Palacký University, Olomouc, Czech Republic
- Tomáš Fürst
- Department of Mathematical Analysis and Applications of Mathematics, Faculty of Science, Palacký University, Olomouc, Czech Republic
- Radoslav Koprna
- Department of Chemical Biology, Faculty of Science, Palacký University, Olomouc, Czech Republic
- Zdeněk Špíšek
- Department of Chemical Biology, Faculty of Science, Palacký University, Olomouc, Czech Republic
- Jakub Miřijovský
- Department of Geoinformatics, Faculty of Science, Palacký University, Olomouc, Czech Republic
- Jan F. Humplík
- Department of Chemical Biology, Faculty of Science, Palacký University, Olomouc, Czech Republic
2.
Bartmiński P, Siłuch M, Kociuba W. The Effectiveness of a UAV-Based LiDAR Survey to Develop Digital Terrain Models and Topographic Texture Analyses. Sensors (Basel) 2023;23:6415. PMID: 37514709; PMCID: PMC10383832; DOI: 10.3390/s23146415. Received 2023-06-15; revised 2023-07-12; accepted 2023-07-13.
Abstract
This study compares data acquired with three LiDAR sensors from different manufacturers: the YellowScan Mapper (YSM), the CHC Navigation AlphaAir 450 airborne LiDAR system (CHC), and the DJI Zenmuse L1 (L1). The same area was surveyed with each laser sensor mounted on a DJI Matrice 300 RTK UAV platform. For the comparison, a diverse test area in the north-western part of the Lublin Province in eastern Poland was selected: a gully system with high vegetation cover. As reference, LiDAR data collected within the ISOK project (acquired for the whole area of Poland) were used. To examine the differences among the acquired data, both the classified point clouds and the DTM products calculated from the point clouds of the individual sensors were compared. The analyses showed that the largest average height differences between the terrain models, exceeding 2.5 m, occurred between the CHC sensor and the reference data. The smallest differences occurred between the L1 sensor and the ISOK data (RMSE of 0.31 m). The use of UAVs to acquire very high resolution data is therefore only applicable locally and must follow very stringent procedures for site preparation as well as for processing the data into a DTM and its derivatives.
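The cell-wise DTM comparison behind figures such as the 0.31 m RMSE can be sketched as follows; the two 3 × 3 elevation grids are made-up values, not data from the study.

```python
import math

# Two small DTMs over the same cells (elevations in metres, invented).
uav_dtm = [[201.2, 201.5, 201.9],
           [201.0, 201.4, 201.8],
           [200.8, 201.1, 201.6]]
ref_dtm = [[201.0, 201.3, 201.6],
           [200.9, 201.2, 201.5],
           [200.7, 201.0, 201.4]]

# Cell-wise height differences, their mean, and the RMSE.
diffs = [u - r for urow, rrow in zip(uav_dtm, ref_dtm)
         for u, r in zip(urow, rrow)]
mean_diff = sum(diffs) / len(diffs)
rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
print(f"mean difference {mean_diff:.3f} m, RMSE {rmse:.3f} m")
```

The mean difference exposes a systematic offset between sensors, while the RMSE also captures the scatter of the differences, which is why the study reports both kinds of statistics.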
Affiliation(s)
- Piotr Bartmiński
- Institute of Earth and Environmental Sciences, Maria Curie-Skłodowska University in Lublin, 20-031 Lublin, Poland
- Marcin Siłuch
- Institute of Earth and Environmental Sciences, Maria Curie-Skłodowska University in Lublin, 20-031 Lublin, Poland
- Waldemar Kociuba
- Institute of Earth and Environmental Sciences, Maria Curie-Skłodowska University in Lublin, 20-031 Lublin, Poland
3.
Zang J, Jin S, Zhang S, Li Q, Mu Y, Li Z, Li S, Wang X, Su Y, Jiang D. Field-measured canopy height may not be as accurate and heritable as believed: evidence from advanced 3D sensing. Plant Methods 2023;19:39. PMID: 37009892; PMCID: PMC10069135; DOI: 10.1186/s13007-023-01012-2. Received 2022-12-31; accepted 2023-03-21.
Abstract
Canopy height (CH) is an important trait for crop breeding and production. The rapid development of 3D sensing technologies sheds new light on high-throughput height measurement, but a systematic comparison of the accuracy and heritability of different 3D sensing technologies is still lacking. Moreover, it is questionable whether field-measured height is as reliable as commonly believed. This study addressed these issues by comparing traditional height measurement with four advanced 3D sensing technologies: terrestrial laser scanning (TLS), backpack laser scanning (BLS), gantry laser scanning (GLS), and digital aerial photogrammetry (DAP). A total of 1920 plots covering 120 varieties were selected for comparison. Cross-comparisons of the data sources were performed to evaluate their performance in CH estimation across different CH, leaf area index (LAI), and growth stage (GS) groups. The results showed that (1) all 3D sensing data sources correlated highly with field measurement (FM) (r > 0.82), and the correlations between the 3D sensing data sources were even higher (r > 0.87); (2) the prediction accuracy between data sources decreased within subgroups of CH, LAI, and GS; and (3) canopy height showed high heritability in all datasets, with the 3D sensing datasets reaching even higher heritability (H2 = 0.79-0.89) than FM (H2 = 0.77). Finally, outliers in the different datasets are analyzed. These results provide novel insights into different methods of canopy height measurement and may ensure the high-quality application of this important trait.
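The heritability figures rest on the standard broad-sense estimate H² = σ²g / (σ²g + σ²e / r) on an entry-mean basis, with variance components taken from a one-way ANOVA over varieties. A minimal sketch with a toy dataset (3 varieties × 4 replicate plots, values invented; the study's own variance-component model may differ):

```python
# Plot heights (cm) for 3 varieties x 4 replicate plots (values invented).
heights = {
    "var_A": [82.0, 84.0, 83.0, 85.0],
    "var_B": [95.0, 93.0, 96.0, 94.0],
    "var_C": [70.0, 72.0, 71.0, 69.0],
}
r = 4                                   # replicates per variety
g = len(heights)                        # number of varieties
grand = sum(sum(v) for v in heights.values()) / (g * r)

# One-way ANOVA mean squares: between varieties and residual.
ms_g = r * sum((sum(v) / r - grand) ** 2 for v in heights.values()) / (g - 1)
ms_e = sum((x - sum(v) / r) ** 2 for v in heights.values() for x in v) / (g * (r - 1))

sigma2_g = (ms_g - ms_e) / r            # genotypic variance component
h2 = sigma2_g / (sigma2_g + ms_e / r)   # entry-mean broad-sense heritability
print(f"H^2 = {h2:.3f}")
```

In this framing, a measurement method with lower random error (smaller ms_e) yields a higher H² for the same genetic signal, which is how a sensing technology can appear "more heritable" than manual field measurement.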
Affiliation(s)
- Jingrong Zang
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Collaborative Innovation Centre for Modern Crop Production Co-Sponsored By Province and Ministry, College of Agriculture, Nanjing Agricultural University, Nanjing, 210095, China
- Shichao Jin
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Collaborative Innovation Centre for Modern Crop Production Co-Sponsored By Province and Ministry, College of Agriculture, Nanjing Agricultural University, Nanjing, 210095, China
- Songyin Zhang
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Collaborative Innovation Centre for Modern Crop Production Co-Sponsored By Province and Ministry, College of Agriculture, Nanjing Agricultural University, Nanjing, 210095, China
- Qing Li
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Collaborative Innovation Centre for Modern Crop Production Co-Sponsored By Province and Ministry, College of Agriculture, Nanjing Agricultural University, Nanjing, 210095, China
- Yue Mu
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Collaborative Innovation Centre for Modern Crop Production Co-Sponsored By Province and Ministry, College of Agriculture, Nanjing Agricultural University, Nanjing, 210095, China
- Ziyu Li
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Collaborative Innovation Centre for Modern Crop Production Co-Sponsored By Province and Ministry, College of Agriculture, Nanjing Agricultural University, Nanjing, 210095, China
- Shaochen Li
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Collaborative Innovation Centre for Modern Crop Production Co-Sponsored By Province and Ministry, College of Agriculture, Nanjing Agricultural University, Nanjing, 210095, China
- Xiao Wang
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Collaborative Innovation Centre for Modern Crop Production Co-Sponsored By Province and Ministry, College of Agriculture, Nanjing Agricultural University, Nanjing, 210095, China
- Yanjun Su
- State Key Laboratory of Vegetation and Environmental Change, Institute of Botany, Chinese Academy of Sciences, Beijing, 100093, China
- Dong Jiang
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Collaborative Innovation Centre for Modern Crop Production Co-Sponsored By Province and Ministry, College of Agriculture, Nanjing Agricultural University, Nanjing, 210095, China
4.
Forero MG, Murcia HF, Méndez D, Betancourt-Lozano J. LiDAR Platform for Acquisition of 3D Plant Phenotyping Database. Plants (Basel) 2022;11:2199. PMID: 36079580; PMCID: PMC9459957; DOI: 10.3390/plants11172199. Received 2022-07-01; revised 2022-07-26; accepted 2022-08-10.
Abstract
Currently, there are no free databases of 3D point clouds and images for seedling phenotyping. This paper therefore describes a platform for seedling scanning with a 3D LiDAR, with which a database was acquired for use in plant phenotyping research. In total, 362 maize seedlings were recorded using an RGB camera and a SICK LMS4121R-13000 laser scanner with angular resolutions of 45° and 0.5°, respectively. The scanned plants are diverse, with heights ranging from less than 10 cm to 40 cm and ages from 7 to 24 days after planting, recorded under different light conditions in an indoor setting. The point clouds were processed to remove noise and imperfections, with a mean absolute precision error of 0.03 cm, and were synchronized with the images and time-stamped. The database includes the raw and processed data along with manually assigned stem and leaf labels. As an example application of the database, a Random Forest classifier was employed to identify seedling parts based on morphological descriptors, achieving an accuracy of 89.41%.
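As a hint of what the morphological descriptors fed to such a classifier can look like, the sketch below computes a PCA-based linearity score for a point neighbourhood: stems tend to form linear structures, leaves near-planar ones. The two synthetic neighbourhoods are invented, and the paper's actual descriptor set is not reproduced here.

```python
import numpy as np

def linearity(points):
    """(l1 - l2) / l1 from the sorted covariance eigenvalues of a neighbourhood."""
    cov = np.cov(np.asarray(points, dtype=float).T)
    l = np.sort(np.linalg.eigvalsh(cov))[::-1]  # l1 >= l2 >= l3
    return (l[0] - l[1]) / l[0]

rng = np.random.default_rng(0)
# Synthetic stem: points along a vertical line with slight jitter.
stem = np.c_[np.zeros(50), np.zeros(50), np.linspace(0, 10, 50)] \
       + rng.normal(0, 0.05, (50, 3))
# Synthetic leaf patch: points spread over a near-horizontal plane.
leaf = np.c_[rng.uniform(0, 5, (50, 2)), rng.normal(0, 0.05, 50)]

print(f"stem linearity {linearity(stem):.2f}, leaf linearity {linearity(leaf):.2f}")
```

A Random Forest would then consume a vector of such per-point (or per-segment) descriptors to decide between the stem and leaf labels provided in the database.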
5.
Illana Rico S, Martínez Gila DM, Cano Marchal P, Gómez Ortega J. Automatic Detection of Olive Tree Canopies for Groves with Thick Plant Cover on the Ground. Sensors (Basel) 2022;22:6219. PMID: 36015987; PMCID: PMC9414240; DOI: 10.3390/s22166219. Received 2022-05-10; revised 2022-06-28; accepted 2022-08-02.
Abstract
Marking tree canopies is an unavoidable step in any study that works with high-resolution aerial images of a fruit tree crop, such as olive trees, taken by a UAV: the extraction of pixel features from these canopies is the first step in building the models whose predictions are compared with the ground truth obtained from measurements made with other types of sensors. Marking these canopies manually is an arduous and tedious process, and the automatic methods that replace it rarely work well for groves with a thick plant cover on the ground. This paper develops a standard method for detecting olive tree canopies in high-resolution aerial images taken by a multispectral camera, regardless of the density of the plant cover between canopies. The method is based on the relative spatial information between canopies. The planting pattern used by the grower is computed and extrapolated using a Delaunay triangulation, and this knowledge is fused with that previously obtained from spectral information. It is shown that minimising a certain function provides an optimal fit of the parameters that define the marking of the trees, yielding promising results of 77.5% recall and 70.9% precision.
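The core idea of exploiting the relative spatial information between canopies can be illustrated with a much simpler stand-in: estimate the planting spacing from the nearest-neighbour distances of detected canopy centres, then flag grid positions with no nearby detection as candidate missed trees. The coordinates are invented, and this sketch does not reproduce the paper's Delaunay-based pattern fitting.

```python
import math

# Detected canopy centres (metres) on a 6 m planting grid, with one tree
# the spectral detection step missed.
centres = [(x * 6.0, y * 6.0) for x in range(4) for y in range(3)]
centres.remove((12.0, 6.0))

def nn_dist(p, pts):
    """Distance from p to its nearest other point."""
    return min(math.dist(p, q) for q in pts if q != p)

# Median nearest-neighbour distance estimates the planting spacing.
spacing = sorted(nn_dist(p, centres) for p in centres)[len(centres) // 2]

# Grid positions with no detected canopy nearby are candidate missed trees.
missing = [(x * spacing, y * spacing)
           for x in range(4) for y in range(3)
           if all(math.dist((x * spacing, y * spacing), c) > spacing / 2
                  for c in centres)]
print(f"estimated spacing {spacing} m, missed canopies at {missing}")
```

This is the sense in which spatial structure complements spectral information: a canopy hidden by ground cover in the spectral channel can still be recovered from the regularity of the planting pattern.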
Affiliation(s)
- Sergio Illana Rico
- Robotics, Automation and Computer Vision Group, Electronic and Automation Engineering Department, University of Jaén, 23071 Jaén, Spain
- Diego Manuel Martínez Gila
- Robotics, Automation and Computer Vision Group, Electronic and Automation Engineering Department, University of Jaén, 23071 Jaén, Spain
- Institute for Olive Orchards and Olive Oils, University of Jaén, 23071 Jaén, Spain
- Pablo Cano Marchal
- Robotics, Automation and Computer Vision Group, Electronic and Automation Engineering Department, University of Jaén, 23071 Jaén, Spain
- Institute for Olive Orchards and Olive Oils, University of Jaén, 23071 Jaén, Spain
- Juan Gómez Ortega
- Robotics, Automation and Computer Vision Group, Electronic and Automation Engineering Department, University of Jaén, 23071 Jaén, Spain
- Institute for Olive Orchards and Olive Oils, University of Jaén, 23071 Jaén, Spain
6.
Shadow Removal from UAV Images Based on Color and Texture Equalization Compensation of Local Homogeneous Regions. Remote Sensing 2022. DOI: 10.3390/rs14112616.
Abstract
Due to imaging and lighting directions, shadows inevitably form in unmanned aerial vehicle (UAV) images. This produces shadowed regions with missing and occluded information, such as color and texture details. Shadow detection and compensation are therefore essential for recovering the information missing from these images. Current methods are mainly aimed at shadows in simple scenes; for UAV remote sensing images with a complex background and multiple shadows, problems such as color distortion or loss of texture information inevitably occur in the compensation result. In this paper, we propose a novel shadow removal algorithm for UAV remote sensing images based on color and texture equalization compensation of local homogeneous regions. First, the UAV imagery is split into blocks by selecting the size of a sliding window; the shadows are enhanced by a new shadow detection index (SDI), and threshold segmentation is applied to obtain the shadow mask. Then, homogeneous regions are extracted with LiDAR intensity and elevation information. Finally, the information of the non-shadow objects in each homogeneous region is used to restore the missing information in the shadow objects of that region. The results show that the average overall accuracy of shadow detection is 98.23% and the average F1 score is 95.84%; the average color difference is 1.891, the average shadow standard deviation index is 15.419, and the average gradient similarity is 0.726. The proposed method thus performs well in both subjective and objective evaluations.
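The equalization compensation step can be illustrated in its simplest single-band form: shift and scale the shadow pixels of a homogeneous region so that their mean and standard deviation match those of the same region's non-shadow pixels. The intensity values are invented, and the paper's multi-band color/texture formulation is not reproduced here.

```python
import math

# Pixel intensities of the same homogeneous land-cover class, inside and
# outside a shadow (single band; the shadow here is an exact darkened copy).
non_shadow = [150, 160, 155, 165, 158, 152]
shadow = [40, 50, 45, 55, 48, 42]

def stats(v):
    """Mean and (population) standard deviation."""
    m = sum(v) / len(v)
    return m, math.sqrt(sum((x - m) ** 2 for x in v) / len(v))

m_ns, s_ns = stats(non_shadow)
m_sh, s_sh = stats(shadow)

# Equalization: match the shadow region's mean and spread to the
# non-shadow statistics of the same homogeneous region.
compensated = [(x - m_sh) * s_ns / s_sh + m_ns for x in shadow]
print([round(c, 1) for c in compensated])
```

Because the toy shadow is an exact darkened copy of the lit pixels, the compensation recovers the non-shadow intensities; restricting the statistics to LiDAR-derived homogeneous regions is what keeps this transfer from mixing different land-cover classes.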