1
Aguilar-Ariza A, Ishii M, Miyazaki T, Saito A, Khaing HP, Phoo HW, Kondo T, Fujiwara T, Guo W, Kamiya T. UAV-based individual Chinese cabbage weight prediction using multi-temporal data. Sci Rep 2023; 13:20122. [PMID: 37978327] [PMCID: PMC10656565] [DOI: 10.1038/s41598-023-47431-y]
Abstract
The use of unmanned aerial vehicles (UAVs) has facilitated crop canopy monitoring, enabling yield prediction by integrating regression models. However, the application of UAV-based data to individual-level harvest weight prediction is limited by the effectiveness of obtaining individual features. In this study, we propose a method that automatically detects and extracts multitemporal individual plant features derived from UAV-based data to predict harvest weight. We acquired data from an experimental field sown with 1196 Chinese cabbage plants, using two cameras (RGB and multi-spectral) mounted on UAVs. First, we used three RGB orthomosaic images and an object detection algorithm to detect more than 95% of the individual plants. Next, we used feature selection methods and five different multi-temporal resolutions to predict individual plant weights, achieving a coefficient of determination (R2) of 0.86 and a root mean square error (RMSE) of 436 g/plant. Furthermore, we achieved predictions with an R2 greater than 0.72 and an RMSE less than 560 g/plant up to 53 days prior to harvest. These results demonstrate the feasibility of accurately predicting individual Chinese cabbage harvest weight using UAV-based data and the efficacy of utilizing multi-temporal features to predict plant weight more than one month prior to harvest.
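The regression step described above can be sketched numerically. The following is a minimal illustration, not the authors' pipeline: it regresses synthetic multi-temporal per-plant features (stand-ins for the UAV-derived features) against harvest weight using ordinary least squares rather than the feature-selection models used in the paper, and scores the fit with R2 and RMSE. All variable names and values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic multi-temporal features per plant: e.g. projected canopy area (m^2)
# at three flight dates plus a mean vegetation index. Purely illustrative.
n_plants = 200
X = rng.uniform(0.05, 0.35, size=(n_plants, 4))
true_w = np.array([3000.0, 2000.0, 4000.0, 1500.0])
y = X @ true_w + 300.0 + rng.normal(0.0, 150.0, n_plants)  # weight in g/plant

# Ordinary least squares with an intercept column.
A = np.hstack([X, np.ones((n_plants, 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef

rmse = float(np.sqrt(np.mean((y - y_hat) ** 2)))
r2 = float(1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - np.mean(y)) ** 2))
print(f"R2 = {r2:.2f}, RMSE = {rmse:.0f} g/plant")
```

With real data, the feature matrix would come from the detected individual plants at each flight date, and a non-linear learner could replace the least-squares fit.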
Affiliation(s)
- Andrés Aguilar-Ariza
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1, Yayoi, Bunkyo-ku, Tokyo, 113-8657, Japan
- Masanori Ishii
- Institute for Sustainable Agro-Ecosystem Services, Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1, Midoricho, Nishitokyo-shi, Tokyo, 188-0002, Japan
- Toshio Miyazaki
- Nippon Norin Seed Co., 6-6-5 Takinogawa, Kita-ku, Tokyo, 114-0023, Japan
- Aika Saito
- Nippon Norin Seed Co., 6-6-5 Takinogawa, Kita-ku, Tokyo, 114-0023, Japan
- Hnin Wint Phoo
- Nippon Norin Seed Co., 6-6-5 Takinogawa, Kita-ku, Tokyo, 114-0023, Japan
- Tomohiro Kondo
- Nippon Norin Seed Co., 6-6-5 Takinogawa, Kita-ku, Tokyo, 114-0023, Japan
- Toru Fujiwara
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1, Yayoi, Bunkyo-ku, Tokyo, 113-8657, Japan
- Wei Guo
- Institute for Sustainable Agro-Ecosystem Services, Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1, Midoricho, Nishitokyo-shi, Tokyo, 188-0002, Japan
- Takehiro Kamiya
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1, Yayoi, Bunkyo-ku, Tokyo, 113-8657, Japan
2
Abebe AM, Kim Y, Kim J, Kim SL, Baek J. Image-Based High-Throughput Phenotyping in Horticultural Crops. Plants (Basel) 2023; 12:2061. [PMID: 37653978] [PMCID: PMC10222289] [DOI: 10.3390/plants12102061]
Abstract
Plant phenotyping is the primary task of any plant breeding program, and accurate measurement of plant traits is essential to select genotypes with better quality, high yield, and climate resilience. The majority of currently used phenotyping techniques are destructive and time-consuming. Recently, the development of various sensors and imaging platforms for rapid and efficient quantitative measurement of plant traits has become the mainstream approach in plant phenotyping studies. Here, we reviewed the trends of image-based high-throughput phenotyping methods applied to horticultural crops. High-throughput phenotyping is carried out using various types of imaging platforms developed for indoor or field conditions. We highlighted the applications of different imaging platforms in the horticulture sector with their advantages and limitations. Furthermore, the principles and applications of commonly used imaging techniques, visible light (RGB) imaging, thermal imaging, chlorophyll fluorescence, hyperspectral imaging, and tomographic imaging for high-throughput plant phenotyping, are discussed. High-throughput phenotyping has been widely used for phenotyping various horticultural traits, which can be morphological, physiological, biochemical, yield, biotic, and abiotic stress responses. Moreover, high-throughput phenotyping with the help of various optical sensors may lead to the discovery of new phenotypic traits, which need to be explored in the future. We summarized the applications of image analysis for the quantitative evaluation of various traits with several examples of horticultural crops in the literature. Finally, we summarized the current trend of high-throughput phenotyping in horticultural crops and highlighted future perspectives.
Affiliation(s)
- Jeongho Baek
- Department of Agricultural Biotechnology, National Institute of Agricultural Science, Rural Development Administration, Jeonju 54874, Republic of Korea
3
Jang G, Kim DW, Park WP, Kim HJ, Chung YS. Heterogeneity Assessment of Kenaf Breeding Field through Spatial Dependence Analysis on Crop Growth Status Map Derived by Unmanned Aerial Vehicle. Plants (Basel) 2023; 12:1638. [PMID: 37111861] [PMCID: PMC10144067] [DOI: 10.3390/plants12081638]
Abstract
The investigation of quantitative phenotypic traits resulting from the interaction between targeted genotypic traits and environmental factors is essential for breeding selection. Therefore, plot-wise controlled environmental factors must be invariable for accurate identification of phenotypes. However, the assumption of homogeneous variables within an open field is not always valid, and a spatial dependence analysis is required to determine whether site-specific environmental factors exist. In this study, spatial dependence within the kenaf breeding field was assessed in a geo-tagged height map derived from an unmanned aerial vehicle (UAV). Local indicators of spatial autocorrelation (LISA) were applied to the height map using GeoDa software, and the LISA map was generated in order to recognize the existence of kenaf height status clusters. The spatial dependence of the breeding field used in this study appeared in a specific region. The cluster pattern was similar to the terrain elevation pattern of this field and highly correlated with drainage capacity. The cluster pattern could be utilized to design random blocks based on regions that have similar spatial dependence. We confirmed the potential of spatial dependence analysis on a UAV-derived crop growth status map for breeding strategy design on a tight budget.
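The LISA statistic behind the cluster map above is the local Moran's I. The sketch below is a toy numpy implementation under simplifying assumptions (row-standardized rook-contiguity weights, a synthetic 6x6 "height" grid with one high-valued corner), not the GeoDa computation itself; cells inside a homogeneous cluster come out with positive local I.

```python
import numpy as np

# Toy "plant height" raster: a high-valued cluster in one corner of a 6x6 plot grid.
h = np.ones((6, 6))
h[:3, :3] = 3.0

z = (h - h.mean()) / h.std()          # standardized heights
vals = z.ravel()
n_rows, n_cols = h.shape

def rook_neighbors(i, j):
    """Flat indices of the up/down/left/right neighbours inside the grid."""
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < n_rows and 0 <= nj < n_cols:
            yield ni * n_cols + nj

# Local Moran's I_i = z_i * (mean of z over rook neighbours), i.e. row-standardized weights.
local_I = np.empty(vals.size)
for i in range(n_rows):
    for j in range(n_cols):
        nbrs = list(rook_neighbors(i, j))
        local_I[i * n_cols + j] = vals[i * n_cols + j] * np.mean(vals[nbrs])

local_I = local_I.reshape(h.shape)
print(local_I[0, 0], local_I[5, 5])   # both inside uniform blocks: positive (clustered)
```

A permutation test on each cell (as GeoDa performs) would then separate significant high-high/low-low clusters from noise.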
Affiliation(s)
- Gyujin Jang
- Department of Biosystems Engineering, Seoul National University, Seoul 08826, Republic of Korea
- Integrated Major in Global Smart Farm, Seoul National University, Seoul 08826, Republic of Korea
- Dong-Wook Kim
- Department of Biosystems Engineering, Seoul National University, Seoul 08826, Republic of Korea
- Won-Pyo Park
- Department of Plant Resources and Environment, Jeju National University, Jeju 63243, Republic of Korea
- Hak-Jin Kim
- Department of Biosystems Engineering, Seoul National University, Seoul 08826, Republic of Korea
- BrainKorea21 Global Smart Farm Educational Research Center, Seoul National University, Seoul 08826, Republic of Korea
- Yong-Suk Chung
- Department of Plant Resources and Environment, Jeju National University, Jeju 63243, Republic of Korea
- Bio-Resources and Computing Research Center, Jeju National University, Jeju 63243, Republic of Korea
4
Ventura D, Napoleone F, Cannucci S, Alleaume S, Valentini E, Casoli E, Burrascano S. Integrating low-altitude drone based-imagery and OBIA for mapping and manage semi natural grassland habitats. J Environ Manage 2022; 321:115723. [PMID: 35994965] [DOI: 10.1016/j.jenvman.2022.115723]
Abstract
Monitoring semi-natural grasslands is difficult and costly because they occur in highly dynamic and extremely complex habitat mosaics. We combined the use of a low-cost unmanned aerial vehicle (UAV) and Structure from Motion (SfM) photogrammetry to acquire high spatial resolution (∼1.5 cm pixel) RGB imagery. After image classification through Object-Based Image Analysis (OBIA), we were able to accurately distinguish three semi-natural grassland types, one of which is a habitat of conservation concern. The use of orthomosaics, digital elevation models (DEMs), and canopy height models (CHMs) yielded excellent overall classification accuracies (>89%) assessed through both remotely validated and ground-truthed points. We identified two layers of woody vegetation with a user's (UA) and producer's (PA) accuracies >73% and three grassland types: closed grassland (UA = 94%; PA = 97%), open grassland habitat (UA = 97%; PA = 93%) and open grasslands with soil erosion (UA = 96%; PA = 98%). The grassland types differed substantially in the cover of vegetation, rocks, stones, and bare soil measured in the field, as well as in the number and relative cover of the habitat diagnostic species. The proposed methodology is highly promising for mapping and monitoring semi-natural grassland of conservation concern in support of tailored management actions.
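The user's and producer's accuracies quoted above are standard derivations from a classification confusion matrix. As a hedged illustration (the counts below are made up, not the study's data), PA is the per-class recall along the matrix rows and UA the per-class precision along the columns:

```python
import numpy as np

# Hypothetical 3-class confusion matrix (rows = reference, cols = predicted)
# for three grassland types; counts are purely illustrative.
cm = np.array([
    [97, 2, 1],    # closed grassland
    [3, 93, 4],    # open grassland habitat
    [1, 1, 98],    # open grassland with soil erosion
])

producers_acc = np.diag(cm) / cm.sum(axis=1)   # PA: correct / reference total (row)
users_acc = np.diag(cm) / cm.sum(axis=0)       # UA: correct / predicted total (column)
overall_acc = np.trace(cm) / cm.sum()

print("PA:", np.round(producers_acc, 2))
print("UA:", np.round(users_acc, 2))
print("OA:", round(float(overall_acc), 3))
```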
Affiliation(s)
- Daniele Ventura
- Department of Environmental Biology, Sapienza University of Rome, P.le Aldo Moro 5, 00185, Rome, Italy
- Francesca Napoleone
- Department of Environmental Biology, Sapienza University of Rome, P.le Aldo Moro 5, 00185, Rome, Italy
- Silvia Cannucci
- Department of Life Sciences, University of Siena, Via P.A. Mattioli 4, 53100, Siena, Italy
- Samuel Alleaume
- UMR TETIS, Univ. Montpellier, AgroParisTech, CIRAD, CNRS, INRAE, Montpellier, France
- Emiliana Valentini
- Institute of Polar Sciences of the Italian National Research Council (ISP - CNR), Rome, Italy
- Edoardo Casoli
- Department of Environmental Biology, Sapienza University of Rome, P.le Aldo Moro 5, 00185, Rome, Italy
- Sabina Burrascano
- Department of Environmental Biology, Sapienza University of Rome, P.le Aldo Moro 5, 00185, Rome, Italy
5
Real-Time Georeferencing of Fire Front Aerial Images Using Iterative Ray-Tracing and the Bearings-Range Extended Kalman Filter. Sensors (Basel) 2022; 22:1150. [PMID: 35161894] [PMCID: PMC8838670] [DOI: 10.3390/s22031150]
Abstract
Although aerial vehicle images are a viable tool for observing large-scale patterns of fires and their impacts, their application is limited by the complex optical georeferencing procedure due to the lack of distinctive visual features in forest environments. For this reason, an exploratory study on rough and flat terrains was conducted to use and validate the Iterative Ray-Tracing method in combination with a Bearings-Range Extended Kalman Filter as a real-time forest fire georeferencing and filtering algorithm on images captured by an aerial vehicle. The Iterative Ray-Tracing method requires a vehicle equipped with a Global Positioning System (GPS), an Inertial Measurement Unit (IMU), a calibrated camera, and a Digital Elevation Map (DEM). The proposed method receives the real-time input of the GPS, the IMU, and the image coordinates of the pixels to georeference (computed by a companion fire front detection algorithm) and outputs the geographical coordinates corresponding to those pixels. The Unscented Transform B is proposed to characterize the Iterative Ray-Tracing uncertainty. A Bearings-Range filter measurement model is introduced in a sequential filtering architecture to reduce the noise in the measurements, assuming static targets. A performance comparison is made between the Bearings-Only and the Bearings-Range observation models, and between the Extended and Cubature Kalman Filters. In simulation studies with ground truth, without filtering we obtained georeferencing Root Mean Squared Errors (RMSE) of 30.7 and 43.4 m for the rough and flat terrains, respectively, while filtering with the proposed Bearings-Range Extended Kalman Filter showed the best results by reducing these RMSEs to 11.7 and 19.8 m, respectively. In addition, the comparison of both filter algorithms showed good performance of the Bearings-Range filter, which was also slightly faster. These experiments based on real data demonstrate the applicability of the proposed methodology for real-time georeferencing of forest fires.
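The core of ray-tracing georeferencing is intersecting a camera pixel's view ray with the DEM. The sketch below is a simplified illustration of that idea only, not the paper's algorithm: it marches along the ray until it drops below a toy analytic terrain (a stand-in for a real DEM raster lookup) and refines the crossing by linear interpolation; the camera pose, ray, and terrain function are all hypothetical.

```python
import numpy as np

def dem_height(x, y):
    """Toy DEM: a gentle east-facing slope (metres). Stand-in for a raster lookup."""
    return 0.05 * x

def georeference(cam_pos, ray_dir, step=1.0, max_iter=10000):
    """March along the (unit) view ray until it passes below the DEM surface,
    then refine the crossing by linear interpolation of the height difference."""
    p_prev = np.asarray(cam_pos, dtype=float)
    d_prev = p_prev[2] - dem_height(p_prev[0], p_prev[1])
    for _ in range(max_iter):
        p = p_prev + step * ray_dir
        d = p[2] - dem_height(p[0], p[1])
        if d <= 0.0:  # the ray crossed the terrain between p_prev and p
            t = d_prev / (d_prev - d)
            return p_prev + t * (p - p_prev)
        p_prev, d_prev = p, d
    raise RuntimeError("ray did not intersect the DEM")

cam = np.array([0.0, 0.0, 100.0])    # UAV position (x, y, altitude above datum)
ray = np.array([1.0, 0.0, -1.0])
ray /= np.linalg.norm(ray)           # 45-degree forward-looking pixel ray
hit = georeference(cam, ray)
print(np.round(hit, 2))              # ground point hit by this pixel
```

In the paper's setting, the ray direction comes from the calibrated camera model rotated by the IMU attitude, and the returned point feeds the Bearings-Range Kalman filter as a noisy measurement.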
6
Accuracy Assessment of a UAV Direct Georeferencing Method and Impact of the Configuration of Ground Control Points. Drones 2022. [DOI: 10.3390/drones6020030]
Abstract
Unmanned aerial vehicles (UAVs) can obtain high-resolution topography data flexibly and efficiently at low cost. However, the georeferencing process involves the use of ground control points (GCPs), which limits time and cost effectiveness. Direct georeferencing, using onboard positioning sensors, can significantly improve work efficiency. The purpose of this study was to evaluate the accuracy of the Global Navigation Satellite System (GNSS)-assisted UAV direct georeferencing method and the influence of the number and distribution of GCPs. A FEIMA D2000 UAV was used to collect data, and several photogrammetric projects were established. Among them, the number and distribution of GCPs used in the bundle adjustment (BA) process were varied. Two parameters were considered when evaluating the different projects: the ground-measured checkpoints (CPs) root mean square error (RMSE) and the Multiscale Model to Model Cloud Comparison (M3C2) distance. The results show that the vertical and horizontal RMSE of the direct georeferencing were 0.087 and 0.041 m, respectively. As the number of GCPs increased, the RMSE gradually decreased until a specific GCP density was reached. GCPs should be uniformly distributed in the study area and contain at least one GCP near the center of the domain. Additionally, as the distance to the nearest GCP increased, the local accuracy of the DSM decreased. In general, UAV direct georeferencing has an acceptable positional accuracy level.
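The horizontal and vertical check-point RMSEs reported above are computed from surveyed versus model-derived coordinates. A minimal sketch with invented numbers (the coordinates and error magnitudes below are illustrative, not the study's data):

```python
import numpy as np

# Hypothetical check-point (CP) coordinates: surveyed vs. model-derived (metres).
measured = np.array([
    [100.00, 200.00, 50.00],
    [150.00, 260.00, 52.00],
    [220.00, 310.00, 49.00],
    [300.00, 180.00, 55.00],
])
estimated = measured + np.array([
    [ 0.03, -0.02,  0.08],
    [-0.04,  0.01, -0.09],
    [ 0.02,  0.03,  0.10],
    [-0.01, -0.04, -0.07],
])

err = estimated - measured
rmse_xy = float(np.sqrt(np.mean(err[:, 0] ** 2 + err[:, 1] ** 2)))  # horizontal RMSE
rmse_z = float(np.sqrt(np.mean(err[:, 2] ** 2)))                     # vertical RMSE
print(f"horizontal RMSE = {rmse_xy:.3f} m, vertical RMSE = {rmse_z:.3f} m")
```

As in the study, the vertical component is typically the weaker one for GNSS-assisted direct georeferencing, which the toy numbers above mimic.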
7
Rice Height Monitoring between Different Estimation Models Using UAV Photogrammetry and Multispectral Technology. Remote Sens 2021. [DOI: 10.3390/rs14010078]
Abstract
Unmanned aerial vehicle (UAV) photogrammetry was used to monitor crop height in a flooded paddy field. Three multi-rotor UAVs were utilized to conduct flight missions in order to capture RGB (red-green-blue) and multispectral images, and these images were analyzed using several different models to provide the best results. Two image sets taken by two UAVs, mounted with RGB cameras of the same resolution and Global Navigation Satellite System (GNSS) receivers of different accuracies, were applied to perform photogrammetry. Two methods were then proposed for creating crop height models (CHMs), one of which was denoted as the M1 method and was based on the Digital Surface Point Cloud (DSPC) and the Digital Terrain Point Cloud (DSPT). The other was denoted as the M2 method and was based on the DSPC and a bathymetric sensor. An image set taken by another UAV mounted with a multispectral camera was used for multispectral-based photogrammetry. A Normalized Difference Vegetation Index (NDVI) and a Vegetation Fraction (VF) were then extracted. A new method based on multiple linear regression (MLR) combining the NDVI, the VF, and a Soil Plant Analysis Development (SPAD) value for estimating the measured height (MH) of rice was then proposed and denoted as the M3 method. The results show that the M1 method, the UAV with a GNSS receiver with a higher accuracy, obtained more reliable estimations, while the M2 method, the UAV with a GNSS receiver of moderate accuracy, was actually slightly better. The effect on the performance of CHMs created by the M1 and M2 methods is more negligible in different plots with different treatments; however, remarkably, the more uniform the distribution of vegetation over the water surface, the better the performance. The M3 method, which was created using only a SPAD value and a canopy NDVI value, showed the highest coefficient of determination (R2) for overall MH estimation, 0.838, compared with other combinations.
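A crop height model of the M1 kind reduces, once the point clouds are gridded, to subtracting a terrain raster from a surface raster. The toy sketch below illustrates only that raster subtraction; the grids and values are invented, not the paper's data:

```python
import numpy as np

# Toy rasters (metres): a digital surface model gridded from the canopy point cloud,
# and a terrain model for the near-flat paddy floor. Values are illustrative.
dsm = np.array([
    [12.10, 12.25, 12.30],
    [12.05, 12.40, 12.35],
    [12.00, 12.20, 12.15],
])
dtm = np.full_like(dsm, 11.50)

chm = dsm - dtm                  # crop height model: canopy surface minus terrain
mean_height = float(chm.mean())
print(np.round(chm, 2))
print(f"mean crop height = {mean_height:.2f} m")
```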
8
Luo Y, Zeng Z, Lu H, Lv E. Posture Detection of Individual Pigs Based on Lightweight Convolution Neural Networks and Efficient Channel-Wise Attention. Sensors (Basel) 2021; 21:8369. [PMID: 34960477] [PMCID: PMC8705977] [DOI: 10.3390/s21248369]
Abstract
In this paper, a lightweight channel-wise attention model is proposed for the real-time detection of five representative pig postures: standing, lying on the belly, lying on the side, sitting, and mounting. An optimized compressed block with symmetrical structure is proposed based on model structure and parameter statistics, and efficient channel attention modules are used as a channel-wise mechanism to improve the model architecture. The results show that the algorithm's average precision in detecting standing, lying on the belly, lying on the side, sitting, and mounting is 97.7%, 95.2%, 95.7%, 87.5%, and 84.1%, respectively, and the speed of inference is around 63 ms (CPU = i7, RAM = 8G) per posture image. Compared with state-of-the-art models (ResNet50, Darknet53, CSPDarknet53, MobileNetV3-Large, and MobileNetV3-Small), the proposed model has fewer model parameters and lower computational complexity. The statistical results of the postures (with continuous 24 h monitoring) show that some pigs will eat in the early morning, and the peak of the pigs' feeding appears after the input of new feed, which reflects the health of the pig herd for farmers.
Affiliation(s)
- Yizhi Luo
- College of Engineering, South China Agricultural University, Guangzhou 510642, China
- Zhixiong Zeng
- College of Engineering, South China Agricultural University, Guangzhou 510642, China
- Correspondence: Tel.: +86-20-8528-2860
- Huazhong Lu
- Guangdong Academy of Agricultural Sciences, Guangzhou 510640, China
- Enli Lv
- College of Engineering, South China Agricultural University, Guangzhou 510642, China
9
Tripathi P, Abdullah JS, Kim J, Chung YS, Kim SH, Hamayun M, Kim Y. Investigation of Root Morphological Traits Using 2D-Imaging among Diverse Soybeans (Glycine max L.). Plants (Basel) 2021; 10:2535. [PMID: 34834897] [PMCID: PMC8622990] [DOI: 10.3390/plants10112535]
Abstract
Roots are the most important plant organ for absorbing essential elements such as water and nutrients. To develop new climate-resilient soybean cultivars, it is essential to know the variation in root morphological traits (RMTs) among diverse soybeans in order to select genotypes with superior root attributes. However, root morphological characteristics remain poorly understood due to the difficulty of root data collection and visualization. Thus, to overcome this problem in root research, we used 2-dimensional (2D) root imaging to identify RMTs among diverse soybeans. We assessed RMTs at the vegetative growth stage (V2) of 372 soybean cultivars propagated in polyvinyl chloride pipes. The phenotypic investigation revealed significant variability among the 372 soybean cultivars for RMTs. In particular, RMTs such as the average diameter (AD), surface area (SA), link average length (LAL), and link average diameter (LAD) showed significant variability. In contrast, RMTs such as total length (TL) and link average branching angle (LABA) did not show differences. Furthermore, in the distribution analysis, a normal distribution was observed for all RMTs, although the distribution curves differed among individual RMTs. Based on overall RMT analysis values, the top 5% and bottom 5% ranked genotypes were selected, and genotypes that were most consistent across overall RMTs were ranked accordingly. This ultimately identified four genotypes (IT 16538, IT 199127, IT 165432, IT 165282) ranked in the highest 5%, and nine genotypes (IT 23305, IT 208266, IT 165208, IT 156289, IT 165405, IT 165019, IT 165839, IT 203565, IT 181034) ranked in the lowest 5% for RMTs. Moreover, principal component analysis clustered cultivar 2, cultivar 160, and cultivar 274 into one group with high RMT values, and cultivar 335, cultivar 40, and cultivar 249 into another with low RMT values. The RMT correlation results revealed significantly positive correlations of TL with SA (r = 0.96) and of AD with LAD (r = 0.85). However, a negative correlation (r = −0.43) was observed between TL and AD. Similarly, AD showed a negative correlation (r = −0.22) with SA. These results suggest that TL is a more decisive factor than AD in determining SA.
Affiliation(s)
- Pooja Tripathi
- Department of Applied Biosciences, Kyungpook National University, Daegu 41566, Korea
- Jamila S. Abdullah
- Department of Applied Biosciences, Kyungpook National University, Daegu 41566, Korea
- Jaeyoung Kim
- Department of Plant Resources and Environment, Jeju National University, Jeju 63243, Korea
- Yong-Suk Chung
- Department of Plant Resources and Environment, Jeju National University, Jeju 63243, Korea
- Seong-Hoon Kim
- National Agrobiodiversity Center, National Institute of Agricultural Sciences, RDA, Jeonju 54874, Korea
- Muhammad Hamayun
- Department of Botany, Abdul Wali Khan University, Mardan 23200, Pakistan
- Yoonha Kim
- Department of Applied Biosciences, Kyungpook National University, Daegu 41566, Korea
- Correspondence: Tel.: +82-53-950-5710
10
Wheat Yield Prediction Based on Unmanned Aerial Vehicles-Collected Red–Green–Blue Imagery. Remote Sens 2021. [DOI: 10.3390/rs13152937]
Abstract
Digital red-green-blue (RGB) images collected by unmanned aerial vehicles (UAVs) provide a cost-effective method for precision agriculture applications such as yield prediction. This study aims to fully explore the potential of UAV-collected RGB images in yield prediction of winter wheat by comparing them to multi-source observations, including thermal, structure, and volumetric metrics, and ground-observed leaf area index (LAI) and chlorophyll content under the same level or across different levels of nitrogen fertilization. Color indices are vegetation indices calculated from the vegetation reflectance at visible bands (i.e., red, green, and blue) derived from RGB images. The results showed that some of the color indices collected at the jointing, flowering, and early maturity stages had high correlation (R2 = 0.76–0.93) with wheat grain yield. They gave the highest prediction power (R2 = 0.92–0.93) under four levels of nitrogen fertilization at the flowering stage. In contrast, the other measurements, including canopy temperature, volumetric metrics, and ground-observed chlorophyll content, showed lower correlation (R2 = 0.52–0.85) with grain yield. In addition, thermal information as well as volumetric metrics generally contributed little to the improvement of grain yield prediction when combined with color indices derived from digital images. In particular, LAI had inferior performance to color indices in grain yield prediction within the same level of nitrogen fertilization at the flowering stage (R2 = 0.00–0.40 versus R2 = 0.55–0.68), and color indices provided slightly better prediction of yield than LAI at the flowering stage (R2 = 0.93, RMSE = 32.18 g/m2 versus R2 = 0.89, RMSE = 39.82 g/m2) under all levels of nitrogen fertilization. This study highlights the capabilities of color indices in wheat yield prediction across genotypes, which also indicates the potential of precision agriculture applications using many other flexible, affordable, and easy-to-handle devices, such as mobile phones and near-surface digital cameras, in the future.
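Color indices of the kind used above are simple arithmetic combinations of the RGB digital numbers. The abstract does not list which indices were used, so the sketch below shows two common examples only (excess green, ExG, and the normalized green-red difference index, NGRDI) on an invented canopy patch:

```python
import numpy as np

# Toy RGB digital numbers for a small canopy patch (values are illustrative).
rgb = np.array([
    [[60, 120, 40], [70, 130, 45]],
    [[55, 110, 38], [65, 125, 42]],
], dtype=float)

r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
total = r + g + b
rn, gn, bn = r / total, g / total, b / total   # normalized chromatic coordinates

exg = 2.0 * gn - rn - bn                        # excess green index
ngrdi = (g - r) / (g + r)                       # normalized green-red difference index

print(round(float(exg.mean()), 3), round(float(ngrdi.mean()), 3))
```

Per-plot means of such indices would then be regressed against grain yield, as the study does stage by stage.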
11
Xu Y, Sun Z, Xue X, Gu W, Peng B. A hybrid algorithm based on MOSFLA and GA for multi-UAVs plant protection task assignment and sequencing optimization. Appl Soft Comput 2020. [DOI: 10.1016/j.asoc.2020.106623]
12
Monitoring of Chestnut Trees Using Machine Learning Techniques Applied to UAV-Based Multispectral Data. Remote Sens 2020. [DOI: 10.3390/rs12183032]
Abstract
Phytosanitary conditions can hamper the normal development of trees and significantly impact their yield. The phytosanitary condition of chestnut stands is usually evaluated by sampling trees followed by a statistical extrapolation process, making it a challenging task, as it is labor-intensive and requires skill. In this study, a novel methodology that enables multi-temporal analysis of chestnut stands using multispectral imagery acquired from unmanned aerial vehicles is presented. Data were collected in different flight campaigns along with field surveys to identify the phytosanitary issues affecting each tree. A random forest classifier was trained with sections of each tree crown using vegetation indices and spectral bands. These were first categorized into two classes: (i) absence or (ii) presence of phytosanitary issues. Subsequently, the class with phytosanitary issues was used to identify and classify either biotic or abiotic factors. The comparison between the classification results, obtained by the presented methodology, with ground-truth data, allowed us to conclude that phytosanitary problems were detected with an accuracy rate between 86% and 91%. As for determining the specific phytosanitary issue, rates between 80% and 85% were achieved. Higher accuracy rates were attained in the last flight campaigns, the stage when symptoms are more prevalent. The proposed methodology proved to be effective in automatically detecting and classifying phytosanitary issues in chestnut trees throughout the growing season. Moreover, it is also able to identify decline or expansion situations. It may be of help as part of decision support systems that further improve on the efficient and sustainable management practices of chestnut stands.
13
Assessing the Effect of Real Spatial Resolution of In Situ UAV Multispectral Images on Seedling Rapeseed Growth Monitoring. Remote Sens 2020. [DOI: 10.3390/rs12071207]
Abstract
The spatial resolution of in situ unmanned aerial vehicle (UAV) multispectral images has a crucial effect on crop growth monitoring and image acquisition efficiency. However, existing studies about optimal spatial resolution for crop monitoring are mainly based on resampled images. Therefore, the resampled spatial resolution in these studies might not be applicable to in situ UAV images. In order to obtain the optimal spatial resolution of in situ UAV multispectral images for crop growth monitoring, a RedEdge Micasense 3 camera was installed onto a DJI M600 UAV flying at different heights of 22, 29, 44, 88, and 176 m to capture images of seedling rapeseed with ground sampling distances (GSD) of 1.35, 1.69, 2.61, 5.73, and 11.61 cm, respectively. Meanwhile, the normalized difference vegetation index (NDVI) measured by a GreenSeeker (GS-NDVI) and leaf area index (LAI) were collected to evaluate the performance of nine vegetation indices (VIs) and VI*plant height (PH) at different GSDs for rapeseed growth monitoring. The results showed that the normalized difference red edge index (NDRE) had a better performance for estimating GS-NDVI (R2 = 0.812) and LAI (R2 = 0.717), compared with other VIs. Moreover, when GSD was less than 2.61 cm, the NDRE*PH derived from in situ UAV images outperformed the NDRE for LAI estimation (R2 = 0.757). At coarser GSDs (≥5.73 cm), imprecise PH information and a large heterogeneity within the pixel (revealed by semi-variogram analysis) resulted in a large random error for LAI estimation by NDRE*PH. Furthermore, the image collection and processing time at 1.35 cm GSD was about three times as long as that at 2.61 cm. The results of this study suggest that NDRE*PH from UAV multispectral images with a spatial resolution around 2.61 cm could be a preferential selection for seedling rapeseed growth monitoring, while NDRE alone might perform better for low spatial resolution images.
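The NDRE and NDRE*PH quantities compared above are straightforward band arithmetic. A minimal sketch with invented per-plot reflectances and plant heights (not the study's data):

```python
import numpy as np

# Toy per-plot band reflectances (0-1) and plant heights (m); values are illustrative.
nir = np.array([0.62, 0.55, 0.48, 0.40])
red_edge = np.array([0.30, 0.29, 0.28, 0.27])
plant_height = np.array([0.35, 0.30, 0.24, 0.18])

ndre = (nir - red_edge) / (nir + red_edge)   # normalized difference red edge index
ndre_ph = ndre * plant_height                # structure-augmented index (NDRE*PH)

print(np.round(ndre, 3))
print(np.round(ndre_ph, 3))
```

Either index would then be regressed against GS-NDVI or LAI at each ground sampling distance, which is how the R2 values above were obtained.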
14
Review: Cost-Effective Unmanned Aerial Vehicle (UAV) Platform for Field Plant Breeding Application. Remote Sens 2020. [DOI: 10.3390/rs12060998]
Abstract
Utilization of remote sensing is a new wave of modern agriculture that accelerates plant breeding and research, and the performance of farming practices and farm management. High-throughput phenotyping is a key advanced agricultural technology and has been rapidly adopted in plant research. However, technology adoption is not easy due to cost limitations in academia. This article reviews various commercial unmanned aerial vehicle (UAV) platforms as a high-throughput phenotyping technology for plant breeding. It compares known commercial UAV platforms that are cost-effective and manageable in field settings and demonstrates a general workflow for high-throughput phenotyping, including data analysis. The authors expect this article to create opportunities for academics to access new technologies and utilize the information for their research and breeding programs in more workable ways.
15
Autonomous Mobile Ground Control Point Improves Accuracy of Agricultural Remote Sensing through Collaboration with UAV. INVENTIONS 2020. [DOI: 10.3390/inventions5010012] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/21/2023]
Abstract
Ground control points (GCPs) are critical for agricultural remote sensing that requires georeferencing and calibration of images collected from an unmanned aerial vehicle (UAV) at different times. However, conventional stationary GCPs are time-consuming and labor-intensive to measure, distribute, and collect information from in a large field setup. An autonomous mobile GCP and a strategy for collaboration with the UAV were developed to improve the efficiency and accuracy of the UAV-based data collection process. Prior to actual field testing, preliminary tests demonstrated the system's capability for automatic path tracking, reducing the root mean square error (RMSE) of lateral deviation from 34.3 cm to 15.6 cm with the proposed look-ahead tracking method. The tests also indicated the feasibility of moving reflectance reference panels successively along all the waypoints without detrimental effects on pixel values in the mosaicked images, with percentage errors in digital number values ranging from −1.1% to 0.1%. In the actual field testing, the autonomous mobile GCP successfully cooperated with the UAV in real time without interruption, showing superior performance in georeferencing, radiometric calibration, height calibration, and temperature calibration compared with the conventional method using stationary GCPs.
16
Estimating the Leaf Area Index of Winter Wheat Based on Unmanned Aerial Vehicle RGB-Image Parameters. SUSTAINABILITY 2019. [DOI: 10.3390/su11236829] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
The leaf area index (LAI) is not only an important parameter for monitoring crop growth, but also an important input to crop yield prediction models and hydrological and climatic models. Several studies have recently estimated crop LAI using unmanned aerial vehicle (UAV) multispectral and hyperspectral data. However, there are few studies on estimating the LAI of winter wheat using UAV RGB images. In this study, we estimated the LAI of winter wheat at the jointing stage on farmland in Xinjiang, China, using parameters derived from UAV RGB images. According to gray correlation analysis, UAV RGB-image parameters such as the Visible Atmospherically Resistant Index (VARI), the Red Green Blue Vegetation Index (RGBVI), the digital number (DN) of the blue channel (B), and the Green Leaf Algorithm (GLA) were selected to develop models for estimating the LAI of winter wheat. The results showed that it is feasible to use UAV RGB images to invert and map the LAI of winter wheat at the jointing stage at the field scale, and that the partial least squares regression (PLSR) model based on the VARI, RGBVI, B, and GLA had the best prediction accuracy (R2 = 0.776, root mean square error (RMSE) = 0.468, residual prediction deviation (RPD) = 1.838) among all the regression models. In conclusion, UAV RGB images not only have great potential for estimating the LAI of winter wheat, but can also provide more reliable and accurate data for precision agriculture management.
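The RGB-image parameters named above have simple closed forms. A hedged numpy sketch using the commonly published definitions of these indices (worth verifying against the paper's exact formulas):

```python
import numpy as np

def rgb_indices(r, g, b):
    """VARI, RGBVI and GLA from (normalized) red, green and blue values.
    Formulas follow the commonly published definitions of these indices."""
    r, g, b = (np.asarray(x, dtype=float) for x in (r, g, b))
    vari = (g - r) / (g + r - b)              # Visible Atmospherically Resistant Index
    rgbvi = (g**2 - b * r) / (g**2 + b * r)   # Red Green Blue Vegetation Index
    gla = (2 * g - r - b) / (2 * g + r + b)   # Green Leaf Algorithm
    return vari, rgbvi, gla
```

A PLSR model, as in the study, would then stack these indices (plus the blue-channel DN) as predictors of LAI.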
17
Kim DW, Min TS, Kim Y, Silva RR, Hyun HN, Kim JS, Kim KH, Kim HJ, Chung YS. Sustainable Agriculture by Increasing Nitrogen Fertilizer Efficiency Using Low-Resolution Camera Mounted on Unmanned Aerial Vehicles. INTERNATIONAL JOURNAL OF ENVIRONMENTAL RESEARCH AND PUBLIC HEALTH 2019; 16:ijerph16203893. [PMID: 31615109 PMCID: PMC6843287 DOI: 10.3390/ijerph16203893] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 07/24/2019] [Revised: 10/07/2019] [Accepted: 10/09/2019] [Indexed: 11/23/2022]
Abstract
Nitrogen use efficiency in modern agriculture is very low, meaning that much of the synthetic fertilizer applied is wasted rather than utilized by crops. This causes greater problems where the soil layer is thin and rocky, as on Jeju Island in the Republic of Korea, because excess nitrogen fertilizer can be washed into the groundwater and pollute it. It is therefore important to monitor the nitrogen deficiency of crops in the field so that the right amount of nitrogen can be provided in a timely manner and nitrogen waste limited. To achieve this, the normalized difference vegetation index (NDVI) was used to monitor chlorophyll content, which is tightly associated with nitrogen content, in a buckwheat field. The NDVI was calculated from data obtained by a low-resolution camera mounted on an unmanned aerial vehicle. The results showed that the NDVI can estimate the chlorophyll content of buckwheat. These simple but clear results imply that precision agriculture could be achieved even with a low-resolution camera, in a cost-effective manner, to reduce groundwater pollution.
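The NDVI used here is the standard near-infrared/red band ratio; a minimal sketch:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red band values.
    Healthy vegetation reflects strongly in NIR, pushing NDVI toward 1;
    bare soil or nitrogen-deficient plants give lower values."""
    nir, red = np.asarray(nir, dtype=float), np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)
```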
Affiliation(s)
- Dong-Wook Kim
- Department of Biosystems & Biomaterials Science and Engineering, College of Agriculture and Life Sciences, Seoul National University, Seoul 08826, Korea.
- Tae-Sun Min
- Department of Animal Biotechnology, Jeju National University, Jeju 63243, Korea.
- Yoonha Kim
- Plant Bioscience, School of Applied Biosciences, Kyungpook National University, Daegu 41566, Korea.
- Renato Rodrigues Silva
- Institute of Mathematics and Statistics, Federal University of Goias, Goiania 74001-970, Brazil.
- Hae-Nam Hyun
- Department of Plant Resources and Environment, Jeju National University, Jeju 63243, Korea.
- Ju-Sung Kim
- Department of Plant Resources and Environment, Jeju National University, Jeju 63243, Korea.
- Kyung-Hwan Kim
- National Institute of Agricultural Sciences, Rural Development Administration (RDA), Jeonju 54874, Korea.
- Hak-Jin Kim
- Department of Biosystems & Biomaterials Science and Engineering, College of Agriculture and Life Sciences, Seoul National University, Seoul 08826, Korea.
- Yong Suk Chung
- Department of Plant Resources and Environment, Jeju National University, Jeju 63243, Korea.
18
Multifunctional Ground Control Points with a Wireless Network for Communication with a UAV. SENSORS 2019; 19:s19132852. [PMID: 31252556 PMCID: PMC6651049 DOI: 10.3390/s19132852] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 05/28/2019] [Revised: 06/21/2019] [Accepted: 06/25/2019] [Indexed: 11/16/2022]
Abstract
Ground control points (GCPs) are commonly used for georeferencing in remote sensing. Precise position measurement of GCPs typically requires careful ground surveying, which is time-consuming and labor-intensive and thus excessively costly if it needs to be repeated multiple times in a season. A system of multifunctional GCPs with a wireless network for communication with an unmanned aerial vehicle (UAV) was developed to speed up GCP setup and provide GCP data collection in real time during the flight. During system testing, a single-board computer on a fixed-wing UAV successfully recorded position data from all the GCPs in flight. The multifunctional GCPs were also tested as references for calibrating the reflectance and height of field objects such as crops. The radiometric calibration test resulted in an average reflectance error of 2.0% and a strong relationship (R2 = 0.99) between UAV-based estimates and ground reflectance. Furthermore, the average difference between UAV-based height estimates and ground measurements was within 10 cm.
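Radiometric calibration against reference panels of known reflectance is commonly done with an empirical line fit mapping image digital numbers (DNs) to reflectance. A hedged numpy sketch of that idea (the abstract does not specify the exact calibration model used; the panel values below are illustrative):

```python
import numpy as np

def empirical_line_fit(panel_dns, panel_reflectances):
    """Fit reflectance = gain * DN + offset from reference panels
    of known reflectance, via a degree-1 least-squares polynomial."""
    gain, offset = np.polyfit(np.asarray(panel_dns, dtype=float),
                              np.asarray(panel_reflectances, dtype=float), 1)
    return gain, offset

def dn_to_reflectance(dn, gain, offset):
    """Apply the fitted empirical line to raw image DNs."""
    return gain * np.asarray(dn, dtype=float) + offset
```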
19
Estimation of Ground Surface and Accuracy Assessments of Growth Parameters for a Sweet Potato Community in Ridge Cultivation. REMOTE SENSING 2019. [DOI: 10.3390/rs11121487] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Only a few studies have assessed the accuracy of leaf area index (LAI) and biomass estimation using three-dimensional (3D) models generated by structure-from-motion (SfM) image processing. In this study, sweet potato was grown with different amounts of nitrogen fertilization in ridge cultivation at an experimental farm. Three-dimensional dense point cloud models were constructed from a series of two-dimensional (2D) color images captured by a small unmanned aerial vehicle (UAV) and processed with SfM. Even at the early stage of cultivation, a complex ground surface model for ridge cultivation with vegetation was generated, and the uneven ground surface could be estimated with an accuracy of 1.4 cm. Furthermore, to estimate growth parameters accurately from early growth to the harvest period, a 3D model was constructed that estimated plant height with a root mean square error (RMSE) of 3.3 cm. Using a color index, voxel models were generated and LAIs were estimated with a regression model at an RMSE of 0.123. Further, regression models were used to estimate above-ground and below-ground biomass (tuberous root weight) based on the estimated LAIs.
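The color-index and voxel steps can be sketched as follows (numpy; the Excess Green index and the 5 cm voxel size are illustrative assumptions, as the paper's exact index and voxel size are not given here):

```python
import numpy as np

def excess_green(r, g, b):
    """Excess Green color index; positive values typically indicate vegetation."""
    return 2.0 * g - r - b

def occupied_voxels(points, voxel_m=0.05):
    """Count occupied voxels in an N x 3 point cloud (coordinates in metres)
    by snapping each point to its voxel grid cell and deduplicating."""
    idx = np.floor(np.asarray(points, dtype=float) / voxel_m).astype(int)
    return len({tuple(v) for v in idx})
```

The voxel occupancy count would then serve as a predictor in the LAI regression model.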
20
Lu N, Zhou J, Han Z, Li D, Cao Q, Yao X, Tian Y, Zhu Y, Cao W, Cheng T. Improved estimation of aboveground biomass in wheat from RGB imagery and point cloud data acquired with a low-cost unmanned aerial vehicle system. PLANT METHODS 2019; 15:17. [PMID: 30828356 PMCID: PMC6381699 DOI: 10.1186/s13007-019-0402-3] [Citation(s) in RCA: 49] [Impact Index Per Article: 9.8] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/26/2018] [Accepted: 02/13/2019] [Indexed: 05/18/2023]
Abstract
BACKGROUND Aboveground biomass (AGB) is a widely used agronomic parameter for characterizing crop growth status and predicting grain yield. Rapid, accurate, and non-destructive estimation of AGB is useful for making informed decisions in precision crop management. Previous studies have investigated vegetation indices (VIs) and canopy height metrics derived from unmanned aerial vehicle (UAV) data to estimate the AGB of various crops. However, the input variables were derived either from one type of data or from different sensors on board UAVs. Whether the combination of VIs and canopy height metrics derived from a single low-cost UAV system can improve AGB estimation accuracy remains unclear. This study used a low-cost UAV system to acquire imagery at 30 m flight altitude at critical growth stages of wheat in Rugao, eastern China. The experiments were conducted in 2016 and 2017 and involved 36 field plots representing variations in cultivar, nitrogen fertilization level, and sowing density. We evaluated the performance of VIs, canopy height metrics, and their combination for AGB estimation in wheat with stepwise multiple linear regression (SMLR) and three types of machine learning algorithms (support vector regression, SVR; extreme learning machine, ELM; random forest, RF).
RESULTS Our results demonstrated that the combination of VIs and canopy height metrics improved AGB estimation accuracy in wheat over the use of VIs or canopy height metrics alone. Specifically, RF performed the best among SMLR and the three machine learning algorithms, regardless of whether all the original variables or the variables selected by SMLR were used. The best accuracy (R2 = 0.78, RMSE = 1.34 t/ha, rRMSE = 28.98%) was obtained when applying RF to the combination of VIs and canopy height metrics.
CONCLUSIONS Our findings imply that an inexpensive approach consisting of the RF algorithm and the combination of RGB imagery and point cloud data derived from a consumer-grade, low-cost UAV system can improve the accuracy of AGB estimation and has potential for practical application in the rapid estimation of other growth parameters.
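The best-performing configuration, random forest over stacked VIs and canopy height metrics, can be sketched with synthetic data (scikit-learn; all numbers below are illustrative stand-ins, not the study's data):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 144                                    # hypothetical plot-by-date samples
vis = rng.uniform(0.1, 0.9, (n, 3))        # stand-ins for three vegetation indices
heights = rng.uniform(0.1, 1.0, (n, 2))    # e.g. mean and 90th-percentile canopy height
X = np.hstack([vis, heights])              # combined feature set, as in the study
agb = 5 * vis[:, 0] + 8 * heights[:, 0] + rng.normal(0, 0.3, n)  # synthetic AGB

rf = RandomForestRegressor(n_estimators=200, random_state=0)
r2 = cross_val_score(rf, X, agb, cv=5, scoring="r2").mean()
```

On this synthetic data, dropping either the VI columns or the height columns from `X` and re-running the cross-validation reproduces the qualitative pattern reported above: the combination outperforms either feature group alone.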
Affiliation(s)
- Ning Lu, Jie Zhou, Zixu Han, Dong Li, Qiang Cao, Xia Yao, Yongchao Tian, Yan Zhu, Weixing Cao, Tao Cheng
- National Engineering and Technology Center for Information Agriculture (NETCIA), Key Laboratory for Crop System Analysis and Decision Making, Ministry of Agriculture and Rural Affairs, Jiangsu Key Laboratory for Information Agriculture, Jiangsu Collaborative Innovation Center for Modern Crop Production, Nanjing Agricultural University, One Weigang, Nanjing, 210095 Jiangsu, China (all authors)
21
Abstract
Accurate 3D reconstruction/modelling from unmanned aerial vehicle (UAV)-based imagery has become a key prerequisite in various applications. Although current commercial software has automated the process of image-based reconstruction, a transparent system that can incorporate different user-defined constraints is still preferred by the photogrammetric research community. In this regard, this paper presents a transparent framework for the automated aerial triangulation of UAV images. The proposed framework proceeds in three steps. First, two approaches that take advantage of prior information regarding the flight trajectory are implemented for reliable relative orientation recovery. Then, initial recovery of image exterior orientation parameters (EOPs) is achieved through either an incremental or a global approach. Finally, a global bundle adjustment involving ground control points (GCPs) and check points refines all estimated parameters in the defined mapping coordinate system. Four real image datasets acquired by two different UAV platforms were utilized to evaluate the feasibility of the proposed framework, and a comparative analysis against existing commercial software was performed. The experimental results demonstrate the superior performance of the proposed framework in providing an accurate 3D model, especially when dealing with UAV images containing repetitive patterns and significant image distortions.
22
Abstract
This study aimed to characterize vineyard vegetation through multi-temporal monitoring using a commercial low-cost rotary-wing unmanned aerial vehicle (UAV) equipped with a consumer-grade red/green/blue (RGB) sensor. Ground-truth data and UAV-based imagery were acquired on nine distinct dates, covering the most significant part of the vegetative growing cycle up to the harvesting season, over two selected vineyard plots. The acquired UAV-based imagery underwent photogrammetric processing, resulting in an orthophoto mosaic per flight, which was used for vegetation estimation. Digital elevation models were used to compute crop surface models. By filtering vegetation within a given height range, it was possible to separate grapevine vegetation from other vegetation present in a specific vineyard plot, enabling the estimation of grapevine area and volume. The results showed high accuracy in grapevine detection (94.40%) and low error in grapevine volume estimation (root mean square error of 0.13 m and correlation coefficient of 0.78 for height estimation). The accuracy assessment showed that the proposed method based on UAV RGB imagery is effective and has the potential to become an operational technique. The proposed method also allows the estimation of grapevine areas that could benefit from canopy management operations.
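The height-range filtering step can be sketched on a crop surface model (CSM) grid (numpy; the height bounds and GSD below are illustrative assumptions, not the paper's values):

```python
import numpy as np

def grapevine_area_volume(csm, gsd_m, h_min=0.5, h_max=2.0):
    """Mask CSM cells whose height falls in the grapevine range, then
    accumulate canopy area (m^2) and volume (m^3) over the masked cells."""
    csm = np.asarray(csm, dtype=float)
    mask = (csm >= h_min) & (csm <= h_max)   # drop ground and taller non-vine vegetation
    cell_area = gsd_m ** 2                   # ground area covered by one CSM cell
    area = float(mask.sum()) * cell_area
    volume = float(csm[mask].sum()) * cell_area
    return area, volume
```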
23
Application of UAV Photogrammetric System for Monitoring Ancient Tree Communities in Beijing. FORESTS 2018. [DOI: 10.3390/f9120735] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Ancient tree community surveys have great scientific value for the study of biological resources, plant distribution, environmental change, genetic characteristics of species, and historical and cultural heritage. The largest ancient pear tree communities in China, which are rare, are located in the Daxing District of Beijing. However, the environmental conditions are tough and the distribution is relatively dispersed, so a low-cost, high-efficiency, and high-precision measuring system is urgently needed to complete the survey of ancient tree communities. Through research on a UAV photogrammetric program, methods for extracting ancient tree information, and models for predicting diameter at breast height (DBH) and tree age, the proposed method can measure tree height and crown width and predict DBH and tree age with low cost, high efficiency, and high precision. In experiments, the root mean square error (RMSE) of the tree height measurement was 0.1814 m, the RMSE of the crown width measurement was 0.3292 m, the RMSE of the DBH prediction was 3.0039 cm, and the RMSE of the tree age prediction was 4.3753 years, which could meet the needs of the ancient tree survey of the Daxing District Gardening and Greening Bureau. A UAV photogrammetric measurement system therefore proved capable when applied to the survey of ancient tree communities and even to partial forest inventories.
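The abstract does not specify the form of the DBH and age prediction models; a minimal least-squares sketch of the idea, predicting DBH from UAV-measured height and crown width under an assumed linear form:

```python
import numpy as np

def fit_dbh_model(height_m, crown_m, dbh_cm):
    """Least-squares fit of DBH ~ a*height + b*crown + c.
    The linear form is an assumption for illustration only."""
    X = np.column_stack([height_m, crown_m, np.ones(len(height_m))])
    coef, *_ = np.linalg.lstsq(X, np.asarray(dbh_cm, dtype=float), rcond=None)
    return coef

def predict_dbh(coef, height_m, crown_m):
    """Predict DBH (cm) for new trees from the fitted coefficients."""
    a, b, c = coef
    return a * np.asarray(height_m, dtype=float) + b * np.asarray(crown_m, dtype=float) + c
```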
24
How Far Can Consumer-Grade UAV RGB Imagery Describe Crop Production? A 3D and Multitemporal Modeling Approach Applied to Zea mays. REMOTE SENSING 2018. [DOI: 10.3390/rs10111798] [Citation(s) in RCA: 19] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/17/2022]
Abstract
In recent decades, remote sensing has increasingly been used to estimate the spatio-temporal evolution of crop biophysical parameters such as above-ground biomass (AGB). On a local scale, the advent of unmanned aerial vehicles (UAVs) is a promising trade-off between satellite/airborne and terrestrial remote sensing. This study evaluates the potential of a low-cost UAV RGB solution to predict the final AGB of Zea mays. Besides evaluating the value of 3D data and multitemporality, the study addresses operational questions such as when a combination of two UAV flights should be planned for AGB modeling. In this case study, final AGB prediction performance reached an R2 of 0.55 using only UAV information and an R2 of 0.8 when combining UAV information from a single flight with a single field AGB measurement. Adding UAV height information to the model improved the quality of the AGB prediction, and performing two flights almost systematically improved AGB prediction compared with single flights. The study provides clear insight into how the low spectral resolution of consumer-grade RGB cameras can be countered using height information and multitemporality. The results highlight the importance of the height information that can be derived from UAV data on the one hand, and the lower relative importance of RGB spectral information on the other.