1.
Kvile KØ, Gundersen H, Poulsen RN, Sample JE, Salberg AB, Ghareeb ME, Buls T, Bekkby T, Hancke K. Drone and ground-truth data collection, image annotation and machine learning: A protocol for coastal habitat mapping and classification. MethodsX 2024;13:102935. PMID: 39295629; PMCID: PMC11409010; DOI: 10.1016/j.mex.2024.102935.
Abstract
Aerial drone imaging is an efficient tool for mapping and monitoring coastal habitats at high spatial and temporal resolution. Drone imaging allows time- and cost-efficient mapping of larger areas than traditional mapping and monitoring techniques, while providing more detailed information than airplane or satellite imagery, enabling, for example, differentiation of various types of coastal vegetation. Here, we present a systematic method for shallow-water habitat classification based on drone imagery. The method includes:
- Collection of drone images and creation of orthomosaics.
- Gathering ground-truth data in the field to guide the image annotation and to validate the final map product.
- Annotation of drone images into (potentially hierarchical) habitat classes and training of machine learning algorithms for habitat classification.
As a case study, we present a field campaign that employed these methods to map a coastal site dominated by seagrass, seaweed and kelp, in addition to sediments and rock. Such detailed but efficient mapping and classification can aid the understanding and sustainable management of ecologically valuable marine ecosystems.
Affiliation(s)
- Kristina Øie Kvile, Hege Gundersen, James Edward Sample, Medyan Esam Ghareeb, Trine Bekkby, Kasper Hancke: Norwegian Institute for Water Research (NIVA), Økernveien 94, 0579 Oslo, Norway
- Toms Buls: SpectroFly ApS, Markstien 2, 4640 Faxe, Denmark
2.
Liu Y, Ban S, Wei S, Li L, Tian M, Hu D, Liu W, Yuan T. Estimating the frost damage index in lettuce using UAV-based RGB and multispectral images. Frontiers in Plant Science 2024;14:1242948. PMID: 38239223; PMCID: PMC10794741; DOI: 10.3389/fpls.2023.1242948.
Abstract
Introduction: Cold stress is one of the most important factors affecting production throughout the year, so effectively evaluating frost damage is of great significance for determining frost tolerance in lettuce. Methods: We propose a high-throughput method to estimate the lettuce frost damage index (FDI) based on remote sensing. Red-green-blue (RGB) and multispectral images of open-field lettuce that had suffered frost damage were captured by an unmanned aerial vehicle platform. Pearson correlation analysis was employed to select FDI-sensitive features from the RGB and multispectral images. Models were then established for the FDI-sensitive features, grouped by sensor type and by lettuce color, using multiple linear regression, support vector machine and neural network algorithms. Results and discussion: The digital numbers of the blue and red channels, the spectral reflectance at the blue, red and near-infrared bands, and six vegetation indices (VIs) were significantly related to the FDI of all lettuce groups. The high sensitivity of four modified VIs to frost damage was confirmed across all groups. Combining multisource features improved average model accuracy by 3% to 14%. Lettuce color affected frost damage monitoring by the FDI prediction models, as models based on the green lettuce group were generally more accurate. The MULTISOURCE-GREEN-NN model, with an R2 of 0.715 and an RMSE of 0.014, performed best, providing a high-throughput and efficient tool for frost damage investigation that will assist the identification of cold-resistant green lettuce germplasm and related breeding.
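The screening-plus-regression pipeline this abstract describes (Pearson correlation to select FDI-sensitive features, then a regression model) can be sketched as follows. This is a minimal illustration on synthetic data: the correlation threshold, feature count, and noise level are invented, and numpy's least-squares solver stands in for the study's regression tooling.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 60 plots, 8 candidate image features, and a
# frost damage index (FDI) driven mostly by features 0 and 3.
X = rng.normal(size=(60, 8))
fdi = 0.6 * X[:, 0] - 0.4 * X[:, 3] + rng.normal(scale=0.1, size=60)

# Step 1: Pearson correlation screening - keep features whose absolute
# correlation with the FDI exceeds a chosen threshold (0.3 here).
r = np.array([np.corrcoef(X[:, j], fdi)[0, 1] for j in range(X.shape[1])])
selected = np.flatnonzero(np.abs(r) > 0.3)

# Step 2: multiple linear regression on the selected features
# (ordinary least squares with an intercept column).
A = np.column_stack([np.ones(len(fdi)), X[:, selected]])
coef, *_ = np.linalg.lstsq(A, fdi, rcond=None)
pred = A @ coef

# Goodness of fit (R^2) and RMSE, the metrics reported in the abstract.
ss_res = np.sum((fdi - pred) ** 2)
ss_tot = np.sum((fdi - np.mean(fdi)) ** 2)
r2 = 1 - ss_res / ss_tot
rmse = np.sqrt(np.mean((fdi - pred) ** 2))
```

The same two-step structure applies whichever regressor replaces the final step (support vector machine or neural network in the study).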
Affiliation(s)
- Yiwen Liu, Linyi Li: College of Information Technology, Shanghai Ocean University, Shanghai, China; Institute of Agricultural Science and Technology Information, Shanghai Academy of Agricultural Sciences, Shanghai, China; Key Laboratory of Intelligent Agricultural Technology (Yangtze River Delta), Ministry of Agriculture and Rural Affairs, Shanghai, China
- Songtao Ban, Minglu Tian, Dong Hu, Tao Yuan: Institute of Agricultural Science and Technology Information, Shanghai Academy of Agricultural Sciences, Shanghai, China; Key Laboratory of Intelligent Agricultural Technology (Yangtze River Delta), Ministry of Agriculture and Rural Affairs, Shanghai, China
- Shiwei Wei: Jinshan Experimental Station, Shanghai Agrobiological Gene Center, Shanghai, China
- Weizhen Liu: School of Computer and Artificial Intelligence, Wuhan University of Technology, Wuhan, China
3.
De Silva M, Brown D. Multispectral Plant Disease Detection with Vision Transformer-Convolutional Neural Network Hybrid Approaches. Sensors (Basel) 2023;23:8531. PMID: 37896623; PMCID: PMC10611079; DOI: 10.3390/s23208531.
Abstract
Plant diseases pose a critical threat to global agricultural productivity, demanding timely detection for effective crop yield management. Traditional methods for disease identification are laborious and require specialised expertise. Leveraging cutting-edge deep learning algorithms, this study explores innovative approaches to plant disease identification, combining Convolutional Neural Networks (CNNs) and Vision Transformers (ViTs) to enhance accuracy. A multispectral dataset was collected to facilitate this research using six 50 mm filters, covering the visible range and several near-infrared (NIR) wavelengths. Among the models employed, ViT-B16 achieved the highest test accuracy, precision, recall, and F1 score across all filters, with averages of 83.3%, 90.1%, 90.75%, and 89.5%, respectively. Furthermore, a comparative analysis highlights the pivotal role of balanced datasets in selecting the appropriate wavelength and deep learning model for robust disease identification. These findings promise to advance crop disease management in real-world agricultural applications and contribute to global food security. The study underscores the significance of machine learning in transforming plant disease diagnostics and encourages further research in this field.
4.
Prystay T, Adams G, Favaro B, Gregory R, Le Bris A. The reproducibility of remotely piloted aircraft systems to monitor seasonal variation in submerged seagrass and estuarine habitats. Facets (Ott) 2023. DOI: 10.1139/facets-2022-0149.
Abstract
Seasonal variation in seagrass growth and senescence affects the provision of ecosystem services and restoration efforts, requiring seasonal monitoring. Remotely piloted aircraft systems (RPAS) enable frequent high-resolution surveys at full-meadow scales. However, the reproducibility of RPAS surveys is challenged by varying environmental conditions, which are common in temperate estuarine systems. We surveyed three eelgrass (Zostera marina) meadows in Newfoundland, Canada, using an RPAS equipped with a three-band (red, green, blue [RGB]) camera, to evaluate the seasonal reproducibility of RPAS surveys and assess the effects of flight altitude (30–115 m) on classification accuracy. Habitat percent cover was estimated using supervised image classification and compared to corresponding estimates from snorkel quadrat surveys. Our results revealed inconsistent misclassification due to environmental variability and low spectral separability between habitats, which made it infeasible to differentiate model misclassification from actual changes in seagrass cover. Conflicting estimates of seagrass and macroalgae percent cover relative to snorkel estimates could not be corrected by decreasing the RPAS altitude. Instead, higher-altitude surveys may be worth the trade-off of lower image resolution to avoid environmental conditions shifting mid-survey. We conclude that RPAS surveys using RGB imagery alone may be insufficient to discriminate seasonal changes in estuarine subtidal vegetated habitats.
Affiliation(s)
- T.S. Prystay, G. Adams, A. Le Bris: Centre for Fisheries Ecosystems Research, Fisheries and Marine Institute, Memorial University of Newfoundland, St. John’s, NL A1C 5R3, Canada
- B. Favaro: Faculty of Science and Horticulture, Kwantlen Polytechnic University, Surrey, BC V3W 2M8, Canada
- R.S. Gregory: Fisheries and Oceans Canada, Ecological Sciences Section, Northwest Atlantic Fisheries Centre, St. John’s, NL A1C 5X1, Canada
5.
Jackson J, Lawson CS, Adelmant C, Huhtala E, Fernandes P, Hodgson R, King H, Williamson L, Maseyk K, Hawes N, Hector A, Salguero‐Gómez R. Short-range multispectral imaging is an inexpensive, fast, and accurate approach to estimate biodiversity in a temperate calcareous grassland. Ecol Evol 2022;12:e9623. PMID: 36532135; PMCID: PMC9750811; DOI: 10.1002/ece3.9623.
Abstract
Image sensing technologies are rapidly increasing the cost-effectiveness of biodiversity monitoring efforts. Species differences in the reflectance of electromagnetic radiation can be used as a surrogate to estimate plant biodiversity from multispectral image data. However, these efforts are often hampered by logistical difficulties in broad-scale implementation. Here, we investigate the utility of multispectral imaging technology from commercially available unmanned aerial vehicles (UAVs, or drones) for estimating biodiversity metrics at a fine spatial resolution (0.1-0.5 cm pixel resolution) in a temperate calcareous grassland in Oxfordshire, UK. We calculate a suite of moments (coefficient of variation, standard deviation, skewness, and kurtosis) for the distribution of radiance from multispectral images at five wavelength bands (Blue 450 ± 16 nm; Green 560 ± 16 nm; Red 650 ± 16 nm; Red Edge 730 ± 16 nm; Near Infrared 840 ± 16 nm) and test their effectiveness at estimating ground-truthed biodiversity metrics from in situ botanical surveys of 37 1 × 1 m quadrats. We find positive associations between the average coefficient of variation in spectral radiance and both the Shannon-Wiener and Simpson's biodiversity indices. Furthermore, the average coefficient of variation in spectral radiance is consistent and highly repeatable across sampling days and recording heights. Positive associations with biodiversity indices hold irrespective of the image recording height (2-8 m), but we report reductions in estimates of spectral diversity with increases in UAV recording height. UAV imaging reduced sampling time by a factor of 16 relative to in situ botanical surveys. We demonstrate the utility of multispectral radiance moments as an indicator of biodiversity in this temperate calcareous grassland at a fine spatial resolution using a widely available UAV monitoring system with a coarse spectral resolution. The use of UAV technology with multispectral sensors has far-reaching potential to provide cost-effective and high-resolution monitoring of biodiversity.
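The spectral-moment approach above (per-band coefficient of variation, standard deviation, skewness, and kurtosis, with the band-averaged CV as the biodiversity proxy) can be sketched in numpy. The array shape, gamma-distributed radiance, and band names are invented for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic quadrat: radiance for 5 bands over a 100 x 100 pixel window
# (Blue, Green, Red, Red Edge, NIR), strictly positive values.
bands = ["blue", "green", "red", "red_edge", "nir"]
radiance = rng.gamma(shape=4.0, scale=25.0, size=(5, 100, 100))

def distribution_moments(x):
    """CV, SD, skewness, and excess kurtosis of a radiance sample."""
    x = x.ravel()
    mu, sd = x.mean(), x.std()
    z = (x - mu) / sd
    return {
        "cv": sd / mu,                       # coefficient of variation
        "sd": sd,
        "skewness": np.mean(z ** 3),
        "kurtosis": np.mean(z ** 4) - 3.0,   # excess kurtosis
    }

moments = {b: distribution_moments(radiance[i]) for i, b in enumerate(bands)}

# The biodiversity proxy used in the study is the coefficient of
# variation averaged across the five bands.
mean_cv = np.mean([m["cv"] for m in moments.values()])
```

In the study, `mean_cv` per quadrat would then be regressed against Shannon-Wiener and Simpson's indices from the botanical surveys.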
Affiliation(s)
- John Jackson, Evie Huhtala, Rose Hodgson, Hannah King, Andrew Hector: Department of Biosciences, University of Sheffield, Sheffield, UK
- Clare S. Lawson, Kadmiel Maseyk: School of Environment, Earth & Ecosystem Sciences, The Open University, Milton Keynes, UK
- Nick Hawes: Department of Engineering Science, Oxford Robotics Institute, University of Oxford, Oxford, UK
- Rob Salguero‐Gómez: Department of Biology, University of Oxford, Oxford, UK; Max Planck Institute for Demographic Research, Rostock, Germany
6.
Multisensor UAS Mapping of Plant Species and Plant Functional Types in Midwestern Grasslands. Remote Sensing 2022. DOI: 10.3390/rs14143453.
Abstract
Uncrewed aerial systems (UASs) have emerged as powerful ecological observation platforms capable of filling critical spatial and spectral observation gaps in plant physiological and phenological traits that have been difficult to measure from space-borne sensors. Despite recent technological advances, the high cost of drone-borne sensors limits the widespread application of UAS technology across scientific disciplines. Here, we evaluate the tradeoffs between off-the-shelf and sophisticated drone-borne sensors for mapping plant species and plant functional types (PFTs) within a diverse grassland. Specifically, we compared species and PFT mapping accuracies derived from hyperspectral, multispectral, and RGB imagery fused with light detection and ranging (LiDAR)- or structure-from-motion (SfM)-derived canopy height models (CHMs). Sensor-data fusions considered either a single observation period or near-monthly observation frequencies for the integration of phenological information (i.e., phenometrics). Overall classification accuracies for plant species and PFTs were highest for hyperspectral and LiDAR-CHM fusions (78 and 89%, respectively), followed by multispectral and phenometric-SfM-CHM fusions (52 and 60%, respectively) and RGB and SfM-CHM fusions (45 and 47%, respectively). Our findings demonstrate clear tradeoffs in mapping accuracy between economical and expensive sensor configurations, but highlight that off-the-shelf multispectral sensors may achieve accuracies comparable to those of sophisticated UAS sensors by integrating phenometrics into machine learning image classifiers.
7.
Seaweed Habitats on the Shore: Characterization through Hyperspectral UAV Imagery and Field Sampling. Remote Sensing 2022. DOI: 10.3390/rs14133124.
Abstract
Intertidal macroalgal habitats are major components of temperate coastal ecosystems. Their distribution was studied using field sampling and hyperspectral remote mapping on a rocky shore at Porspoder (western Brittany, France). The covers of the dominating macroalgae and the sessile fauna were characterized in situ at low tide in 24 sampling spots across four bathymetric levels. A zone of ca. 17,000 m2 was characterized using a drone equipped with a hyperspectral camera. Macroalgae were identified by image processing using two classification methods to assess the representativeness of the spectral classes, and the remote imaging data were then compared with the field sampling data. Seven seaweed classes were distinguished in the hyperspectral images, including five species of Fucales. The maximum likelihood classifier (MLC) and spectral angle mapper (SAM) were both trained using image-derived spectra. At the site scale, MLC classified the main dominating species more accurately (overall accuracy (OA) 95.1%) than SAM (OA 87.9%); at the sampling-point scale, the results depended on the bathymetric level. This study evidenced the efficiency and accuracy of hyperspectral remote sensing for evaluating the distribution of dominating intertidal seaweed species and the potential of a combined field/remote approach to assess the ecological state of macroalgal communities.
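The spectral angle mapper (SAM) mentioned in this abstract is a standard classifier that assigns each pixel to the reference spectrum with the smallest angle between spectral vectors. A minimal numpy sketch follows; the 4-band reference spectra and pixel values are invented, not the study's image-derived spectra.

```python
import numpy as np

def spectral_angle_mapper(pixels, references):
    """Classify pixels (n, bands) against reference spectra (k, bands)
    by the angle between spectral vectors; returns (labels, angles)."""
    p = pixels / np.linalg.norm(pixels, axis=1, keepdims=True)
    r = references / np.linalg.norm(references, axis=1, keepdims=True)
    cos = np.clip(p @ r.T, -1.0, 1.0)   # cosine of spectral angle
    angles = np.arccos(cos)             # radians, shape (n, k)
    return angles.argmin(axis=1), angles.min(axis=1)

# Invented 4-band reference spectra for two cover types.
refs = np.array([
    [0.05, 0.08, 0.04, 0.40],   # class 0: e.g. green vegetation
    [0.20, 0.22, 0.25, 0.30],   # class 1: e.g. bare substrate
])

# Pixels that are brightness-scaled versions of the references: SAM is
# insensitive to overall illumination, so scaling should not change labels.
pix = np.array([
    [0.05, 0.08, 0.04, 0.40],
    [0.10, 0.11, 0.125, 0.15],
])
pix = pix * np.array([[2.0], [1.0]])
labels, angles = spectral_angle_mapper(pix, refs)
```

This illumination insensitivity is why SAM is popular for intertidal scenes, where exposure varies with wetness and shadow; the study's higher MLC accuracy suggests it is not always the best choice.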
8.
Yang Q, She B, Huang L, Yang Y, Zhang G, Zhang M, Hong Q, Zhang D. Extraction of soybean planting area based on feature fusion technology of multi-source low-altitude unmanned aerial vehicle images. Ecological Informatics 2022. DOI: 10.1016/j.ecoinf.2022.101715.
9.
Fusion of Drone-Based RGB and Multi-Spectral Imagery for Shallow Water Bathymetry Inversion. Remote Sensing 2022. DOI: 10.3390/rs14051127.
Abstract
Shallow bathymetry inversion algorithms have long been applied to various types of remote sensing imagery with relative success. However, this approach requires imagery with increased radiometric resolution in the visible spectrum. Recent developments in drones and camera sensors allow current inversion techniques to be tested on new types of datasets with centimeter resolution. This study explores the bathymetric mapping capabilities of fused RGB and multispectral imagery as an alternative to costly hyperspectral sensors for drones. Combining drone-based RGB and multispectral imagery into a single cube dataset provides the necessary radiometric detail for shallow bathymetry inversion applications. The technique is based on commercial and open-source software and, in contrast to other approaches, does not require the input of reference depth measurements. The robustness of this method was tested on three coastal sites with contrasting seafloor types and a maximum depth of six meters. The use of suitable end-member spectra, representative of the seafloor types of the study area, is an important parameter in model tuning. The results of this study are promising, showing good correlation (R2 > 0.75 and Lin's coefficient > 0.80) and less than half a meter average error when compared with sonar depth measurements. Consequently, the integration of imagery from various drone-based sensors (visible range) assists in producing detailed bathymetry maps for small-scale shallow areas based on optical modelling.
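For context, the simplest family of optical bathymetry inversions is the Stumpf-style band log-ratio, which exploits the different attenuation of blue and green light with depth. This is emphatically not the method of the paper above (which uses end-member spectra and needs no reference depths); it is a hedged sketch of the baseline idea, with invented reflectance values and the conventional scaling constant n = 1000.

```python
import numpy as np

def log_ratio_depth_proxy(blue, green, n=1000.0):
    """Stumpf-style relative depth proxy: ln(n*blue) / ln(n*green).
    Larger values generally correspond to deeper water; converting the
    proxy to metres requires calibration against known depths."""
    return np.log(n * blue) / np.log(n * green)

# Invented reflectance samples: green attenuates faster than blue with
# depth in clear water, so the ratio rises as the water deepens.
shallow = log_ratio_depth_proxy(np.array([0.12]), np.array([0.10]))
deep = log_ratio_depth_proxy(np.array([0.06]), np.array([0.02]))
```

The drone-imagery fusion in the paper matters precisely because such ratio methods are sensitive to radiometric resolution in the visible bands.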
10.
Underwater Hyperspectral Imaging (UHI): A Review of Systems and Applications for Proximal Seafloor Ecosystem Studies. Remote Sensing 2021. DOI: 10.3390/rs13173451.
Abstract
Marine ecosystem monitoring requires observations of its attributes at different spatial and temporal scales that traditional sampling methods (e.g., RGB imaging, sediment cores) struggle to efficiently provide. Proximal optical sensing methods can fill this observational gap by providing observations of, and tracking changes in, the functional features of marine ecosystems non-invasively. Underwater hyperspectral imaging (UHI) employed in proximity to the seafloor has shown further potential to monitor pigmentation in benthic and sympagic phototrophic organisms at small spatial scales (mm–cm) and to identify minerals and taxa through their finely resolved spectral signatures. Despite the increasing number of studies applying UHI, a review of its applications, capabilities, and challenges for seafloor ecosystem research is overdue. In this review, we first detail how the limited band availability inherent to standard underwater cameras has led to a data analysis “bottleneck” in seafloor ecosystem research, in part due to the widespread implementation of underwater imaging platforms (e.g., remotely operated vehicles, time-lapse stations, towed cameras) that can acquire large image datasets. We discuss how hyperspectral technology brings unique opportunities to address the known limitations of RGB cameras for surveying marine environments. The review concludes by comparing how different studies harness the capacities of hyperspectral imaging, the types of methods required to validate observations, and the current challenges for accurate and replicable UHI research.
11.
RGB Indices and Canopy Height Modelling for Mapping Tidal Marsh Biomass from a Small Unmanned Aerial System. Remote Sensing 2021. DOI: 10.3390/rs13173406.
Abstract
Coastal tidal marshes are essential ecosystems for both economic and ecological reasons. They necessitate regular monitoring as the effects of climate change begin to be manifested in changes to marsh vegetation health. Small unmanned aerial systems (sUAS) build upon previously established remote sensing techniques to monitor a variety of vegetation health metrics, including biomass, with improved flexibility and affordability of data acquisition. The goal of this study was to establish the use of RGB-based vegetation indices for mapping and monitoring tidal marsh vegetation (i.e., Spartina alterniflora) biomass. Flights over tidal marsh study sites were conducted using a multi-spectral camera on a quadcopter sUAS near peak vegetation growth. A number of RGB indices were extracted to build a non-linear biomass model. A canopy height model was developed using sUAS-derived digital surface models and LiDAR-derived digital terrain models to assess its contribution to the biomass model. The distance-based RGB indices outperformed the regular ratio-based indices in coastal marshes. The best-performing biomass models used the triangular greenness index (TGI; R2 = 0.39) and excess green index (ExG; R2 = 0.376). The estimated biomass revealed high biomass predictions at the fertilized marsh plots of the Long-Term Research in Environmental Biology (LTREB) project at the study site. The sUAS-extracted canopy height was not statistically significant in biomass estimation but showed explanatory power similar to other studies. Due to the lack of biomass samples in the inner estuary, the proposed biomass model performs less well in the low marsh than in the high marsh, which is close to shore and accessible for biomass sampling. Further research on the low marsh is required to better understand the best conditions for S. alterniflora biomass estimation using sUAS as an on-demand, personal remote sensing tool.
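The two best-performing indices in this abstract, TGI and ExG, are simple per-pixel arithmetic on RGB values. A sketch with invented pixel values follows; the TGI form below is a common simplification with the band-wavelength coefficients folded into constants, and may differ in detail from the study's exact formulation.

```python
import numpy as np

def tgi(r, g, b):
    """Triangular greenness index from RGB digital numbers (a common
    simplified formulation: G - 0.39*R - 0.61*B)."""
    return g - 0.39 * r - 0.61 * b

def exg(r, g, b):
    """Excess green index on chromatic coordinates: 2g - r - b."""
    total = r + g + b
    rn, gn, bn = r / total, g / total, b / total
    return 2.0 * gn - rn - bn

# Invented pixels: a green marsh-canopy pixel vs. a grey mudflat pixel.
veg = np.array([60.0, 140.0, 50.0])   # R, G, B digital numbers
mud = np.array([120.0, 118.0, 115.0])

tgi_veg, tgi_mud = tgi(*veg), tgi(*mud)
exg_veg, exg_mud = exg(*veg), exg(*mud)
```

Both indices separate green canopy from grey substrate; in the study they were then fed into a non-linear regression against clipped-biomass samples.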
12.
Missing the Forest and the Trees: Utility, Limits and Caveats for Drone Imaging of Coastal Marine Ecosystems. Remote Sensing 2021. DOI: 10.3390/rs13163136.
Abstract
Coastal marine ecosystems are under stress, yet actionable information about the cumulative effects of human impacts has eluded ecologists. Habitat-forming seaweeds in temperate regions provide myriad irreplaceable ecosystem services, but they are increasingly at risk of local and regional extinction from extreme climatic events and the cumulative impacts of land-use change and extractive activities. Informing appropriate management strategies to reduce the impacts of stressors requires comprehensive knowledge of species diversity, abundance and distributions. Remote sensing undoubtedly provides answers, but collecting imagery at appropriate resolution and spatial extent, and then accurately and precisely validating these datasets is not straightforward. Comprehensive and long-running monitoring of rocky reefs exist globally but are often limited to a small subset of reef platforms readily accessible to in-situ studies. Key vulnerable habitat-forming seaweeds are often not well-assessed by traditional in-situ methods, nor are they well-captured by passive remote sensing by satellites. Here we describe the utility of drone-based methods for monitoring and detecting key rocky intertidal habitat types, the limitations and caveats of these methods, and suggest a standardised workflow for achieving consistent results that will fulfil the needs of managers for conservation efforts.
13.
Wang T, Liu Y, Wang M, Fan Q, Tian H, Qiao X, Li Y. Applications of UAS in Crop Biomass Monitoring: A Review. Frontiers in Plant Science 2021;12:616689. PMID: 33897719; PMCID: PMC8062761; DOI: 10.3389/fpls.2021.616689.
Abstract
Biomass is an important indicator for evaluating crops. Rapid, accurate and nondestructive monitoring of biomass is key to smart agriculture and precision agriculture. Traditional detection methods are based on destructive measurements. Although satellite remote sensing, manned airborne equipment, and vehicle-mounted equipment can collect measurements nondestructively, they are limited by low accuracy, poor flexibility, and high cost. As nondestructive remote sensing equipment with high precision, high flexibility, and low cost, unmanned aerial systems (UAS) have been widely used to monitor crop biomass. In this review, UAS platforms and sensors, biomass indices, and data analysis methods are presented. Recent improvements in UAS-based crop biomass monitoring are introduced, and multisensor fusion, multi-index fusion, the consideration of features not directly related to biomass, the adoption of advanced algorithms and the use of low-cost sensors are reviewed to highlight the potential of monitoring crop biomass with UAS. Considering the progress made on these problems, we also suggest directions for future research. Furthermore, it is expected that the challenges to wider UAS adoption will be overcome, which will be conducive to realizing smart agriculture and precision agriculture.
Affiliation(s)
- Tianhai Wang, Yadong Liu, Minghui Wang, Hongkun Tian, Yanzhou Li: College of Mechanical Engineering, Guangxi University, Nanning, China
- Qing Fan: College of Civil Engineering and Architecture, Guangxi University, Nanning, China
- Xi Qiao: Guangdong Laboratory of Lingnan Modern Agriculture, Shenzhen, Genome Analysis Laboratory of the Ministry of Agriculture and Rural Area, Agricultural Genomics Institute at Shenzhen, Chinese Academy of Agricultural Sciences, Shenzhen, China; Guangzhou Key Laboratory of Agricultural Products Quality & Safety Traceability Information Technology, Zhongkai University of Agriculture and Engineering, Guangzhou, China
14.
How Far Can We Classify Macroalgae Remotely? An Example Using a New Spectral Library of Species from the South West Atlantic (Argentine Patagonia). Remote Sensing 2020. DOI: 10.3390/rs12233870.
Abstract
Macroalgae have attracted the interest of remote sensing as targets for studying coastal marine ecosystems because of their key ecological role. The goal of this paper is to analyze a new spectral library, including 28 macroalgae from the South-West Atlantic coast, in order to assess its use in hyperspectral remote sensing. The library includes species collected on the Atlantic Patagonian coast (Argentina), with representatives of brown, red, and green algae; 22 of the species are included in a spectral library for the first time. The spectra of these main groups are described, and intraspecific variability is assessed, considering differentiated kelp tissues and depth range, and discussing their effects on spectral features. A classification and an independent component analysis using the spectral range and simulated bands of two state-of-the-art drone-borne hyperspectral sensors were performed. The results show spectral features and clusters that identify further algae taxonomic groups, demonstrating the potential of this spectral library for drone-based mapping of this ecological and economic asset of coastal marine ecosystems.
15.
Application of UAV Imagery to Detect and Quantify Submerged Filamentous Algae and Rooted Macrophytes in a Non-Wadeable River. Remote Sensing 2020. DOI: 10.3390/rs12203332.
Abstract
Imagery from unoccupied aerial vehicles (UAVs) is useful for mapping floating and emerged primary producers, as well as single taxa of submerged primary producers in shallow, clear lakes and streams. However, there is little research on the effectiveness of UAV imagery-based detection and quantification of submerged filamentous algae and rooted macrophytes in deeper rivers using a standard red-green-blue (RGB) camera. This study provides a novel application of UAV imagery analysis for monitoring a non-wadeable river, the Klamath River in northern California, USA. River depth and solar angle during flight were analyzed to understand their effects on benthic primary producer detection. A supervised, pixel-based Random Trees classifier was used to estimate the percent cover of submerged filamentous algae and rooted macrophytes from aerial photos of 32 sites along the river in June and July 2019. In-situ surveys conducted via wading and snorkeling were used to validate these data. Overall accuracy was 82% across all sites, and the highest overall accuracy of classified UAV images was associated with solar angles between 47.5° and 58.72° (10:04 a.m. to 11:21 a.m.). Benthic algae were detected down to 1.9 m and submerged macrophytes down to 1.2 m (river depth) in the UAV imagery of this relatively clear river (Secchi depth > 2 m). Percent cover reached a maximum of 31% for rooted macrophytes and 39% for filamentous algae across all sites. Macrophytes dominated the upstream reaches, while filamentous algae dominated the downstream reaches closer to the Pacific Ocean. In upcoming years, four proposed dam removals are expected to alter the species composition and abundance of benthic filamentous algae and rooted macrophytes, and aerial imagery provides an effective method to monitor these changes.
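The two summary statistics this abstract reports, percent cover from a classified raster and overall accuracy against in-situ labels, are straightforward pixel counting. A sketch on an invented label raster follows (the class codes, proportions, and corruption rate are assumptions standing in for the Random Trees output and ground truth):

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented classified raster: 0 = water/substrate, 1 = filamentous algae,
# 2 = rooted macrophytes (stand-in for the Random Trees output).
classified = rng.choice([0, 1, 2], size=(200, 200), p=[0.5, 0.35, 0.15])

def percent_cover(labels, class_id):
    """Share of pixels assigned to class_id, as a percentage."""
    return 100.0 * np.mean(labels == class_id)

algae_cover = percent_cover(classified, 1)
macrophyte_cover = percent_cover(classified, 2)

# Validation against in-situ labels: overall accuracy is the fraction of
# pixels where the map agrees with ground truth. Here we corrupt 20% of
# the truth raster to class 0 to simulate disagreement.
truth = classified.copy()
truth[rng.random(truth.shape) < 0.2] = 0
overall_accuracy = 100.0 * np.mean(classified == truth)
```

In practice the validation pixels come from the wading and snorkeling quadrats, not from the whole raster as in this toy example.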
|
16
|
Automated Filtering of Multibeam Water-Column Data to Detect Relative Abundance of Giant Kelp (Macrocystis pyrifera). REMOTE SENSING 2020. [DOI: 10.3390/rs12091371] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 02/05/2023]
Abstract
Modern multibeam echosounders can record backscatter data returned from the water above the seafloor. These water-column data can potentially be used to detect and map aquatic vegetation such as kelp, and thus contribute to improving marine habitat mapping. However, the strong sidelobe interference noise that typically contaminates water-column data is a major obstacle to the detection of targets lying close to the seabed, such as aquatic vegetation. This article presents an algorithm to filter the noise and artefacts due to interference from the sidelobes of the receive array by normalizing the slant-range signal in each ping. To evaluate the potential of the filtered data for the detection of aquatic vegetation, we acquired a comprehensive water-column dataset over a controlled experimental site. The experimental site was a transplanted patch of giant kelp (Macrocystis pyrifera) forest of known biomass and spatial configuration, obtained by harvesting several individuals from a nearby forest, measuring and weighing them, and arranging them manually on a previously bare area of seafloor. The water-column dataset was acquired with a Kongsberg EM 2040 C multibeam echosounder at several frequencies (200, 300, and 400 kHz) and pulse lengths (25, 50, and 100 μs). The data acquisition process was repeated after removing half of the plants, to simulate a thinner forest. The giant kelp plants produced evident echoes in the water-column data at all settings. The slant-range signal normalization filter greatly improved the visual quality of the data, but the filtered data may under-represent the true amount of acoustic energy in the water column. Nonetheless, the overall acoustic backscatter measured after filtering was significantly lower, by 2 to 4 dB on average, for data acquired over the thinned forest compared to the original experiment. We discuss the implications of these results for the potential use of multibeam echosounder water-column data in marine habitat mapping.
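The normalization idea is simple to sketch: sidelobe interference appears as an arc of elevated amplitude at a constant slant range across all beams of a ping, so removing each range bin's across-beam average level suppresses the artefact while leaving genuine single-beam targets visible. A minimal sketch on a synthetic ping matrix follows; the exact normalization used by the authors may differ in detail.

```python
# Per-ping slant-range normalization: subtract, at each slant-range sample,
# the mean level (in dB) computed across all beams. The ping matrix here
# (beams x range samples) is synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_beams, n_samples = 64, 200

# Background noise, plus a sidelobe "ring" at one slant range in every beam,
# plus one genuine target confined to a single beam.
ping_db = rng.normal(-80.0, 2.0, size=(n_beams, n_samples))
ping_db[:, 120] += 25.0   # sidelobe artefact: constant range, all beams
ping_db[30, 60] += 30.0   # real target: single beam

# Normalize each range bin by its across-beam mean.
range_mean = ping_db.mean(axis=0, keepdims=True)
filtered_db = ping_db - range_mean

# The ring is flattened to near zero, while the single-beam target
# (shared by only 1 of 64 beams) survives almost unchanged.
print("residual at artefact range:", filtered_db[:, 120].std())
print("residual at target:", filtered_db[30, 60])
```

This also illustrates the caveat noted in the abstract: because the shared energy at each range is subtracted wholesale, the filtered data can under-represent the true acoustic energy in the water column.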
|
17
|
A High-Resolution Global Map of Giant Kelp (Macrocystis pyrifera) Forests and Intertidal Green Algae (Ulvophyceae) with Sentinel-2 Imagery. REMOTE SENSING 2020. [DOI: 10.3390/rs12040694] [Citation(s) in RCA: 41] [Impact Index Per Article: 10.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/13/2023]
Abstract
Giant kelp (Macrocystis pyrifera) is the most widely distributed kelp species on the planet, constituting one of the richest and most productive ecosystems on Earth, but detailed information on its distribution is entirely missing in some marine ecoregions, especially in the high latitudes of the Southern Hemisphere. Here, we present an algorithm based on a series of filter thresholds to detect giant kelp employing Sentinel-2 imagery. Given the overlap between the reflectances of giant kelp and intertidal green algae (Ulvophyceae), the latter are also detected on shallow rocky intertidal areas. The kelp filter algorithm was applied separately to three vegetation indices: the Floating Algae Index (FAI), the Normalised Difference Vegetation Index (NDVI), and a novel formula (the Kelp Difference, KD). Training data from previously surveyed kelp forests and other coastal and ocean features were used to identify reflectance threshold values. This procedure was validated with independent field data collected with UAV imagery at a high spatial resolution and with point-georeferenced sites at a low spatial resolution. When comparing UAV with Sentinel data (high-resolution validation), an average overall accuracy ≥ 0.88 and a Cohen's kappa coefficient ≥ 0.64 were found for all three indices for canopies reaching the surface with extensions greater than 1 hectare, with the KD showing the highest average kappa score (0.66). Comparisons between previously surveyed georeferenced points and remotely sensed kelp grid cells (low-resolution validation) showed that 66% of the georeferenced points had grid cells indicating kelp presence within a linear distance of 300 m. We employed the KD in our kelp filter algorithm to estimate the global extent of giant kelp and intertidal green algae per marine ecoregion and province, producing a high-resolution global map of giant kelp and intertidal green algae, powered by Google Earth Engine.
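The threshold-filter approach can be illustrated with two of the indices named above, NDVI and the standard FAI (Hu, 2009); the KD formula is specific to the cited paper and is not reproduced here. The reflectance values and the zero thresholds below are placeholders for illustration, not the study's calibrated thresholds. Sentinel-2 band centres assumed: B4 = 665 nm (red), B8 = 842 nm (NIR), B11 = 1610 nm (SWIR).

```python
# Illustrative kelp threshold filter on NDVI and FAI from Sentinel-2-style
# reflectances. Thresholds and pixel values are toy placeholders.
import numpy as np

def ndvi(red, nir):
    # Normalised Difference Vegetation Index.
    return (nir - red) / (nir + red)

def fai(red, nir, swir, lam_red=665.0, lam_nir=842.0, lam_swir=1610.0):
    # Floating Algae Index (Hu, 2009): NIR height above the red-SWIR baseline.
    baseline = red + (swir - red) * (lam_nir - lam_red) / (lam_swir - lam_red)
    return nir - baseline

# Toy reflectance pixels: columns are [red, nir, swir].
pixels = np.array([
    [0.03, 0.18, 0.02],   # kelp canopy at the surface: NIR peak
    [0.04, 0.02, 0.01],   # open water: NIR strongly absorbed
])
red, nir, swir = pixels[:, 0], pixels[:, 1], pixels[:, 2]

# Flag a pixel as kelp when both indices exceed their (placeholder) thresholds.
kelp_mask = (ndvi(red, nir) > 0.0) & (fai(red, nir, swir) > 0.0)
print(kelp_mask)   # first pixel flagged, second not
```

In the study, such thresholds were derived from training data over known kelp forests and applied per index, with KD ultimately chosen for the global map.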
|