1. Wu L, Shao H, Li J, Chen C, Hu N, Yang B, Weng H, Xiang L, Ye D. Noninvasive Abiotic Stress Phenotyping of Vascular Plant in Each Vegetative Organ View. Plant Phenomics (Washington, D.C.) 2024; 6:0180. PMID: 38779576; PMCID: PMC11109595; DOI: 10.34133/plantphenomics.0180. Received 10/29/2023; accepted 03/29/2024.
Abstract
Recent decades have witnessed rapid development of noninvasive plant phenotyping, which can detect plant stress at levels ranging from the subcellular scale to the whole population. Even across this broad range, however, most phenotyping efforts concern only the leaves. This review offers a distinctive perspective on noninvasive plant stress phenotyping from a multi-organ view. First, how plants sense and respond to abiotic stress across the vegetative organs (leaves, stems, and roots), and the interplay between these vital components, is analyzed. The corresponding noninvasive optical phenotyping techniques are then surveyed, which can guide the practical selection of appropriate noninvasive phenotyping techniques for each organ. Furthermore, we explore methods for analyzing compound stress situations, as field conditions frequently encompass multiple abiotic stressors. Our work thus goes beyond the conventional approach of focusing solely on individual plant organs. These insights from multi-organ, noninvasive phenotyping provide a reference for testing hypotheses about the intricate dynamics of plant stress responses, as well as the potential interactive effects among various stressors.
Affiliation(s)
- Libin Wu
- College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Fujian Key Laboratory of Agricultural Information Sensing Technology, College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou, Fujian 350002, China
- Han Shao
- College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Center for Artificial Intelligence in Agriculture, School of Future Technology, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Jiayi Li
- College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Fujian Key Laboratory of Agricultural Information Sensing Technology, College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou, Fujian 350002, China
- Chen Chen
- College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Fujian Key Laboratory of Agricultural Information Sensing Technology, College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou, Fujian 350002, China
- Nana Hu
- College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Center for Artificial Intelligence in Agriculture, School of Future Technology, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Biyun Yang
- College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Fujian Key Laboratory of Agricultural Information Sensing Technology, College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou, Fujian 350002, China
- Haiyong Weng
- College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Fujian Key Laboratory of Agricultural Information Sensing Technology, College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou, Fujian 350002, China
- Lirong Xiang
- Department of Biological and Agricultural Engineering, North Carolina State University, Raleigh, NC 27606, USA
- Dapeng Ye
- College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Fujian Key Laboratory of Agricultural Information Sensing Technology, College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou, Fujian 350002, China
2. Zhang F, Wang B, Lu F, Zhang X. Rotating Stomata Measurement Based on Anchor-Free Object Detection and Stomata Conductance Calculation. Plant Phenomics (Washington, D.C.) 2023; 5:0106. PMID: 37817885; PMCID: PMC10561978; DOI: 10.34133/plantphenomics.0106. Received 05/11/2023; accepted 09/25/2023.
Abstract
Stomata play an essential role in regulating water and carbon dioxide levels in plant leaves, which is important for photosynthesis. Previous deep learning-based stomata detection methods rely on horizontal detection: the model's anchor boxes are axis-aligned while stomata lie at random angles, so stomatal traits cannot be calculated directly from the detected boxes. Additional image processing (e.g., rotating the image) is required before detecting stomata and calculating their traits. This paper proposes a novel approach, DeepRSD (deep learning-based rotating stomata detection), for detecting rotated stomata and calculating basic stomatal traits at the same time. A stomatal conductance loss function is also introduced during DeepRSD training, which improves the efficiency of stomata detection and conductance calculation. Experimental results demonstrate that DeepRSD reaches 94.3% recognition accuracy for stomata on maize leaves. The proposed method can help researchers conduct large-scale studies of stomatal morphology, structure, and conductance models.
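The practical advantage of a rotated detection is that trait values can be read straight from the box parameters, with no image-rotation step. Below is a minimal sketch of such post-processing in Python; the function name and the inscribed-ellipse pore approximation are illustrative assumptions, not DeepRSD's actual code:

```python
import math

def stoma_traits(w, h, angle_deg):
    """Derive basic stomatal traits from one rotated detection box.

    w and h are the box side lengths measured along the stoma's own axes,
    angle_deg its rotation. The pore area is approximated by the inscribed
    ellipse, a common simplification (an assumption here).
    """
    length = max(w, h)                           # long (major) axis
    width = min(w, h)                            # short (minor) axis
    area = math.pi * (length / 2) * (width / 2)  # inscribed-ellipse area
    return {"length": length, "width": width, "area": area,
            "angle": angle_deg % 180.0}          # orientation, mod 180 degrees

t = stoma_traits(40.0, 18.0, 205.0)
```

With an axis-aligned box, by contrast, w and h mix the stoma's length and width whenever the angle is not a multiple of 90°, which is why the earlier methods needed to rotate the image first.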
Affiliation(s)
- Fan Zhang
- Huaihe Hospital of Henan University, Kaifeng 475004, China
- Henan Key Laboratory of Big Data Analysis and Processing, Henan University, Kaifeng 475004, China
- Bo Wang
- Henan Key Laboratory of Big Data Analysis and Processing, Henan University, Kaifeng 475004, China
- Fuhao Lu
- State Key Laboratory of Crop Stress Adaptation and Improvement, Henan University, Kaifeng 475004, China
- Xinhong Zhang
- School of Software, Henan University, Kaifeng 475004, China
3. Sai N, Bockman JP, Chen H, Watson-Haigh N, Xu B, Feng X, Piechatzek A, Shen C, Gilliham M. StomaAI: an efficient and user-friendly tool for measurement of stomatal pores and density using deep computer vision. The New Phytologist 2023; 238:904-915. PMID: 36683442; DOI: 10.1111/nph.18765. Received 10/31/2022; accepted 12/23/2022.
Abstract
Using microscopy to investigate stomatal behaviour is common in plant physiology research. Manual inspection and measurement of stomatal pore features is low throughput, relies upon expert knowledge to record stomatal features accurately, demands significant researcher time and investment, and can represent a significant bottleneck in research pipelines. To alleviate this, we introduce StomaAI (SAI): a reliable, user-friendly and adaptable tool for stomatal pore and density measurements via the application of deep computer vision, initially calibrated and deployed for the model plant Arabidopsis (a dicot) and the crop plant barley (a monocot grass). SAI produces measurements consistent with those of human experts and successfully reproduced the conclusions of published datasets. SAI evaluates far more images in a fraction of the time, so it can obtain a more accurate representation of stomatal traits than routine manual measurement. An online demonstration of SAI is hosted at https://sai.aiml.team, and the full local application is freely available on GitHub at https://github.com/xdynames/sai-app.
Affiliation(s)
- Na Sai
- Plant Transport and Signalling Lab, ARC Centre of Excellence in Plant Energy Biology, Waite Research Institute, Glen Osmond, SA, 5064, Australia
- School of Agriculture, Food and Wine, University of Adelaide, Adelaide, SA, 5064, Australia
- James Paul Bockman
- The Australian Institute for Machine Learning, Adelaide, SA, 5005, Australia
- School of Computer Science, University of Adelaide, Adelaide, SA, 5005, Australia
- Hao Chen
- The Australian Institute for Machine Learning, Adelaide, SA, 5005, Australia
- School of Computer Science, University of Adelaide, Adelaide, SA, 5005, Australia
- Nathan Watson-Haigh
- South Australian Genomics Centre, SAHMRI, Adelaide, SA, 5000, Australia
- Australian Genome Research Facility, Victorian Comprehensive Cancer Centre, Melbourne, Vic., 3000, Australia
- Bo Xu
- Plant Transport and Signalling Lab, ARC Centre of Excellence in Plant Energy Biology, Waite Research Institute, Glen Osmond, SA, 5064, Australia
- School of Agriculture, Food and Wine, University of Adelaide, Adelaide, SA, 5064, Australia
- Xueying Feng
- Plant Transport and Signalling Lab, ARC Centre of Excellence in Plant Energy Biology, Waite Research Institute, Glen Osmond, SA, 5064, Australia
- School of Agriculture, Food and Wine, University of Adelaide, Adelaide, SA, 5064, Australia
- Adriane Piechatzek
- Plant Transport and Signalling Lab, ARC Centre of Excellence in Plant Energy Biology, Waite Research Institute, Glen Osmond, SA, 5064, Australia
- School of Agriculture, Food and Wine, University of Adelaide, Adelaide, SA, 5064, Australia
- Chunhua Shen
- The Australian Institute for Machine Learning, Adelaide, SA, 5005, Australia
- School of Computer Science, University of Adelaide, Adelaide, SA, 5005, Australia
- Matthew Gilliham
- Plant Transport and Signalling Lab, ARC Centre of Excellence in Plant Energy Biology, Waite Research Institute, Glen Osmond, SA, 5064, Australia
- School of Agriculture, Food and Wine, University of Adelaide, Adelaide, SA, 5064, Australia
4. Yang XH, Xi ZJ, Li JP, Feng XL, Zhu XH, Guo SY, Song CP. Deep Transfer Learning-Based Multi-Object Detection for Plant Stomata Phenotypic Traits Intelligent Recognition. IEEE/ACM Transactions on Computational Biology and Bioinformatics 2023; 20:321-329. PMID: 34941519; DOI: 10.1109/tcbb.2021.3137810.
Abstract
Plant stomatal phenotypic traits can provide a basis for enhancing crop tolerance under adversity. Manually counting stomata and measuring their height and width clearly cannot meet high-throughput demands. Detecting and recognizing plant stomata quickly and accurately is therefore the prerequisite and key to studying their physiological characteristics. In this research, we treat stomata recognition as a multi-object detection problem and propose an end-to-end framework for intelligent detection and recognition of plant stomata based on feature-weight transfer learning and the YOLOv4 network. It is easy to operate and greatly facilitates the analysis of stomatal phenotypic traits in high-throughput plant epidermal cell images. For images spanning different cultivars, multiple scales, rich background features, high density, and small stomatal objects, the proposed method can precisely locate multiple stomata in microscope images and automatically report their phenotypic traits. Users can also adjust the corresponding parameters to maximize the accuracy and scalability of automatic stomata detection and recognition. Experimental results on actual data provided by the National Maize Improvement Center show that the proposed method is superior to existing methods, offering high automatic detection and recognition accuracy, low training cost, and strong generalization ability.
5. Liang X, Xu X, Wang Z, He L, Zhang K, Liang B, Ye J, Shi J, Wu X, Dai M, Yang W. StomataScorer: a portable and high-throughput leaf stomata trait scorer combined with deep learning and an improved CV model. Plant Biotechnology Journal 2022; 20:577-591. PMID: 34717024; PMCID: PMC8882810; DOI: 10.1111/pbi.13741. Received 08/08/2021; revised 09/26/2021; accepted 10/16/2021.
Abstract
To measure stomatal traits automatically and nondestructively, a new method for detecting stomata and extracting stomatal traits was proposed. Two portable microscopes with different resolutions (TipScope, with a 40× lens attached to a smartphone, and ProScope HR2, with a 400× lens) were used to acquire images of living stomata in maize leaves. An FPN model was used to detect stomata in the TipScope images and to measure stomata number and stomatal density. A Faster R-CNN model was used to detect open and closed stomata in the ProScope HR2 images and to count each class. An improved CV model was used to segment the pores of open stomata, and a total of six pore traits were measured. Compared with manual measurements, the squared correlation coefficient (R²) of the six pore traits was higher than 0.85, and their mean absolute percentage error (MAPE) ranged from 0.02% to 6.34%. Dynamic stomatal changes between wild-type B73 and the mutant Zmfab1a were explored under drought and re-watering conditions; the results showed that Zmfab1a leaf stomata were more resilient than those of B73. The proposed method was also tested on the leaf stomatal traits of nine other species. In conclusion, a portable, low-cost stomata phenotyping method that accurately and dynamically measures the characteristic parameters of living stomata was developed, together with an open-access, user-friendly web portal that has the potential to be used for stomata phenotyping of large populations in the future.
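The agreement statistics quoted above (R² > 0.85, MAPE of 0.02%-6.34%) are straightforward to reproduce. A small, self-contained sketch of both metrics as conventionally defined; the paper's exact computation is not shown here, so the definitions below are an assumption, and the measurement values are toy data:

```python
def r_squared(manual, auto):
    """Coefficient of determination R^2 of automatic vs. manual values."""
    mean = sum(manual) / len(manual)
    ss_res = sum((m - a) ** 2 for m, a in zip(manual, auto))  # residual sum of squares
    ss_tot = sum((m - mean) ** 2 for m in manual)             # total sum of squares
    return 1.0 - ss_res / ss_tot

def mape(manual, auto):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((m - a) / m) for m, a in zip(manual, auto)) / len(manual)

# Toy example: three pore-width measurements (units arbitrary).
manual = [10.0, 20.0, 30.0]
auto = [11.0, 19.0, 30.0]
print(r_squared(manual, auto), mape(manual, auto))  # approx. 0.99 and approx. 5.0
```

R² rewards capturing the manual values' spread, while MAPE penalizes each measurement's relative error, so reporting both (as the paper does) guards against an estimator that tracks the trend but is biased, or vice versa.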
Affiliation(s)
- Xiuying Liang
- National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research (Wuhan), College of Engineering, Huazhong Agricultural University, Wuhan, China
- Xichen Xu
- National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research (Wuhan), College of Engineering, Huazhong Agricultural University, Wuhan, China
- Zhiwei Wang
- National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research (Wuhan), College of Engineering, Huazhong Agricultural University, Wuhan, China
- Lei He
- National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research (Wuhan), College of Engineering, Huazhong Agricultural University, Wuhan, China
- Kaiqi Zhang
- National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research (Wuhan), College of Engineering, Huazhong Agricultural University, Wuhan, China
- Bo Liang
- National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research (Wuhan), College of Engineering, Huazhong Agricultural University, Wuhan, China
- Junli Ye
- National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research (Wuhan), College of Engineering, Huazhong Agricultural University, Wuhan, China
- Jiawei Shi
- National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research (Wuhan), College of Engineering, Huazhong Agricultural University, Wuhan, China
- Xi Wu
- National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research (Wuhan), College of Engineering, Huazhong Agricultural University, Wuhan, China
- Mingqiu Dai
- National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research (Wuhan), College of Engineering, Huazhong Agricultural University, Wuhan, China
- Wanneng Yang
- National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research (Wuhan), College of Engineering, Huazhong Agricultural University, Wuhan, China
- Shenzhen Branch, Guangdong Laboratory for Lingnan Modern Agriculture, Genome Analysis Laboratory of the Ministry of Agriculture, Agricultural Genomics Institute at Shenzhen, Chinese Academy of Agricultural Sciences, Shenzhen, China
6. Gibbs JA, Mcausland L, Robles-Zazueta CA, Murchie EH, Burgess AJ. A Deep Learning Method for Fully Automatic Stomatal Morphometry and Maximal Conductance Estimation. Frontiers in Plant Science 2021; 12:780180. PMID: 34925424; PMCID: PMC8675901; DOI: 10.3389/fpls.2021.780180. Received 09/20/2021; accepted 11/01/2021.
Abstract
Stomata are integral to plant performance, enabling the exchange of gases between the atmosphere and the plant. The anatomy of stomata influences conductance properties, with the maximal conductance rate, gsmax, calculated from density and size. However, stomatal dimensions are currently measured manually, which is time-consuming and error prone. Here, we show how automated morphometry from leaf impressions can predict a functional property: the anatomical gsmax. A deep learning network was derived to preserve stomatal morphometry via semantic segmentation. This forms part of an automated pipeline that measures stomatal traits for the estimation of anatomical gsmax. The proposed pipeline achieves 100% accuracy for species distinction (wheat vs. poplar) and for detection of stomata in both datasets. The automated deep learning-based method gave estimates of gsmax within 3.8% and 1.9% of values calculated manually by an expert for the wheat and poplar datasets, respectively. Semantic segmentation provides a rapid and repeatable method for estimating anatomical gsmax from microscopic images of leaf impressions. This is a step toward reducing the bottleneck associated with plant phenotyping approaches and provides a rapid method to assess gas fluxes in plants based on stomatal morphometry.
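The anatomical gsmax referred to above is conventionally computed from stomatal density, maximum pore area, and pore depth. Below is a hedged sketch using the widely cited Franks-and-Beerling-style formulation; the paper's exact equation and constants are not reproduced here, so the diffusivity and molar-volume values, and the example inputs, are illustrative assumptions:

```python
import math

# Illustrative physical constants near 25 degrees C (assumptions, not the paper's values).
D_WV = 2.49e-5   # diffusivity of water vapour in air, m^2 s^-1
V_AIR = 2.24e-2  # molar volume of air, m^3 mol^-1

def anatomical_gsmax(density, a_max, pore_depth):
    """Anatomical maximum stomatal conductance, mol m^-2 s^-1.

    density     stomatal density (stomata per m^2)
    a_max       maximum pore area (m^2)
    pore_depth  pore depth (m)

    gsmax = (d/v) * density * a_max / (pore_depth + (pi/2) * sqrt(a_max/pi)),
    where the second denominator term is the end correction for a circular pore.
    """
    end_correction = (math.pi / 2.0) * math.sqrt(a_max / math.pi)
    return (D_WV / V_AIR) * density * a_max / (pore_depth + end_correction)

# Example: 200 stomata per mm^2 (= 2.0e8 per m^2), 100 um^2 pore area, 10 um pore depth.
g = anatomical_gsmax(2.0e8, 100e-12, 10e-6)
```

Because gsmax is linear in density but sublinear in pore area (the end correction grows with area), density and size measurements contribute differently to the estimate, which is why the pipeline measures both.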
Affiliation(s)
- Jonathon A. Gibbs
- School of Computer Science, University of Nottingham, Nottingham, United Kingdom
- Lorna Mcausland
- School of Biosciences, University of Nottingham, Loughborough, United Kingdom
- Erik H. Murchie
- School of Biosciences, University of Nottingham, Loughborough, United Kingdom
7. Jayakody H, Petrie P, Boer HJD, Whitty M. A generalised approach for high-throughput instance segmentation of stomata in microscope images. Plant Methods 2021; 17:27. PMID: 33750422; PMCID: PMC7945362; DOI: 10.1186/s13007-021-00727-4. Received 06/02/2020; accepted 02/26/2021.
Abstract
BACKGROUND Stomata analysis using microscope imagery provides important insight into plant physiology, health, and the surrounding environmental conditions. Plant scientists can now conduct automated high-throughput analysis of stomata in microscope data; however, existing detection methods are sensitive to the appearance of stomata in the training images, limiting their general applicability. In addition, existing methods only generate bounding boxes around detected stomata, requiring users to implement additional image processing steps to study stomatal morphology. In this paper, we develop a fully automated, robust stomata detection algorithm that also identifies individual stomata boundaries regardless of the plant species, sample collection method, imaging technique, and magnification level.
RESULTS The proposed solution consists of three stages. First, the input image is pre-processed to remove any colour space biases arising from different sample collection and imaging techniques. Then, a Mask R-CNN is applied to estimate individual stomata boundaries; the feature pyramid network embedded in the Mask R-CNN identifies stomata at different scales. Finally, a statistical filter is applied at the Mask R-CNN output to reduce the number of false positives generated by the network. The algorithm was tested using 16 datasets from 12 sources, containing over 60,000 stomata. For the first time in this domain, the proposed solution was tested against 7 microscope datasets never seen by the algorithm to show its generalisability. Results indicated that the proposed approach can detect stomata with a precision, recall, and F-score of 95.10%, 83.34%, and 88.61%, respectively. A separate test comparing estimated stomata boundary values with manually measured data showed that the proposed method has an IoU score of 0.70, a 7% improvement over the bounding-box approach.
CONCLUSIONS The proposed method shows robust performance across multiple microscope image datasets of different quality and scale. This generalised stomata detection algorithm allows plant scientists to conduct stomata analysis while eliminating the need to re-label and re-train for each new dataset. The open-source code shared with this project can be deployed directly in Google Colab or any other TensorFlow environment.
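The IoU figure quoted above compares predicted and manually traced stomata boundaries pixel by pixel. A minimal sketch of mask IoU on binary masks follows; this is the standard definition, not the paper's actual evaluation code:

```python
def mask_iou(mask_a, mask_b):
    """Intersection-over-union of two same-shaped binary masks (rows of 0/1)."""
    inter = 0
    union = 0
    for row_a, row_b in zip(mask_a, mask_b):
        for pa, pb in zip(row_a, row_b):
            inter += 1 if (pa and pb) else 0  # pixel set in both masks
            union += 1 if (pa or pb) else 0   # pixel set in either mask
    return inter / union if union else 0.0

# Toy 2x2 example: one overlapping pixel, three pixels in the union.
a = [[1, 1],
     [0, 0]]
b = [[1, 0],
     [1, 0]]
print(mask_iou(a, b))  # 1/3, approx. 0.333
```

Mask IoU is stricter than box IoU because every stray boundary pixel counts against the score, which is why the 0.70 mask figure improving on the bounding-box approach is meaningful.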
Affiliation(s)
- Hiranya Jayakody
- School of Mechanical and Manufacturing Engineering, UNSW, Sydney, Australia
- Paul Petrie
- School of Mechanical and Manufacturing Engineering, UNSW, Sydney, Australia
- South Australian Research and Development Institute, Urrbrae, Australia
- Hugo Jan de Boer
- Department of Environmental Sciences, Copernicus Institute of Sustainable Development, Utrecht University, Utrecht, Netherlands
- Mark Whitty
- School of Mechanical and Manufacturing Engineering, UNSW, Sydney, Australia
8. Biswas S, Barma S. A large-scale optical microscopy image dataset of potato tuber for deep learning based plant cell assessment. Sci Data 2020; 7:371. PMID: 33110087; PMCID: PMC7591917; DOI: 10.1038/s41597-020-00706-9. Received 04/29/2020; accepted 09/16/2020. Open access.
Abstract
We present a new large-scale, three-fold-annotated microscopy image dataset, aiming to advance plant cell biology research by exploring different cell microstructures, including cell size and shape, cell wall thickness, and intercellular space, in a deep learning (DL) framework. The dataset includes 9,811 unstained and 6,127 stained (safranin-O, toluidine blue-O, and Lugol's iodine) images with three-fold annotation covering physical, morphological, and tissue grading, based on weight, section area, and tissue zone, respectively. In addition, we prepared ground truth segmentation labels for three different tuber weights. We validated the pertinence of the annotations by performing multi-label cell classification with a convolutional neural network (CNN), VGG16, on unstained and stained images. Accuracy reached up to 0.94, while the F2-score reached 0.92. Furthermore, the ground truth labels were verified by a semantic segmentation algorithm using the UNet architecture, which achieved a mean intersection over union of up to 0.70. Overall, the results show that the dataset is effective and could enrich the domain of microscopy-based plant cell analysis in a DL framework.
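The F2-score reported above is the member of the Fβ family with β = 2, which weights recall more heavily than precision, a common choice when missed cells are costlier than false alarms. A small sketch of the standard definition; how the paper aggregates the score across labels is not shown here, so the per-label averaging is omitted and the example values are illustrative:

```python
def f_beta(precision, recall, beta=2.0):
    """F-beta score; beta > 1 favours recall, and beta = 1 gives the usual F1."""
    if precision == 0.0 and recall == 0.0:
        return 0.0  # avoid division by zero for a degenerate classifier
    b2 = beta * beta
    return (1.0 + b2) * precision * recall / (b2 * precision + recall)

# With perfect recall but only 50% precision, F2 stays high because recall dominates.
f2 = f_beta(0.5, 1.0)         # approx. 0.833
f1 = f_beta(0.6, 0.8, 1.0)    # ordinary F1, approx. 0.686
```

An F2 of 0.92 therefore implies recall close to or above that figure, which a plain F1 number would not guarantee.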
Affiliation(s)
- Sumona Biswas
- Department of Electronics and Communication Engineering, Indian Institute of Information Technology Guwahati, Guwahati, Assam, India
- Shovan Barma
- Department of Electronics and Communication Engineering, Indian Institute of Information Technology Guwahati, Guwahati, Assam, India