1. Picek L, Šulc M, Patel Y, Matas J. Plant recognition by AI: Deep neural nets, transformers, and kNN in deep embeddings. Front Plant Sci 2022;13:787527. DOI: 10.3389/fpls.2022.787527. PMID: 36237508; PMCID: PMC9551576.
Abstract
The article reviews and benchmarks machine learning methods for automatic image-based plant species recognition and proposes a novel retrieval-based method for recognition by nearest neighbor classification in a deep embedding space. The image retrieval method relies on a model trained via the Recall@k surrogate loss. State-of-the-art approaches to image classification, based on Convolutional Neural Networks (CNN) and Vision Transformers (ViT), are benchmarked and compared with the proposed image retrieval-based method. The impact of performance-enhancing techniques, e.g., class prior adaptation, image augmentations, learning rate scheduling, and loss functions, is studied. The evaluation is carried out on the PlantCLEF 2017, ExpertLifeCLEF 2018, and iNaturalist 2018 datasets, the largest publicly available datasets for plant recognition. The evaluation of CNN and ViT classifiers shows a gradual improvement in classification accuracy. The current state-of-the-art Vision Transformer model, ViT-Large/16, achieves 91.15% and 83.54% accuracy on the PlantCLEF 2017 and ExpertLifeCLEF 2018 test sets, respectively, reducing the error rate of the best CNN model (ResNeSt-269e) by 22.91% and 28.34%. In addition, performance-enhancing tricks improved ViT-Base/32 accuracy by 3.72% on ExpertLifeCLEF 2018 and by 4.67% on PlantCLEF 2017. The retrieval approach achieved superior performance in all measured scenarios, with accuracy margins of 0.28%, 4.13%, and 10.25% on ExpertLifeCLEF 2018, PlantCLEF 2017, and iNat2018-Plantae, respectively.
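The retrieval-based method summarised above classifies an image by finding its nearest neighbors in a learned embedding space and taking a majority vote over their labels. A minimal NumPy sketch of that classification step follows; it uses synthetic cluster embeddings in place of a trained model's outputs, and all names (`knn_classify`, the toy gallery) are illustrative, not from the paper:

```python
import numpy as np

def knn_classify(query_emb, gallery_embs, gallery_labels, k=5):
    """Classify a query embedding by majority vote over its k nearest
    gallery embeddings under cosine similarity."""
    # L2-normalise so the dot product equals cosine similarity
    q = query_emb / np.linalg.norm(query_emb)
    g = gallery_embs / np.linalg.norm(gallery_embs, axis=1, keepdims=True)
    sims = g @ q                    # cosine similarity to every gallery item
    top_k = np.argsort(-sims)[:k]   # indices of the k most similar items
    votes = gallery_labels[top_k]
    classes, counts = np.unique(votes, return_counts=True)
    return classes[np.argmax(counts)]

# Toy gallery: two well-separated "species" clusters in a 4-D embedding space
rng = np.random.default_rng(0)
gallery = np.vstack([
    rng.normal(loc=[1, 0, 0, 0], scale=0.1, size=(10, 4)),  # species 0
    rng.normal(loc=[0, 1, 0, 0], scale=0.1, size=(10, 4)),  # species 1
])
labels = np.array([0] * 10 + [1] * 10)

query = np.array([0.9, 0.1, 0.0, 0.0])  # close to the species-0 cluster
print(knn_classify(query, gallery, labels, k=5))  # → 0
```

In the paper's setting the gallery embeddings would come from the network trained with the Recall@k surrogate loss; the voting step itself is unchanged.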
Affiliation(s)
- Lukáš Picek: Department of Cybernetics, Faculty of Applied Sciences, University of West Bohemia, Pilsen, Czechia
- Milan Šulc: Visual Recognition Group, Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, Prague, Czechia
- Yash Patel: Visual Recognition Group, Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, Prague, Czechia
- Jiří Matas: Visual Recognition Group, Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, Prague, Czechia
2. Picek L, Šulc M, Matas J, Heilmann-Clausen J, Jeppesen TS, Lind E. Automatic Fungi Recognition: Deep Learning Meets Mycology. Sensors (Basel) 2022;22(2):633. DOI: 10.3390/s22020633. PMID: 35062595; PMCID: PMC8779018.
Abstract
The article presents an AI-based fungi species recognition system for a citizen-science community. The system's real-time identification tool, FungiVision, with a mobile application front-end, led to increased public interest in fungi, quadrupling the number of citizens collecting data. FungiVision, deployed with a human-in-the-loop, reaches nearly 93% accuracy. Using the collected data, we developed a novel fine-grained classification dataset, Danish Fungi 2020 (DF20), with several unique characteristics: species-level labels, a small number of errors, and rich observation metadata. The dataset enables testing the ability to improve classification using metadata, e.g., time, location, habitat, and substrate; facilitates classifier calibration testing; and allows studying the impact of device settings on classification performance. The continual flow of labelled data supports improvements of the online recognition system. Finally, we present a novel method for the fungi recognition service, based on a Vision Transformer architecture. Trained on DF20 and exploiting available metadata, it achieves a recognition error 46.75% lower than that of the current system. By providing a stream of labelled data in one direction, and an accuracy increase in the other, the collaboration creates a virtuous cycle helping both communities.
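The abstract notes that the classifier exploits observation metadata such as time, location, habitat, and substrate. One common way to combine such metadata with image predictions is a Bayesian-style late fusion, sketched below; this is an illustrative assumption, not necessarily the fusion mechanism used in the paper, and all names and numbers are made up:

```python
import numpy as np

def fuse_with_metadata(image_probs, metadata_priors, eps=1e-12):
    """Re-weight image-classifier class probabilities by per-class metadata
    priors (e.g. how often each species occurs in the observed habitat or
    month), then renormalise so the result is again a distribution."""
    fused = image_probs * (metadata_priors + eps)  # eps guards zero priors
    return fused / fused.sum()

# Toy example with 3 species: the image classifier slightly favours
# species 0, but the habitat prior says species 1 dominates at this site.
image_probs = np.array([0.45, 0.40, 0.15])
habitat_prior = np.array([0.10, 0.80, 0.10])
fused = fuse_with_metadata(image_probs, habitat_prior)
print(fused.argmax())  # → 1
```

The fusion flips the top prediction from species 0 to species 1, which is how metadata can correct visually ambiguous cases.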
Affiliation(s)
- Lukáš Picek (corresponding author): Department of Cybernetics, Faculty of Applied Sciences, University of West Bohemia, 30100 Pilsen, Czech Republic
- Milan Šulc: Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, 16636 Prague, Czech Republic
- Jiří Matas: Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, 16636 Prague, Czech Republic
- Jacob Heilmann-Clausen: Center for Macroecology, Evolution and Climate, Biological Institute, University of Copenhagen, 1165 Copenhagen, Denmark
- Emil Lind: Center for Macroecology, Evolution and Climate, Biological Institute, University of Copenhagen, 1165 Copenhagen, Denmark
3. Šulc M, Štětková G, Jelínek V, Czyż B, Dyrcz A, Karpińska O, Kamionka-Kanclerska K, Rowiński P, Maziarz M, Gruszczyński A, Hughes A, Honza M. Killing behaviour of adult brood parasites. Behaviour 2020. DOI: 10.1163/1568539x-bja10033.
Abstract
Decades of studies have revealed the striking adaptations of avian brood parasites for their unique reproductive lifestyle. Several studies have reported that adult brood parasites sometimes kill host nestlings, although the reasons for this behaviour remain unclear. Using continuous video-recording and camera traps, we observed the same behaviour in the common cuckoo Cuculus canorus, showing that both host and parasite nestlings can be killed. The latter has never previously been observed in cuckoos. Here, we review this phenomenon and discuss possible explanations.
Affiliation(s)
- M. Šulc: Institute of Vertebrate Biology of the Czech Academy of Sciences, Brno, Czech Republic
- G. Štětková: Institute of Vertebrate Biology of the Czech Academy of Sciences, Brno, Czech Republic; Department of Botany and Zoology, Faculty of Sciences, Masaryk University, Brno, Czech Republic
- V. Jelínek: Institute of Vertebrate Biology of the Czech Academy of Sciences, Brno, Czech Republic
- B. Czyż: Department of Behavioural Ecology, University of Wrocław, Wrocław, Poland
- A. Dyrcz: Department of Behavioural Ecology, University of Wrocław, Wrocław, Poland
- O. Karpińska: Department of Forest Zoology and Wildlife Management, Warsaw University of Life Sciences (SGGW), Warsaw, Poland
- K. Kamionka-Kanclerska: Department of Forest Zoology and Wildlife Management, Warsaw University of Life Sciences (SGGW), Warsaw, Poland
- P. Rowiński: Department of Forest Zoology and Wildlife Management, Warsaw University of Life Sciences (SGGW), Warsaw, Poland
- M. Maziarz: Museum and Institute of Zoology, Polish Academy of Sciences, Warsaw, Poland
- A. Gruszczyński: Museum and Institute of Zoology, Polish Academy of Sciences, Warsaw, Poland
- A.E. Hughes: Department of Psychology, University of Essex, Colchester, UK
- M. Honza: Institute of Vertebrate Biology of the Czech Academy of Sciences, Brno, Czech Republic
4.
Abstract
BACKGROUND Fine-grained recognition of plants from images is a challenging computer vision task, due to the diverse appearance and complex structure of plants, high intra-class variability, and small inter-class differences. We review the state-of-the-art and discuss plant recognition tasks, from identification of plants from specific plant organs to general plant recognition "in the wild". RESULTS We propose texture analysis and deep learning methods for different plant recognition tasks. The methods are evaluated and compared to the state-of-the-art. Texture analysis is only applied to images with unambiguous segmentation (bark and leaf recognition), whereas CNNs are only applied when sufficiently large datasets are available. The results provide an insight into the complexity of different plant recognition tasks. The proposed methods outperform the state-of-the-art in leaf and bark classification and achieve very competitive results in plant recognition "in the wild". CONCLUSIONS The results suggest that recognition of segmented leaves is practically a solved problem when high volumes of training data are available. The generality and higher capacity of state-of-the-art CNNs make them suitable for plant recognition "in the wild", where the views on plant organs or plants vary significantly and the difficulty is increased by occlusions and background clutter.
Affiliation(s)
- Milan Šulc: Department of Cybernetics, FEE CTU in Prague, Karlovo namesti 13, 121 35 Prague 2, Czech Republic
- Jiří Matas: Department of Cybernetics, FEE CTU in Prague, Karlovo namesti 13, 121 35 Prague 2, Czech Republic
5. Kunc Š, Šulc M. High Frequency Modulation Method for Measuring of Birefringence. EPJ Web of Conferences 2013. DOI: 10.1051/epjconf/20134800012. Open access.
6. Šulc M, Kramer D, Polak J, Steiger L, Finger M, Slunecka M. New optics for resolution improving of Ring Imaging Cherenkov detectors. EPJ Web of Conferences 2013. DOI: 10.1051/epjconf/20134800024. Open access.
7. Šulc M, Hodek P, Stiborová M. The binding affinity of carcinogenic N-nitrosodimethylamine and N-nitrosomethylaniline to cytochromes P450 2B4, 2E1 and 3A6 does not dictate the rate of their enzymatic N-demethylation. Gen Physiol Biophys 2010;29:175-85. DOI: 10.4149/gpb_2010_02_175.
8. Šulc M, Nikl M, Vognar M, Blažek K, Nejezchleb K, Boháček P, Nitsch K, Kobayashi M, Usuki Y, Shen D. On-line induced absorption measurement on PbWO4, YAlO3:Ce and CsI scintillating crystals. Radiat Meas 2004. DOI: 10.1016/j.radmeas.2004.03.004.