1
El Ghazouali S, Mhirit Y, Oukhrid A, Michelucci U, Nouira H. FusionVision: A Comprehensive Approach of 3D Object Reconstruction and Segmentation from RGB-D Cameras Using YOLO and Fast Segment Anything. Sensors (Basel) 2024; 24:2889. [PMID: 38732995] [PMCID: PMC11086350] [DOI: 10.3390/s24092889] [Received: 02/29/2024] [Revised: 04/22/2024] [Accepted: 04/29/2024]
Abstract
In the realm of computer vision, integrating advanced techniques into the pre-processing of RGB-D camera inputs poses a significant challenge, given the inherent complexities arising from diverse environmental conditions and varying object appearances. This paper therefore introduces FusionVision, an exhaustive pipeline adapted for the robust 3D segmentation of objects in RGB-D imagery. Traditional computer vision systems, designed mainly for RGB cameras, struggle to simultaneously capture precise object boundaries and achieve high-precision object detection on depth maps. To address this challenge, FusionVision merges state-of-the-art object detection techniques with advanced instance segmentation methods. The integration of these components enables a holistic interpretation of RGB-D data, unifying the analysis of information from both the color (RGB) and depth (D) channels and facilitating the extraction of comprehensive and accurate object information, in order to improve downstream tasks such as 6D object pose estimation, Simultaneous Localization and Mapping (SLAM), and accurate 3D dataset extraction. The proposed FusionVision pipeline employs YOLO to identify objects within the RGB image domain. Subsequently, FastSAM, an innovative segmentation model, is applied to delineate object boundaries, yielding refined segmentation masks. The synergy between these components and their integration into 3D scene understanding ensures a cohesive fusion of object detection and segmentation, enhancing overall precision in 3D object segmentation.
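The core fusion step this abstract describes, lifting a 2D segmentation mask into a 3D object point cloud via the depth channel, can be sketched in plain NumPy under a standard pinhole camera model. This is an illustrative sketch, not the paper's implementation: the function name, the toy intrinsics, and the synthetic data are all assumptions.

```python
import numpy as np

def mask_to_pointcloud(depth, mask, fx, fy, cx, cy):
    """Back-project the depth pixels selected by a segmentation mask
    into 3D camera coordinates (pinhole model)."""
    v, u = np.nonzero(mask)                    # rows/cols of masked pixels
    z = depth[v, u]
    u, v, z = u[z > 0], v[z > 0], z[z > 0]     # drop invalid (zero) depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=1)         # (N, 3) object point cloud

# Toy example: a 4x4 depth map with the object mask in the top-left corner
depth = np.full((4, 4), 2.0)
depth[0, 0] = 0.0                              # one missing depth reading
mask = np.zeros((4, 4), dtype=bool)
mask[:2, :2] = True
points = mask_to_pointcloud(depth, mask, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(points.shape)                            # (3, 3): 3 valid 3D points
```

In a real pipeline the mask would come from FastSAM prompted by a YOLO box, and the intrinsics from the RGB-D camera's calibration; the back-projection itself is unchanged.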
Affiliation(s)
- Ali Oukhrid
- Independent Researcher, 2502 Biel/Bienne, Switzerland
- Hichem Nouira
- LNE Laboratoire National de Métrologie et d'Essais, 75015 Paris, France
2
Zidek J, Sudakova A, Smilek J, Nguyen DA, Ngoc HL, Ha LM. Explorative Image Analysis of Methylene Blue Interactions with Gelatin in Polypropylene Nonwoven Fabric Membranes: A Potential Future Tool for the Characterization of the Diffusion Process. Gels 2023; 9:888. [PMID: 37998978] [PMCID: PMC10671130] [DOI: 10.3390/gels9110888] [Received: 07/25/2023] [Revised: 10/31/2023] [Accepted: 11/03/2023]
Abstract
This manuscript explores the interaction between methylene blue dye and gelatin within a membrane using spectroscopy and image analysis. Emphasis is placed on methylene blue's unique properties, specifically its ability to oscillate between two distinct resonance states, each with unique light absorption characteristics. Image analysis serves as a tool for examining dye diffusion and absorption. The results indicate a correlation between dye concentrations and membrane thickness. Thinner layers exhibit a consistent dye concentration, implying an even distribution of the dye during the diffusion process. However, thicker layers display varying concentrations at different edges, suggesting the establishment of a diffusion gradient. Moreover, the authors observe an increased concentration of gelatin at the peripheries rather than at the center, possibly due to the swelling of the dried sample and a potential water concentration gradient. The manuscript concludes by suggesting image analysis as a practical alternative to spectral analysis, particularly for detecting whether methylene blue has been adsorbed onto the macromolecular network. These findings significantly enhance the understanding of the complex interactions between methylene blue and gelatin in a membrane and lay a solid foundation for future research in this field.
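The gradient detection described above, comparing dye concentration at the two edges of a layer to distinguish an even distribution from a diffusion gradient, can be illustrated with a minimal NumPy sketch. The function names, the column-wise profile orientation, and the tolerance threshold are assumptions for illustration, not the authors' actual procedure.

```python
import numpy as np

def concentration_profile(image):
    """Mean intensity along the diffusion axis: one value per column."""
    return image.mean(axis=0)

def diffusion_gradient(profile, tol=0.05):
    """Report whether the two edges differ enough to suggest a gradient."""
    return abs(float(profile[0]) - float(profile[-1])) > tol

# Thin layer: uniform dye -> no gradient; thick layer: linear ramp -> gradient
thin = np.full((10, 20), 0.40)
thick = np.tile(np.linspace(0.1, 0.9, 20), (10, 1))
print(diffusion_gradient(concentration_profile(thin)))   # False
print(diffusion_gradient(concentration_profile(thick)))  # True
```

On real micrographs the intensity values would first be converted to concentration via a calibration curve; the edge-versus-edge comparison is the same.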
Affiliation(s)
- Jan Zidek
- Central European Institute of Technology (CEITEC), Brno University of Technology, Purkynova 123, 612 00 Brno, Czech Republic
- Anna Sudakova
- Central European Institute of Technology (CEITEC), Brno University of Technology, Purkynova 123, 612 00 Brno, Czech Republic
- Faculty of Chemistry, Brno University of Technology, Purkynova 464/118, 612 00 Brno, Czech Republic
- Jiri Smilek
- Faculty of Chemistry, Brno University of Technology, Purkynova 464/118, 612 00 Brno, Czech Republic
- Duc Anh Nguyen
- Center for Research and Technology Transfer (CRETECH), Vietnam Academy of Science and Technology (VAST), 18-Hoang Quoc Viet, Nghia Do, Cau Giay, Hanoi 100000, Vietnam (H.L.N.)
- Hung Le Ngoc
- Center for Research and Technology Transfer (CRETECH), Vietnam Academy of Science and Technology (VAST), 18-Hoang Quoc Viet, Nghia Do, Cau Giay, Hanoi 100000, Vietnam (H.L.N.)
- Graduate University of Science and Technology (GUST), Vietnam Academy of Science and Technology (VAST), 18-Hoang Quoc Viet, Nghia Do, Cau Giay, Hanoi 100000, Vietnam
- Le Minh Ha
- Institute of Natural Products Chemistry (INPC), Vietnam Academy of Science and Technology (VAST), 18-Hoang Quoc Viet, Nghia Do, Cau Giay, Hanoi 100000, Vietnam
3
Harandi N, Vandenberghe B, Vankerschaver J, Depuydt S, Van Messem A. How to make sense of 3D representations for plant phenotyping: a compendium of processing and analysis techniques. Plant Methods 2023; 19:60. [PMID: 37353846] [DOI: 10.1186/s13007-023-01031-z] [Received: 10/18/2022] [Accepted: 05/19/2023]
Abstract
Computer vision technology is moving more and more towards a three-dimensional approach, and plant phenotyping is following this trend. However, despite its potential, the complexity of analyzing 3D representations has been the main bottleneck hindering the wider deployment of 3D plant phenotyping. In this review, we provide an overview of the typical steps for processing and analyzing 3D representations of plants, to offer potential users of 3D phenotyping a first gateway into its application and to stimulate its further development. We focus on plant phenotyping applications where the goal is to measure characteristics of single plants or crop canopies on a small scale in research settings, as opposed to large-scale crop monitoring in the field.
Affiliation(s)
- Negin Harandi
- Center for Biosystems and Biotech Data Science, Ghent University Global Campus, 119 Songdomunhwa-ro, Yeonsu-gu, Incheon, South Korea
- Department of Applied Mathematics, Computer Science and Statistics, Ghent University, Krijgslaan 281, S9, Ghent, Belgium
- Joris Vankerschaver
- Center for Biosystems and Biotech Data Science, Ghent University Global Campus, 119 Songdomunhwa-ro, Yeonsu-gu, Incheon, South Korea
- Department of Applied Mathematics, Computer Science and Statistics, Ghent University, Krijgslaan 281, S9, Ghent, Belgium
- Stephen Depuydt
- Erasmus Applied University of Sciences and Arts, Campus Kaai, Nijverheidskaai 170, Anderlecht, Belgium
- Arnout Van Messem
- Department of Mathematics, Université de Liège, Allée de la Découverte 12, Liège, Belgium
4
Liu Y, Yuan H, Zhao X, Fan C, Cheng M. Fast reconstruction method of three-dimension model based on dual RGB-D cameras for peanut plant. Plant Methods 2023; 19:17. [PMID: 36843020] [PMCID: PMC9969713] [DOI: 10.1186/s13007-023-00998-z] [Received: 07/31/2022] [Accepted: 02/20/2023]
Abstract
BACKGROUND Plant shape and structure are important factors in peanut breeding research. Constructing a three-dimensional (3D) model provides an effective digital tool for comprehensive and quantitative analysis of peanut plant structure. Speed and accuracy are the perennial goals of plant 3D model reconstruction research. RESULTS We propose a fast and accurate 3D reconstruction method for peanut plants based on dual RGB-D cameras. Two Kinect v2 cameras were placed mirror-symmetrically on either side of the peanut plant, and the acquired point cloud data were filtered twice to remove noise. After rotation and translation based on the corresponding geometric relationship, the point clouds acquired by the two cameras were converted to the same coordinate system and spliced into the 3D structure of the peanut plant. The experiment was conducted at various growth stages on twenty potted peanuts. The plant traits height, width, length, and volume were calculated from the reconstructed 3D models, with manual measurements taken during the experiments for comparison. The accuracy of the 3D models was evaluated through a synthetic coefficient, generated by averaging the accuracy of the four traits. The results show that the average accuracy of the peanut plant 3D models reconstructed by this method is 93.42%. A comparative experiment with the iterative closest point (ICP) algorithm, a widely used 3D modeling algorithm, was additionally implemented to test the rapidity of this method; the proposed method is 2.54 times faster than ICP with comparable accuracy. CONCLUSIONS The reconstruction method described in this paper can rapidly and accurately establish a 3D model of the peanut plant and also meets the modeling requirements of breeding processes for other species. This study offers a potential tool to further explore 3D models for improving plant traits and agronomic qualities.
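The splice step in the abstract above, rotating and translating one camera's point cloud into the other's frame and concatenating, amounts to a single rigid transform. The NumPy sketch below is illustrative only: the 180° rotation about the vertical axis and the offset along the optical axis stand in for the calibrated geometry of the two mirrored Kinect v2 cameras.

```python
import numpy as np

def merge_clouds(cloud_ref, cloud_other, R, t):
    """Apply the rigid transform (R, t) to `cloud_other`, expressing it
    in the reference camera's frame, then concatenate the two clouds."""
    transformed = cloud_other @ R.T + t
    return np.vstack([cloud_ref, transformed])

# Mirror-symmetric setup: the second view is rotated 180 deg about the
# vertical (y) axis and offset along the optical (z) axis.
R = np.array([[-1.0, 0.0,  0.0],
              [ 0.0, 1.0,  0.0],
              [ 0.0, 0.0, -1.0]])
t = np.array([0.0, 0.0, 3.0])

front = np.array([[0.1, 0.5, 1.0]])   # point seen by the front camera
back  = np.array([[-0.1, 0.5, 2.0]])  # same region seen from behind
plant = merge_clouds(front, back, R, t)
print(plant.shape)                    # (2, 3)
```

Note that the transformed back-camera point lands at `[0.1, 0.5, 1.0]`, coinciding with the front-camera observation, which is exactly what a correct extrinsic calibration achieves; ICP, by contrast, estimates (R, t) iteratively, which is where the reported speed difference comes from.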
Affiliation(s)
- Yadong Liu
- College of Mechanical and Electrical Engineering, Hebei Agricultural University, Baoding, 071001, China
- Hongbo Yuan
- College of Mechanical and Electrical Engineering, Hebei Agricultural University, Baoding, 071001, China
- Xin Zhao
- College of Mechanical and Electrical Engineering, Hebei Agricultural University, Baoding, 071001, China
- Caihu Fan
- College of Mechanical and Electrical Engineering, Hebei Agricultural University, Baoding, 071001, China
- Man Cheng
- College of Mechanical and Electrical Engineering, Hebei Agricultural University, Baoding, 071001, China