1. Barbosa Júnior MR, Santos RGD, Sales LDA, Oliveira LPD. Advancements in Agricultural Ground Robots for Specialty Crops: An Overview of Innovations, Challenges, and Prospects. Plants (Basel) 2024; 13:3372. [PMID: 39683165] [DOI: 10.3390/plants13233372]
Abstract
Robotic technologies are affording opportunities to revolutionize the production of specialty crops (fruits, vegetables, tree nuts, and horticulture). They offer the potential to automate tasks and save inputs such as labor, fertilizer, and pesticides. Specialty crops are well known for their high economic value and nutritional benefits, making their production particularly impactful. While previous review papers have discussed the evolution of agricultural robots in a general agricultural context, this review uniquely focuses on their application to specialty crops, a rapidly expanding area. We therefore aimed to develop a state-of-the-art review that contributes scientifically to the understanding of the following: (i) the primary areas of robot application for specialty crops; (ii) the specific benefits they offer; (iii) their current limitations; and (iv) opportunities for future investigation. We formulated a comprehensive search strategy, using Scopus® and Web of Science™ as databases and selecting "robot" and "specialty crops" as the main keywords. Following a critical screening process, only peer-reviewed research papers were considered, resulting in the inclusion of 907 papers covering the period from 1988 to 2024. Each paper was thoroughly evaluated based on its title, abstract, keywords, methods, conclusions, and declarations. Our analysis revealed that interest in agricultural robots for specialty crops has increased significantly over the past decade, driven mainly by technological advancements in computer vision and recognition systems. Harvesting robots have emerged as the primary focus, while robots for spraying, pruning, weed control, pollination, transplanting, and fertilizing are emerging subjects for further research and development (R&D) strategies. Ultimately, our findings reveal the dynamics of agricultural robots in the world of specialty crops while supporting suitable practices for more sustainable and resilient agriculture, indicating a new era of innovation and efficiency in agriculture.
2. Pinheiro I, Moreira G, Magalhães S, Valente A, Cunha M, Dos Santos FN. Deep learning based approach for actinidia flower detection and gender assessment. Sci Rep 2024; 14:24452. [PMID: 39424618] [PMCID: PMC11489756] [DOI: 10.1038/s41598-024-73035-1]
Abstract
Pollination is critical for crop development, especially those essential for subsistence. This study addresses the pollination challenges faced by Actinidia, a dioecious plant characterized by female and male flowers on separate plants. Despite the high protein content of pollen, the absence of nectar in kiwifruit flowers poses difficulties in attracting pollinators. Consequently, there is a growing interest in using artificial intelligence and robotic solutions to enable pollination even in unfavourable conditions. These robotic solutions must be able to accurately detect flowers and discern their genders for precise pollination operations. Specifically, upon identifying female Actinidia flowers, the robotic system should approach the stigma to release pollen, while male Actinidia flowers should target the anthers to collect pollen. We identified two primary research gaps: (1) the lack of gender-based flower detection methods and (2) the underutilisation of contemporary deep learning models in this domain. To address these gaps, we evaluated the performance of four pretrained models (YOLOv8, YOLOv5, RT-DETR and DETR) in detecting and determining the gender of Actinidia flowers. We outlined a comprehensive methodology and developed a dataset of manually annotated flowers categorized into two classes based on gender. Our evaluation utilised k-fold cross-validation to rigorously test model performance across diverse subsets of the dataset, addressing the limitations of conventional data splitting methods. DETR provided the most balanced overall performance, achieving precision, recall, F1 score and mAP of 89%, 97%, 93% and 94%, respectively, highlighting its robustness in managing complex detection tasks under varying conditions. These findings underscore the potential of deep learning models for effective gender-specific detection of Actinidia flowers, paving the way for advanced robotic pollination systems.
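The per-class evaluation this abstract reports (precision, recall, F1 for female and male flowers) can be sketched in a few lines. This is an illustrative re-implementation, not the authors' code: the (x1, y1, x2, y2) box format, the greedy matching, and the 0.5 IoU threshold are all assumptions.

```python
# Illustrative sketch, not the authors' evaluation code: scoring
# gender-specific flower detections against ground-truth annotations
# by greedy IoU matching.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def detection_scores(preds, truths, iou_thr=0.5):
    """Precision, recall, F1 for lists of (gender, box) detections."""
    matched, tp = set(), 0
    for p_cls, p_box in preds:
        for i, (t_cls, t_box) in enumerate(truths):
            if i not in matched and p_cls == t_cls and iou(p_box, t_box) >= iou_thr:
                matched.add(i)
                tp += 1
                break
    precision = tp / len(preds) if preds else 0.0
    recall = tp / len(truths) if truths else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1
```

On a toy set of detections this reproduces the usual precision/recall trade-off; the figures reported in the paper come from k-fold cross-validation over the full annotated dataset.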
Affiliation(s)
- Isabel Pinheiro
  - Institute for Systems and Computer Engineering, Technology and Science (INESC TEC), Porto, 4200-465, Portugal
  - School of Science and Technology, University of Trás-os-Montes e Alto Douro, Vila Real, 5000-801, Portugal
- Germano Moreira
  - Institute for Systems and Computer Engineering, Technology and Science (INESC TEC), Porto, 4200-465, Portugal
  - Faculty of Sciences, University of Porto, Porto, 4169-007, Portugal
- Sandro Magalhães
  - Institute for Systems and Computer Engineering, Technology and Science (INESC TEC), Porto, 4200-465, Portugal
  - Faculty of Engineering, University of Porto, Porto, 4200-465, Portugal
- António Valente
  - Institute for Systems and Computer Engineering, Technology and Science (INESC TEC), Porto, 4200-465, Portugal
  - School of Science and Technology, University of Trás-os-Montes e Alto Douro, Vila Real, 5000-801, Portugal
- Mário Cunha
  - Institute for Systems and Computer Engineering, Technology and Science (INESC TEC), Porto, 4200-465, Portugal
  - Faculty of Sciences, University of Porto, Porto, 4169-007, Portugal
- Filipe Neves Dos Santos
  - Institute for Systems and Computer Engineering, Technology and Science (INESC TEC), Porto, 4200-465, Portugal
3. Pan F, Hu M, Duan X, Zhang B, Xiang P, Jia L, Zhao X, He D. Enhancing kiwifruit flower pollination detection through frequency domain feature fusion: a novel approach to agricultural monitoring. Front Plant Sci 2024; 15:1415884. [PMID: 39119504] [PMCID: PMC11306074] [DOI: 10.3389/fpls.2024.1415884]
Abstract
The pollination process of kiwifruit flowers plays a crucial role in kiwifruit yield, and accurate, rapid identification of the four stages of kiwifruit flowers is essential for enhancing pollination efficiency. In this study, to improve the efficiency of kiwifruit pollination, we propose a novel full-stage kiwifruit flower pollination detection algorithm named KIWI-YOLO, based on the fusion of frequency-domain features. Our algorithm leverages frequency-domain and spatial-domain information to improve recognition of contour-detailed features and integrates decision-making with contextual information. Additionally, we incorporate the Bi-Level Routing Attention (BRA) mechanism with C3 to enhance the algorithm's focus on critical areas, resulting in accurate, lightweight, and fast detection. The algorithm achieves a mAP@0.5 of 91.6% with only 1.8M parameters, and the AP of the Female and Male classes reaches 95% and 93.5%, improvements of 3.8%, 1.2%, and 6.2%, respectively, over the original algorithm. Furthermore, the Recall and F1-score of the algorithm are enhanced by 5.5% and 3.1%, respectively, and the model demonstrates significant advantages in detection speed, taking only 0.016 s to process an image. The experimental results show that the proposed model can better assist kiwifruit pollination in precision agriculture production and help the development of the kiwifruit industry.
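The idea behind a frequency-domain branch can be illustrated outside the network. The sketch below is a toy, not KIWI-YOLO's internal fusion: it splits an image into low- and high-frequency components with an FFT, the high-frequency part carrying the contour detail the fusion is meant to recover; the cutoff radius is an arbitrary assumption.

```python
import numpy as np

# Toy illustration only, not KIWI-YOLO's fusion module: an FFT-based split of
# an image into low-frequency (smooth regions) and high-frequency (contours,
# fine detail) components.

def frequency_split(img, radius=8):
    """Split a 2-D grayscale image into (low, high) frequency components."""
    f = np.fft.fftshift(np.fft.fft2(img))            # centre the spectrum
    h, w = img.shape
    yy, xx = np.ogrid[:h, :w]
    lowpass = (yy - h // 2) ** 2 + (xx - w // 2) ** 2 <= radius ** 2
    low = np.fft.ifft2(np.fft.ifftshift(f * lowpass)).real
    high = np.fft.ifft2(np.fft.ifftshift(f * ~lowpass)).real
    return low, high
```

By construction the two components sum back to the original image, so nothing is lost by processing them in separate branches before fusing.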
Affiliation(s)
- Fei Pan
  - College of Information Engineering, Sichuan Agricultural University, Ya’an, China
  - Ya’an Digital Agricultural Engineering Technology Research Center, Sichuan Agricultural University, Ya’an, China
- Mengdie Hu
  - College of Information Engineering, Sichuan Agricultural University, Ya’an, China
  - Agricultural Information Engineering Higher Institution Key Laboratory of Sichuan Province, Sichuan Agricultural University, Ya’an, China
- Xuliang Duan
  - College of Information Engineering, Sichuan Agricultural University, Ya’an, China
  - Ya’an Digital Agricultural Engineering Technology Research Center, Sichuan Agricultural University, Ya’an, China
- Boda Zhang
  - College of Information Engineering, Sichuan Agricultural University, Ya’an, China
  - Agricultural Information Engineering Higher Institution Key Laboratory of Sichuan Province, Sichuan Agricultural University, Ya’an, China
- Pengjun Xiang
  - College of Information Engineering, Sichuan Agricultural University, Ya’an, China
  - Agricultural Information Engineering Higher Institution Key Laboratory of Sichuan Province, Sichuan Agricultural University, Ya’an, China
- Lan Jia
  - College of Information Engineering, Sichuan Agricultural University, Ya’an, China
  - Agricultural Information Engineering Higher Institution Key Laboratory of Sichuan Province, Sichuan Agricultural University, Ya’an, China
- Xiaoyu Zhao
  - College of Information Engineering, Sichuan Agricultural University, Ya’an, China
  - Agricultural Information Engineering Higher Institution Key Laboratory of Sichuan Province, Sichuan Agricultural University, Ya’an, China
- Dawei He
  - College of Information Engineering, Sichuan Agricultural University, Ya’an, China
  - Agricultural Information Engineering Higher Institution Key Laboratory of Sichuan Province, Sichuan Agricultural University, Ya’an, China
4. Xing Z, Zhang Z, Wang Y, Xu P, Guo Q, Zeng C, Shi R. SDC-DeepLabv3+: Lightweight and Precise Localization Algorithm for Safflower-Harvesting Robots. Plant Phenomics 2024; 6:0194. [PMID: 38974378] [PMCID: PMC11224119] [DOI: 10.34133/plantphenomics.0194]
Abstract
Harvesting robots have difficulty extracting safflower filament phenotypes: the filaments are small and numerous, heavily cross-obscured, and phenotypically similar to other organs, and the near-colored background and fuzzy contour features make localization difficult, preventing accurate filament harvesting. Therefore, a method for detecting and locating filament picking points based on an improved DeepLabv3+ algorithm is proposed in this study. A lightweight network structure, ShuffleNetV2, was used to replace the Xception backbone of the traditional DeepLabv3+. Convolutional branches with 3 different sampling rates were added to extract safflower features across receptive fields. Convolutional block attention was incorporated into feature extraction at the coding and decoding layers to address interference from the near-color background during feature fusion. Then, using the region of interest of the safflower branch obtained by the improved DeepLabv3+, an algorithm for filament picking-point localization was designed based on barycenter projection. Tests demonstrated that this method accurately localizes the filament: the mean pixel accuracy and mean intersection over union of the improved DeepLabv3+ were 95.84% and 96.87%, respectively, and its detection rate and weights file size were superior to those of other algorithms. In the localization test, a depth-measurement distance of 450 to 510 mm between the depth camera and the target safflower filament minimized the visual-localization error. The average localization and picking success rates were 92.50% and 90.83%, respectively. The results show that the proposed localization method offers a viable approach for accurate harvesting localization.
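The barycenter-projection step might look like the following sketch; details such as the column-wise projection and the fallback are our assumptions for illustration, not the paper's implementation. The segmented ROI arrives as a binary mask, its barycenter is computed, and the barycenter is projected back onto the mask so the picking point lies on the target.

```python
import numpy as np

# Minimal sketch (assumed details, not the paper's implementation) of
# barycenter-projection picking-point localization on a binary ROI mask.

def picking_point(mask):
    """mask: 2-D boolean array from the segmentation stage -> (row, col)."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None                                   # nothing segmented
    cy, cx = ys.mean(), xs.mean()                     # barycenter of the ROI
    col = int(np.round(cx))
    col_ys = ys[xs == col]
    if len(col_ys) == 0:                              # barycenter column empty
        return int(np.round(cy)), col
    row = col_ys[np.argmin(np.abs(col_ys - cy))]      # closest mask pixel in column
    return int(row), col
```

The projection matters because the raw barycenter of a curved or forked region can fall outside the region itself, which would send the end-effector to empty space.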
Affiliation(s)
- Zhenyu Xing
  - College of Mechanical and Electrical Engineering, Xinjiang Agricultural University, Urumqi 830052, China
  - Key Laboratory of Intelligent Equipment and Robotics for Agriculture of Zhejiang Province, Hangzhou 310058, China
- Zhenguo Zhang
  - College of Mechanical and Electrical Engineering, Xinjiang Agricultural University, Urumqi 830052, China
  - Key Laboratory of Xinjiang Intelligent Agricultural Equipment, Urumqi 830052, China
- Yunze Wang
  - College of Mechanical and Electrical Engineering, Xinjiang Agricultural University, Urumqi 830052, China
- Peng Xu
  - College of Mechanical and Electrical Engineering, Xinjiang Agricultural University, Urumqi 830052, China
- Quanfeng Guo
  - College of Mechanical and Electrical Engineering, Xinjiang Agricultural University, Urumqi 830052, China
- Chao Zeng
  - College of Mechanical and Electrical Engineering, Xinjiang Agricultural University, Urumqi 830052, China
- Ruimeng Shi
  - College of Mechanical and Electrical Engineering, Xinjiang Agricultural University, Urumqi 830052, China
5. Nakatani T, Utsumi Y, Fujimoto K, Iwamura M, Kise K. Image recognition-based petal arrangement estimation. Front Plant Sci 2024; 15:1334362. [PMID: 38638358] [PMCID: PMC11024381] [DOI: 10.3389/fpls.2024.1334362]
Abstract
Flowers exhibit morphological diversity in the number and positional arrangement of their floral organs, such as petals. The petal arrangement of a blooming flower is represented by the overlap relations between neighboring petals, an indicator of the floral developmental process; however, only specialists are capable of identifying petal arrangements. We therefore propose a method to support estimation of the arrangement of the perianth organs, including petals and tepals, using image recognition techniques. The obstacle to realizing such a method is that a large image dataset cannot be prepared, so the latest machine-learning-based image processing methods, which require large numbers of images, cannot be applied. Instead, we describe the tepal arrangement as a sequence of interior-exterior patterns of tepal overlap in the image and estimate the arrangement by matching this pattern against known patterns. We implement the method with techniques that require little or no training data: a fine-tuned YOLOv5 model for flower detection, GrabCut for flower segmentation, the Harris corner detector for tepal overlap detection, MAML-based interior-exterior estimation, and circular permutation matching for tepal arrangement estimation. Experimental results showed good accuracy when flower detection, segmentation, overlap location estimation, interior-exterior estimation, and circular-permutation-matching-based tepal arrangement estimation were evaluated independently; however, accuracy decreased when they were integrated. We therefore developed a user interface for manual correction of the overlap position and interior-exterior pattern estimates, which ensures the quality of tepal arrangement estimation.
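The final matching step reduces to comparing a cyclic 0/1 (interior/exterior) sequence against known patterns under rotation. A minimal sketch follows; the pattern names and values are invented for illustration, not taken from the paper.

```python
# Sketch of circular-permutation matching: the tepal overlaps are read off as
# a cyclic sequence of interior (0) / exterior (1) codes, and an observed
# sequence matches a known arrangement if some rotation of it equals the
# stored pattern. Pattern names and values below are invented.

def rotations(seq):
    """All cyclic rotations of a sequence, as a set of tuples."""
    seq = list(seq)
    return {tuple(seq[i:] + seq[:i]) for i in range(len(seq))}

def match_arrangement(observed, known_patterns):
    """Return the name of the first known pattern matched under rotation."""
    obs = rotations(observed)
    for name, pattern in known_patterns.items():
        if tuple(pattern) in obs:
            return name
    return None
```

Matching up to rotation is what makes the representation usable in practice: the overlap sequence can be read off starting from any tepal, so all rotations of one arrangement must be treated as equivalent.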
Affiliation(s)
- Tomoya Nakatani
  - Graduate School of Informatics, Osaka Metropolitan University, Sakai, Japan
- Yuzuko Utsumi
  - Graduate School of Informatics, Osaka Metropolitan University, Sakai, Japan
- Koichi Fujimoto
  - Graduate School of Integrated Sciences for Life, Hiroshima University, Higashi-Hiroshima, Japan
- Masakazu Iwamura
  - Graduate School of Informatics, Osaka Metropolitan University, Sakai, Japan
- Koichi Kise
  - Graduate School of Informatics, Osaka Metropolitan University, Sakai, Japan
6. Yerebakan MO, Hu B. Wearable Sensors Assess the Effects of Human-Robot Collaboration in Simulated Pollination. Sensors (Basel) 2024; 24:577. [PMID: 38257670] [PMCID: PMC10821395] [DOI: 10.3390/s24020577]
Abstract
Pollination in indoor agriculture is hampered by environmental conditions, requiring farmers to pollinate manually, which increases workers' risk of musculoskeletal illness. A potential solution involves Human-Robot Collaboration (HRC) using wearable sensor-based human motion tracking. However, the physical and biomechanical aspects of human interaction with an advanced and intelligent collaborative robot (cobot) during pollination remain unknown. This study explores the impact of HRC and plant height on upper body joint angles during pollination tasks. HRC generally resulted in a significant reduction in joint angles, with flexion decreasing by an average of 32.6 degrees (p ≤ 0.001) for both shoulders and 30.5 degrees (p ≤ 0.001) for the elbows, and shoulder rotation decreasing by an average of 19.1 degrees (p ≤ 0.001). However, HRC increased left elbow supination by 28.3 degrees (p ≤ 0.001). The positive effects of HRC were reversed when the robot was unreliable (i.e., missed its target), except for the left elbow. The effect of plant height was limited, with higher plants increasing right shoulder rotation but decreasing right elbow pronation. These findings shed light on both the benefits and challenges of HRC in agriculture, providing valuable insights before deploying cobots in indoor agricultural settings.
Affiliation(s)
- Boyi Hu
  - Department of Industrial and Systems Engineering, University of Florida, Gainesville, FL 32611, USA
7. Perception, Path Planning, and Flight Control for a Drone-Enabled Autonomous Pollination System. Robotics 2022. [DOI: 10.3390/robotics11060144]
Abstract
The decline of natural pollinators necessitates the development of novel pollination technologies. In this work, we propose a drone-enabled autonomous pollination system (APS) that consists of five primary modules: environment sensing, flower perception, path planning, flight control, and pollination mechanisms. These modules are highly dependent upon each other, with each module relying on inputs from the other modules. In this paper, we focus on approaches to the flower perception, path planning, and flight control modules. First, we briefly introduce a flower perception method from our previous work to create a map of flower locations. With a map of flowers, APS path planning is defined as a variant of the Travelling Salesman Problem (TSP). Two path planning approaches are compared based on mixed-integer programming (MIP) and genetic algorithms (GA), respectively. The GA approach is chosen as the superior approach due to the vast computational savings with negligible loss of optimality. To accurately follow the generated path for pollination, we develop a convex optimization approach to the quadrotor flight control problem (QFCP). This approach solves two convex problems. The first problem is a convexified three degree-of-freedom QFCP. The solution to this problem is used as an initial guess to the second convex problem, which is a linearized six degree-of-freedom QFCP. It is found that changing the objective of the second convex problem to minimize the deviation from the initial guess provides improved physical feasibility and solutions similar to a general-purpose optimizer. The path planning and flight control approaches are then tested within a model predictive control (MPC) framework where significant computational savings and embedded adjustments to uncertainty are observed. Coupling the two modules together provides a simple demonstration of how the entire APS will operate in practice.
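The GA variant of the flower-visit ordering (TSP) step can be sketched with standard textbook operators; the order crossover, swap mutation, elitism, and parameter values below are illustrative assumptions, not the paper's exact configuration.

```python
import random

# Toy genetic algorithm for ordering flower visits (a TSP variant), in the
# spirit the abstract describes. Operators and parameters are standard
# textbook choices, not the paper's exact ones.

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def tour_length(tour, pts):
    """Length of the closed tour visiting pts in the given order."""
    return sum(dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def order_crossover(p1, p2, rng):
    """OX: copy a slice from p1, fill the rest in p2's order."""
    n = len(p1)
    i, j = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[i:j] = p1[i:j]
    fill = [c for c in p2 if c not in child]
    for k in range(n):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def ga_tsp(pts, pop_size=60, gens=200, seed=0):
    rng = random.Random(seed)
    n = len(pts)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda t: tour_length(t, pts))
        elite = pop[:pop_size // 5]               # elitism keeps the best tours
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            c = order_crossover(a, b, rng)
            if rng.random() < 0.2:                # swap mutation
                i, j = rng.sample(range(n), 2)
                c[i], c[j] = c[j], c[i]
            children.append(c)
        pop = elite + children
    return min(pop, key=lambda t: tour_length(t, pts))
```

The appeal noted in the abstract is visible even at this scale: each generation costs only sorting and crossover, so runtime grows gently with the number of flowers, whereas exact mixed-integer formulations blow up combinatorially.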
8. Li Z, Yuan X, Wang C. A review on structural development and recognition–localization methods for end-effector of fruit–vegetable picking robots. Int J Adv Robot Syst 2022. [DOI: 10.1177/17298806221104906]
Abstract
The excellent performance of fruit and vegetable picking robots usually derives from a well-designed end-effector structure and highly accurate recognition-localization methods. Effort has accordingly concentrated on these two aspects, continuously yielding diverse end-effector structures, target recognition methods, and combinations of the two. A good understanding of their working principles, advantages, limitations, and adaptability to particular settings is helpful when designing picking robots. Therefore, the main characteristics of traditional schemes are first described according to their grasping ways, separating methods, structures, materials, and driving modes. Underactuated manipulators and soft manipulators, which represent future development, are then summarized systematically in terms of technical routes, advantages, potential applications, and challenges. Next, recognition and localization methods are reviewed: current recognition approaches based on single features, multi-feature fusion, and deep learning are explained in terms of their advantages, limitations, and successful instances. For 3D localization, active vision based on structured light, laser scanning, time of flight, and radar is illustrated through respective applications, and passive vision (monocular, binocular, and multiocular) is evaluated by advantages, limitations, degree of automation, reconstruction effects, and application scenarios. Finally, viewed from structural development and recognition-localization methods, it is possible to develop future end-effectors for fruit and vegetable picking robots with superior characteristics: fewer driving elements, rigid-flexible-bionic coupled soft manipulators, simple control programs, high efficiency, low damage, low cost, high versatility, and high recognition accuracy in all-season picking tasks.
Affiliation(s)
- Ziyue Li
  - School of Automotive Engineering, Hubei University of Automotive Technology, Shiyan, PR China
- Xianju Yuan
  - School of Automotive Engineering, Hubei University of Automotive Technology, Shiyan, PR China
  - Department of Systems Design Engineering, University of Waterloo, Waterloo, Canada
- Chuyan Wang
  - School of Automotive Engineering, Hubei University of Automotive Technology, Shiyan, PR China
9. Chen Y, Feng K, Lu J, Hu Z. Machine vision on the positioning accuracy evaluation of poultry viscera in the automatic evisceration robot system. Int J Food Prop 2021. [DOI: 10.1080/10942912.2021.1947315]
Affiliation(s)
- Yan Chen
  - School of Mechanical Engineering, Wuhan Polytechnic University, Wuhan, China
- Ke Feng
  - School of Mechanical Engineering, Wuhan Polytechnic University, Wuhan, China
- Jianjian Lu
  - School of Mechanical Engineering, Wuhan Polytechnic University, Wuhan, China
- Zhigang Hu
  - School of Mechanical Engineering, Wuhan Polytechnic University, Wuhan, China
10.
Abstract
The constant advances in agricultural robotics aim to overcome the challenges imposed by population growth, accelerated urbanization, intense competition for high-quality products, environmental preservation, and a lack of qualified labor. In this sense, this review paper surveys the main existing applications of agricultural robotic systems for land preparation before planting, sowing, planting, plant treatment, harvesting, yield estimation, and phenotyping. All robots were evaluated according to the following criteria: locomotion system, final application, presence of sensors, a robotic arm, and/or computer vision algorithms, development stage, and the country and continent of origin. After evaluating these shared characteristics to expose research trends, common pitfalls, the characteristics that hinder commercial development, and which countries are investing in Research and Development (R&D) in these technologies, four major areas needing future work to advance the state of the art in smart agriculture were highlighted: locomotion systems, sensors, computer vision algorithms, and communication technologies. The results of this research suggest that investment in agricultural robotic systems supports both short-term objectives, such as harvest monitoring, and long-term objectives, such as yield estimation.
11. Williams H, Ting C, Nejati M, Jones MH, Penhall N, Lim J, Seabright M, Bell J, Ahn HS, Scarfe A, Duke M, MacDonald B. Improvements to and large-scale evaluation of a robotic kiwifruit harvester. J Field Robot 2019. [DOI: 10.1002/rob.21890]
Affiliation(s)
- Henry Williams
  - Centre for Automation and Robotic Engineering Science, University of Auckland, Auckland, New Zealand
- Canaan Ting
  - School of Engineering, University of Waikato, Hamilton, New Zealand
- Mahla Nejati
  - Centre for Automation and Robotic Engineering Science, University of Auckland, Auckland, New Zealand
- Mark Hedley Jones
  - Centre for Automation and Robotic Engineering Science, University of Auckland, Auckland, New Zealand
- Nicky Penhall
  - Centre for Automation and Robotic Engineering Science, University of Auckland, Auckland, New Zealand
- JongYoon Lim
  - Centre for Automation and Robotic Engineering Science, University of Auckland, Auckland, New Zealand
- Jamie Bell
  - Centre for Automation and Robotic Engineering Science, University of Auckland, Auckland, New Zealand
- Ho Seok Ahn
  - Centre for Automation and Robotic Engineering Science, University of Auckland, Auckland, New Zealand
- Mike Duke
  - School of Engineering, University of Waikato, Hamilton, New Zealand
- Bruce MacDonald
  - Centre for Automation and Robotic Engineering Science, University of Auckland, Auckland, New Zealand