1. A Visual Predictive Control Framework for Robust and Constrained Multi-Agent Formation Control. J Intell Robot Syst 2022. [DOI: 10.1007/s10846-022-01674-5]
3. Qiu Z, Hu S, Liang X. Disturbance observer based adaptive model predictive control for uncalibrated visual servoing in constrained environments. ISA Trans 2020; 106:40-50. [PMID: 32900474] [DOI: 10.1016/j.isatra.2020.06.013]
Abstract
This paper presents an adaptive model predictive control (MPC) method based on a disturbance observer (DOB) to improve the disturbance rejection performance of an image-based visual servoing (IBVS) system. The proposed method builds on the depth-independent interaction matrix and can simultaneously handle unknown camera intrinsic and extrinsic parameters, unknown depth parameters, system constraints, and external disturbances. The control scheme consists of two parts: a feedback regulation part based on adaptive MPC and a feedforward compensation part based on a modified DOB. Unlike a traditional DOB built on a fixed nominal plant model, the modified DOB is based on the estimated plant model. The adaptive MPC controller incorporates an iterative identification algorithm that supplies model parameters to both the controller and the modified DOB, regulates the plant dynamics, and minimizes the effects of disturbances. Simulations for both eye-in-hand and eye-to-hand camera configurations illustrate the effectiveness of the proposed method.
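The feedback-plus-feedforward structure described in this abstract can be illustrated on a toy one-dimensional servoing loop. The sketch below is not the paper's algorithm; the scalar plant e' = L·u + d, the gradient-style identifier, and all gains are illustrative assumptions. It only shows how an online model estimate (standing in for the iterative identification) can feed both the feedback controller and a DOB-style disturbance estimate.

```python
# Toy 1-D illustration of adaptive feedback control with a
# disturbance-observer-style feedforward term (illustrative only;
# the scalar plant model and all gains are assumptions, not the
# paper's actual adaptive MPC formulation).

def simulate(steps=600, dt=0.02):
    L_true, d_true = 2.0, 0.3          # unknown plant gain and disturbance
    L_hat, d_hat = 1.0, 0.0            # online estimates
    lam, gamma, beta = 1.5, 0.05, 0.5  # control / adaptation / observer gains
    e = 1.0                            # feature error to be driven to zero
    for _ in range(steps):
        # Control: feedback regulation plus feedforward disturbance compensation.
        u = (-lam * e - d_hat) / L_hat
        # Plant: true error dynamics e' = L*u + d (Euler step).
        e_new = e + dt * (L_true * u + d_true)
        e_dot = (e_new - e) / dt
        # Identification: gradient update of the plant-gain estimate.
        L_hat += gamma * (e_dot - L_hat * u - d_hat) * u
        # Observer: low-pass estimate of the unmodeled disturbance.
        d_hat += beta * (e_dot - L_hat * u - d_hat)
        e = e_new
    return e, L_hat, d_hat

if __name__ == "__main__":
    e, L_hat, d_hat = simulate()
    print(f"final error {e:.2e}, gain estimate {L_hat:.2f}, "
          f"disturbance estimate {d_hat:.2f}")
```

With these gains the feature error decays to zero while the observer absorbs both the constant disturbance and the residual model mismatch, mirroring the division of labor the abstract describes.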
Affiliation(s)
- Zhoujingzi Qiu
- School of Aeronautics and Astronautics, Sun Yat-sen University, Guangzhou, China.
- Shiqiang Hu
- School of Aeronautics and Astronautics, Shanghai Jiao Tong University, Shanghai, China.
- Xinwu Liang
- School of Aeronautics and Astronautics, Shanghai Jiao Tong University, Shanghai, China.
4. Norton JC, Slawinski PR, Lay HS, Martin JW, Cox BF, Cummins G, Desmulliez MP, Clutton RE, Obstein KL, Cochran S, Valdastri P. Intelligent magnetic manipulation for gastrointestinal ultrasound. Sci Robot 2019; 4:eaav7725. [PMID: 31380501] [PMCID: PMC6677276] [DOI: 10.1126/scirobotics.aav7725]
Abstract
Diagnostic endoscopy in the gastrointestinal tract has remained largely unchanged for decades and is limited to visualization of the tissue surface, collection of biopsy samples for diagnosis, and minor interventions such as clipping or tissue removal. In this work, we present the autonomous servoing of a magnetic capsule robot for in situ, subsurface diagnostics of microanatomy. We investigated and showed the feasibility of closed-loop magnetic control using digitized microultrasound (μUS) feedback, which is crucial for obtaining robust imaging in an unknown and unconstrained environment. We demonstrated the functionality of an autonomous servoing algorithm that uses μUS feedback, both in benchtop trials and in vivo in a porcine model. We validated this magnetic-μUS servoing in instances of autonomous linear probe motion and were able to locate markers in an agar phantom with 1.0 ± 0.9 mm position accuracy using a fusion of robot localization and μUS image information. This work demonstrates the feasibility of closed-loop robotic μUS imaging in the bowel without the need for either a rigid physical link between the transducer and extracorporeal tools or complex manual manipulation.
Affiliation(s)
- Keith L. Obstein
- STORM Lab USA, Vanderbilt University, Nashville, USA
- Vanderbilt University Medical Center, Nashville, USA
- Sandy Cochran
- University of Glasgow, School of Mechanical Engineering, Glasgow, UK
5. Lediju Bell MA, Shubert J. Photoacoustic-based visual servoing of a needle tip. Sci Rep 2018; 8:15519. [PMID: 30341371] [PMCID: PMC6195562] [DOI: 10.1038/s41598-018-33931-9]
Abstract
In intraoperative settings, the presence of acoustic clutter and reflection artifacts from metallic surgical tools often reduces the effectiveness of ultrasound imaging and complicates the localization of surgical tool tips. We propose an alternative approach for tool tracking and navigation in these challenging acoustic environments by augmenting ultrasound systems with a light source (to perform photoacoustic imaging) and a robot (to autonomously and robustly follow a surgical tool regardless of the tissue medium). The robotically controlled ultrasound probe continuously visualizes the location of the tool tip by segmenting and tracking photoacoustic signals generated from an optical fiber inside the tool. System validation in the presence of fat, muscle, brain, skull, and liver tissue, with and without an additional clutter layer, resulted in mean signal tracking errors of <2 mm, mean probe centering errors of <1 mm, and successful recovery from ultrasound perturbations representing either patient motion or a switch from photoacoustic to ultrasound images to search for a target of interest. A detailed analysis of channel SNR in controlled experiments with and without significant acoustic clutter revealed that detection of a needle tip is possible with photoacoustic imaging, particularly in cases where ultrasound imaging traditionally fails. These results show promise for guiding surgeries and procedures in acoustically challenging environments with this novel combination of robotics and photoacoustic imaging.
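The segment-and-center loop this abstract describes can be sketched in a few lines. The thresholded-centroid segmentation, the image geometry, and the proportional gain below are illustrative assumptions rather than the authors' implementation; the idea is simply that the lateral offset of the brightest photoacoustic signal from the image center becomes the probe's motion command.

```python
# Illustrative sketch of centroid-based tracking of a bright
# photoacoustic spot and a proportional probe-centering command.
# The segmentation rule and the gain are assumptions, not the paper's method.

def segment_centroid(image, rel_threshold=0.5):
    """Lateral (column) centroid of pixels above rel_threshold * peak."""
    peak = max(max(row) for row in image)
    cut = rel_threshold * peak
    total, weighted = 0.0, 0.0
    for row in image:
        for col, value in enumerate(row):
            if value >= cut:
                total += value
                weighted += value * col
    return weighted / total  # assumes at least one pixel passes the cut

def centering_command(image, gain=0.8):
    """Proportional lateral command (pixels/step) to recenter the signal."""
    center = (len(image[0]) - 1) / 2.0
    error = segment_centroid(image) - center
    return gain * error

# Example: a bright spot at column 7 of a 5x9 image.
frame = [[0.0] * 9 for _ in range(5)]
frame[2][7] = frame[3][7] = 1.0
```

For this synthetic frame the centroid falls at column 7, three pixels right of the image center (column 4), so the command is positive and proportional to that offset; rerunning the loop on each new frame is what keeps the tip centered despite motion.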
Affiliation(s)
- Muyinatu A Lediju Bell
- Johns Hopkins University, Department of Electrical and Computer Engineering, Baltimore, MD, 21218, USA
- Johns Hopkins University, Department of Biomedical Engineering, Baltimore, MD, 21218, USA
- Johns Hopkins University, Department of Computer Science, Baltimore, MD, 21218, USA
- Joshua Shubert
- Johns Hopkins University, Department of Electrical and Computer Engineering, Baltimore, MD, 21218, USA
6. Nadeau C, Krupa A, Petr J, Barillot C. Moments-Based Ultrasound Visual Servoing: From a Mono- to Multiplane Approach. IEEE Trans Robot 2016. [DOI: 10.1109/tro.2016.2604482]
7. Şen HT, Cheng A, Ding K, Boctor E, Wong J, Iordachita I, Kazanzides P. Cooperative Control with Ultrasound Guidance for Radiation Therapy. Front Robot AI 2016. [DOI: 10.3389/frobt.2016.00049]
9. Nadeau C, Krupa A. Intensity-Based Ultrasound Visual Servoing: Modeling and Validation With 2-D and 3-D Probes. IEEE Trans Robot 2013. [DOI: 10.1109/tro.2013.2256690]
10. Priester AM, Natarajan S, Culjat MO. Robotic ultrasound systems in medicine. IEEE Trans Ultrason Ferroelectr Freq Control 2013; 60:507-523. [PMID: 23475917] [DOI: 10.1109/tuffc.2013.2593]
Abstract
Robotic ultrasound (RUS) can be defined as the combination of ultrasound imaging with a robotic system in medical interventions. With their potential for high precision, dexterity, and repeatability, robots are often uniquely suited for ultrasound integration. Although the field is relatively young, it has already generated a multitude of robotic systems for application in dozens of medical procedures. This paper reviews the robotic ultrasound systems that have been developed over the past two decades and describes their potential impact on modern medicine. The RUS projects reviewed include extracorporeal devices, needle guidance systems, and intraoperative systems.
Affiliation(s)
- Alan M Priester
- Biomedical Engineering Interdepartmental Program and the Center for Advanced Surgical and Interventional Technology, University of California, Los Angeles, Los Angeles, CA, USA.
11. Moustris GP, Hiridis SC, Deliparaschos KM, Konstantinidis KM. Evolution of autonomous and semi-autonomous robotic surgical systems: a review of the literature. Int J Med Robot 2011; 7:375-92. [DOI: 10.1002/rcs.408]
12. Mebarki R, Krupa A, Chaumette F. 2-D Ultrasound Probe Complete Guidance by Visual Servoing Using Image Moments. IEEE Trans Robot 2010. [DOI: 10.1109/tro.2010.2042533]