1
Jin XB, Chen W, Ma HJ, Kong JL, Su TL, Bai YT. Parameter-Free State Estimation Based on Kalman Filter with Attention Learning for GPS Tracking in Autonomous Driving System. Sensors (Basel, Switzerland) 2023; 23:8650. [PMID: 37896741] [PMCID: PMC10610770] [DOI: 10.3390/s23208650] [Received: 08/18/2023] [Revised: 10/15/2023] [Accepted: 10/16/2023] [Indexed: 10/29/2023]
Abstract
GPS-based localization and tracking of maneuvering targets is a crucial aspect of autonomous driving and is widely used in navigation, transportation, autonomous vehicles, and other fields. The classical tracking approach employs a Kalman filter with precise system parameters to estimate the state. However, the complex motion of maneuvering targets and unknown sensor characteristics make the associated uncertainty difficult to model. Furthermore, GPS data often contain unknown colored noise, making accurate system parameters hard to obtain, which can degrade the performance of the classical methods. To address these issues, we present a Kalman-filter-based state estimation method that requires no predefined parameters but instead uses attention learning. A transformer encoder combined with a long short-term memory (LSTM) network extracts dynamic characteristics, and the system model parameters are estimated online with the expectation-maximization (EM) algorithm from the output of the attention learning module. Finally, the Kalman filter computes the dynamic state estimates using the learned system, dynamics, and measurement parameters. In experiments on simulated GPS data and the Geolife Beijing vehicle GPS trajectory dataset, our method outperformed both classical and purely model-free network estimation approaches in estimation accuracy, providing an effective solution for practical maneuvering-target tracking applications.
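For orientation, the Kalman filter that underlies this abstract can be sketched for a simple GPS tracking case. This is a generic constant-velocity predict/update cycle with hand-picked noise matrices, not the paper's attention-learning parameter estimator; the function name `kalman_step` and all numeric values are illustrative assumptions. The paper's contribution is precisely that `Q`, `R`, and the model parameters are learned online rather than fixed as below.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter."""
    # Predict: propagate state and covariance through the motion model.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: fuse the position measurement z.
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Constant-velocity model: state = [px, py, vx, vy]; GPS measures position only.
dt = 1.0
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)  # process noise, fixed here; learned online in the paper
R = 1.0 * np.eye(2)   # measurement noise, likewise learned in the paper

x, P = np.zeros(4), 10.0 * np.eye(4)
for z in [np.array([1.0, 0.5]), np.array([2.1, 1.1]), np.array([3.0, 1.4])]:
    x, P = kalman_step(x, P, z, F, H, Q, R)
```

Each measurement shrinks the state covariance `P` from its broad prior, which is the behavior the learned-parameter variant must preserve while adapting `Q` and `R` to the data.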
Affiliation(s)
- Xue-Bo Jin
  - Artificial Intelligence College, Beijing Technology and Business University, Beijing 100048, China; (X.-B.J.); (W.C.); (J.-L.K.); (T.-L.S.); (Y.-T.B.)
  - China Light Industry Key Laboratory of Industrial Internet and Big Data, Beijing Technology and Business University, Beijing 100048, China
- Wei Chen
  - Artificial Intelligence College, Beijing Technology and Business University, Beijing 100048, China; (X.-B.J.); (W.C.); (J.-L.K.); (T.-L.S.); (Y.-T.B.)
  - China Light Industry Key Laboratory of Industrial Internet and Big Data, Beijing Technology and Business University, Beijing 100048, China
- Hui-Jun Ma
  - Artificial Intelligence College, Beijing Technology and Business University, Beijing 100048, China; (X.-B.J.); (W.C.); (J.-L.K.); (T.-L.S.); (Y.-T.B.)
  - China Light Industry Key Laboratory of Industrial Internet and Big Data, Beijing Technology and Business University, Beijing 100048, China
- Jian-Lei Kong
  - Artificial Intelligence College, Beijing Technology and Business University, Beijing 100048, China; (X.-B.J.); (W.C.); (J.-L.K.); (T.-L.S.); (Y.-T.B.)
  - China Light Industry Key Laboratory of Industrial Internet and Big Data, Beijing Technology and Business University, Beijing 100048, China
- Ting-Li Su
  - Artificial Intelligence College, Beijing Technology and Business University, Beijing 100048, China; (X.-B.J.); (W.C.); (J.-L.K.); (T.-L.S.); (Y.-T.B.)
  - China Light Industry Key Laboratory of Industrial Internet and Big Data, Beijing Technology and Business University, Beijing 100048, China
- Yu-Ting Bai
  - Artificial Intelligence College, Beijing Technology and Business University, Beijing 100048, China; (X.-B.J.); (W.C.); (J.-L.K.); (T.-L.S.); (Y.-T.B.)
  - China Light Industry Key Laboratory of Industrial Internet and Big Data, Beijing Technology and Business University, Beijing 100048, China
2
Masoumi N, Rivaz H, Hacihaliloglu I, Ahmad MO, Reinertsen I, Xiao Y. The Big Bang of Deep Learning in Ultrasound-Guided Surgery: A Review. IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control 2023; 70:909-919. [PMID: 37028313] [DOI: 10.1109/tuffc.2023.3255843] [Indexed: 06/19/2023]
Abstract
Ultrasound (US) imaging is a paramount modality in many image-guided surgeries and percutaneous interventions, thanks to its high portability, temporal resolution, and cost-efficiency. However, owing to its imaging principles, US images are often noisy and difficult to interpret. Appropriate image processing can greatly enhance the applicability of the modality in clinical practice. Compared with classic iterative optimization and machine learning (ML) approaches, deep learning (DL) algorithms have shown strong performance, in terms of both accuracy and efficiency, for US processing. In this work, we conduct a comprehensive review of deep learning algorithms applied to US-guided interventions, summarize the current trends, and suggest future directions for the topic.
4
Baker C, Xochicale M, Lin FY, Mathews S, Joubert F, Shakir DI, Miles R, Mosse CA, Zhao T, Liang W, Kunpalin Y, Dromey B, Mistry T, Sebire NJ, Zhang E, Ourselin S, Beard PC, David AL, Desjardins AE, Vercauteren T, Xia W. Intraoperative Needle Tip Tracking with an Integrated Fibre-Optic Ultrasound Sensor. Sensors (Basel, Switzerland) 2022; 22:9035. [PMID: 36501738] [PMCID: PMC9739176] [DOI: 10.3390/s22239035] [Received: 10/19/2022] [Revised: 11/15/2022] [Accepted: 11/16/2022] [Indexed: 06/17/2023]
Abstract
Ultrasound is an essential tool for guiding many minimally invasive surgical and interventional procedures, where accurate placement of the interventional device is critical to avoid adverse events. Needle insertion procedures for anaesthesia, fetal medicine and tumour biopsy are commonly ultrasound-guided, and misplacement of the needle may lead to complications such as nerve damage, organ injury or pregnancy loss. Clear visibility of the needle tip is therefore critical, but is often precluded by tissue heterogeneities or specular reflections from the needle shaft. This paper presents the in vitro and ex vivo accuracy of a new, real-time ultrasound needle tip tracking system for guidance of fetal interventions. A fibre-optic Fabry-Pérot interferometer hydrophone is integrated into an intraoperative needle and used to localise the needle tip within a handheld ultrasound field. While previous related work was based on research ultrasound systems with bespoke transmission sequences, the new system, developed under the ISO 13485 Medical Devices quality standard, operates as an adjunct to a commercial ultrasound imaging system and therefore provides the image quality expected in the clinic, superimposing a cross-hair onto the ultrasound image at the needle tip position. Tracking accuracy was determined by translating the needle tip to 356 known positions in the ultrasound field of view in a tank of water, and by comparison to manual labelling of the needle position in B-mode US images during insertion into an ex vivo phantom. In water, the mean distance between tracked and true positions was 0.7 ± 0.4 mm with a mean repeatability of 0.3 ± 0.2 mm. In the tissue phantom, the mean distance between tracked and labelled positions was 1.1 ± 0.7 mm. Tracking performance was found to be independent of needle angle. The study demonstrates the performance and clinical compatibility of ultrasound needle tracking, an essential step towards a first-in-human study.
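The accuracy figures in this abstract (e.g. 0.7 ± 0.4 mm in water) are mean ± standard deviation of the Euclidean distances between tracked and reference tip positions. A minimal sketch of how such a metric is commonly computed is shown below; the function name `tracking_error_mm` and the toy positions are illustrative assumptions, not the authors' analysis code.

```python
import numpy as np

def tracking_error_mm(tracked, reference):
    """Mean and standard deviation of Euclidean distances between
    tracked and reference positions (N x 2 or N x 3 arrays, in mm)."""
    tracked = np.asarray(tracked, dtype=float)
    reference = np.asarray(reference, dtype=float)
    d = np.linalg.norm(tracked - reference, axis=1)
    return d.mean(), d.std()

# Toy example: three tracked tip positions vs. known ground truth (mm).
truth = np.array([[0.0, 0.0], [10.0, 0.0], [20.0, 5.0]])
tracked = truth + np.array([[0.5, 0.0], [-0.3, 0.4], [0.0, -0.6]])
mean_err, std_err = tracking_error_mm(tracked, truth)
```

In the paper the reference positions come from a translation stage (water tank) or from manual labelling of B-mode images (tissue phantom); the per-point distance computation is the same in either case.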
Affiliation(s)
- Christian Baker
  - School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Miguel Xochicale
  - School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Fang-Yu Lin
  - School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Sunish Mathews
  - Department of Medical Physics and Biomedical Engineering, University College London, Gower Street, London WC1E 6BT, UK
  - Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, UK
- Francois Joubert
  - School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Dzhoshkun I. Shakir
  - School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Richard Miles
  - School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Charles A. Mosse
  - Department of Medical Physics and Biomedical Engineering, University College London, Gower Street, London WC1E 6BT, UK
  - Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, UK
- Tianrui Zhao
  - School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Weidong Liang
  - School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Yada Kunpalin
  - Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, UK
  - Elizabeth Garrett Anderson Institute for Women’s Health, University College London, 74 Huntley Street, London WC1E 6AU, UK
- Brian Dromey
  - Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, UK
  - Elizabeth Garrett Anderson Institute for Women’s Health, University College London, 74 Huntley Street, London WC1E 6AU, UK
- Talisa Mistry
  - NIHR Great Ormond Street BRC and Institute of Child Health, University College London, 30 Guilford Street, London WC1N 1EH, UK
- Neil J. Sebire
  - NIHR Great Ormond Street BRC and Institute of Child Health, University College London, 30 Guilford Street, London WC1N 1EH, UK
- Edward Zhang
  - Department of Medical Physics and Biomedical Engineering, University College London, Gower Street, London WC1E 6BT, UK
  - Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, UK
- Sebastien Ourselin
  - School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Paul C. Beard
  - Department of Medical Physics and Biomedical Engineering, University College London, Gower Street, London WC1E 6BT, UK
  - Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, UK
- Anna L. David
  - Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, UK
  - Elizabeth Garrett Anderson Institute for Women’s Health, University College London, 74 Huntley Street, London WC1E 6AU, UK
- Adrien E. Desjardins
  - Department of Medical Physics and Biomedical Engineering, University College London, Gower Street, London WC1E 6BT, UK
  - Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, UK
- Tom Vercauteren
  - School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Wenfeng Xia
  - School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK