Gershov S, Braunold D, Spektor R, Ioscovich A, Raz A, Laufer S. Automating medical simulations. J Biomed Inform 2023;144:104446. [PMID: 37467836] [DOI: 10.1016/j.jbi.2023.104446]
[Received: 12/19/2022] [Revised: 07/08/2023] [Accepted: 07/16/2023] [Indexed: 07/21/2023]
Abstract
OBJECTIVE
This study explores speech as an alternative modality for human activity recognition (HAR) in medical settings. Current HAR technologies rely on video and sensor modalities, which are often unsuitable for the medical environment due to interference from medical personnel, privacy concerns, and environmental limitations. We therefore propose an end-to-end, fully automatic objective checklist validation framework that uses medical personnel's uttered speech to recognize the executed actions and document them in a checklist format.
METHODS
Our framework records, processes, and analyzes medical personnel's speech to extract valuable information about performed actions. This information is then used to fill the corresponding rubrics in the checklist automatically.
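The abstract does not detail the mapping from speech to checklist rubrics, but the core idea can be sketched as matching recognized utterances against trigger phrases for each rubric. The sketch below is a minimal illustration of that idea; the rubric names, trigger phrases, and the assumption that an ASR transcript is available upstream are all hypothetical, not taken from the paper.

```python
# Hypothetical checklist-filling step: given a speech transcript (ASR output
# is assumed to exist upstream), mark each rubric whose trigger phrases
# appear in the uttered speech. All rubrics/phrases here are illustrative.

CHECKLIST = {
    "confirm_patient_id": ["confirm patient name", "patient identification"],
    "announce_allergy_check": ["any allergies", "allergy check"],
    "call_for_help": ["call anesthesia", "need assistance"],
}

def fill_checklist(transcript: str) -> dict[str, bool]:
    """Return a rubric -> performed flag mapping based on phrase matches."""
    text = transcript.lower()
    return {
        rubric: any(phrase in text for phrase in phrases)
        for rubric, phrases in CHECKLIST.items()
    }

transcript = "Please confirm patient name and date of birth. Any allergies?"
print(fill_checklist(transcript))
```

A production system would replace the exact-phrase matching with a trained language model, but the output shape (one boolean per rubric) is the same.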
RESULTS
Our approach to activity recognition outperformed an online expert examiner, achieving an F1 score of 0.869 on verbal tasks and an intraclass correlation coefficient (ICC) of 0.822 with an offline examiner. The framework also successfully identified communication failures and medical errors made by physicians and nurses.
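For readers unfamiliar with the reported metric: the F1 score is the harmonic mean of precision and recall over the recognized actions. The snippet below shows the standard computation; the true/false positive and negative counts are illustrative, not the study's data.

```python
# Standard F1 computation from confusion-matrix counts.
# The counts below are illustrative only.

def f1_score(tp: int, fp: int, fn: int) -> float:
    """Harmonic mean of precision and recall."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Example: 87 correctly recognized actions, 13 spurious, 13 missed.
print(round(f1_score(tp=87, fp=13, fn=13), 3))  # 0.87
```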
CONCLUSION
Implementing a speech-based framework in medical settings, such as the emergency room and operation room, holds promise for improving care delivery and enabling the development of automated assistive technologies in various medical domains. By leveraging speech as a modality for HAR, we can overcome the limitations of existing technologies and enhance workflow efficiency and patient safety.