UP-137 A synchronous motion-tracking and video-capture system for objective assessment and training in ureteroscopy
Thursday June 27, 2019 from TBD
Presenter

Jessica Trac, Canada

Medical Student

University of Toronto

Abstract

A synchronous motion-tracking and video-capture system for objective assessment and training in ureteroscopy

Jessica Trac1, Brian Carrillo5, Monica Farcas2,3,4.

1Department of Medicine, University of Toronto, Toronto, ON, Canada; 2Department of Surgery, Division of Urology, St. Michael’s Hospital, Toronto, ON, Canada; 3Li Ka Shing Knowledge Institute, St. Michael's Hospital, Toronto, ON, Canada; 4Department of Surgery, University of Toronto, Toronto, ON, Canada; 5N/A, N/A, Toronto, ON, Canada

Introduction: Hand/instrument motion tracking in surgical simulation provides valuable data for improving psychomotor skills and can serve as a formative evaluation tool. Although motion analysis has been well studied in laparoscopic surgery, it remains poorly studied in endoscopic surgery, and there are essentially no studies of motion tracking in flexible ureteroscopy (fURS), a procedure that demands significant hand dexterity. Our goal was to design an open-source, synchronized motion-tracking and video-capture system for flexible ureteroscopy, with the aim of providing trainee feedback and collecting metrics for use in objective skills assessments and examinations.

Methods: Position and orientation data of the ureteroscope handle and lever (used to manipulate the tip) were collected with a motion-tracking system (Polhemus™), off-the-shelf inertial measurement units (IMUs) and optical sensors. Video of the surgeon's hands was captured with a Raspberry Pi camera, and video of the scope view was collected from the video tower with an off-the-shelf USB video grabber. Open-source Python software was written to control and integrate the sensors and cameras on a Raspberry Pi 4.
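The core software task described above is stamping every sensor and camera sample against a common clock so the streams can be aligned offline. A minimal sketch of such a logging loop is shown below; the class name, method names, and the stubbed `read_fn` callables are illustrative assumptions, not the authors' actual implementation, which would read the Polhemus™ tracker, IMUs, and cameras instead.

```python
# Hypothetical sketch of synchronized multi-sensor logging on a Raspberry Pi.
# Each sensor is polled on its own thread, and every sample is stamped with
# a shared monotonic clock so the streams can be aligned in post-processing.
import threading
import time
import queue

class SyncLogger:
    """Polls several sensors concurrently, tagging each sample with the
    elapsed time since a single shared start point."""

    def __init__(self):
        self.records = queue.Queue()          # (sensor_name, timestamp, sample)
        self._stop = threading.Event()
        self._threads = []
        self._t0 = time.monotonic()           # shared reference clock

    def add_sensor(self, name, read_fn, rate_hz):
        """Register a sensor; read_fn() returns one sample (stub here)."""
        period = 1.0 / rate_hz
        def worker():
            while not self._stop.is_set():
                sample = read_fn()
                # Timestamp relative to the shared start time.
                self.records.put((name, time.monotonic() - self._t0, sample))
                time.sleep(period)
        self._threads.append(threading.Thread(target=worker, daemon=True))

    def run(self, duration_s):
        for t in self._threads:
            t.start()
        time.sleep(duration_s)
        self._stop.set()
        for t in self._threads:
            t.join()
        return list(self.records.queue)

# Example with stubbed reads standing in for real hardware drivers:
logger = SyncLogger()
logger.add_sensor("imu", lambda: {"accel": (0.0, 0.0, 9.8)}, rate_hz=50)
logger.add_sensor("tracker", lambda: {"pos": (0.0, 0.0, 0.0)}, rate_hz=30)
records = logger.run(duration_s=0.5)
```

Because all timestamps derive from one `time.monotonic()` origin, video frames and tracker samples captured by different threads can later be matched by nearest timestamp, which is the essence of the synchronization the system requires.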

Results: A preliminary prototype of the system was assembled with the Polhemus™ sensor, IMUs and Raspberry Pi. A 10-minute trial demonstrated successful, synchronized collection of handle and lever position and orientation data alongside video of the hands. Average CPU utilization rose from an 8% baseline to 33% during data collection.

Conclusions: We are building an open-source data collection system capable of gathering synchronized motion-tracking and video data in fURS. The resulting data pool can be used by surgeons and engineers to improve and standardize objective assessment and simulation training for endoscopic surgery. Our next steps include integration of optical sensors for insertion-point tracking and a sensor for scope-tip tracking.


© 2022 CUA 74th Annual Meeting