Augmented reality (AR) image tracking may be used in AR-guided surgical applications for real-time guidance and quantitative feedback. Because AR-guided applications offer broader accessibility than the specialized systems used in traditional surgical image guidance, we evaluated the measurement error of monocular AR image tracking against current gold-standard infrared optical and electromagnetic (EM) tracking. A measurement stylus was designed and 3D printed, allowing for monocular AR image tracking using a Logitech C920 camera, infrared optical tracking with a Northern Digital Inc. (NDI) Vicra, and EM tracking with an NDI Aurora through corresponding sensor attachments. A measurement phantom was also designed and 3D printed, consisting of 3 measurement planes with 81 measurement points in each plane, totaling 243 measurement points across a 16 cm × 16 cm × 18 cm measurement volume. Pivot calibration was performed using random sample consensus (RANSAC) sphere fitting to calculate the offset between the sensor attachment and the stylus tip for each tracking system. Measurements of the stylus tip were collected across the measurement phantom for each tracking system. Each system's fiducial registration error was quantified by rigidly registering the collected tip positions to the designed phantom points from CAD. Fiducial registration errors were 1.19 mm, 0.59 mm, and 0.51 mm for monocular AR, infrared optical, and EM tracking, respectively. Monocular AR image tracking presents a cost-effective and accessible solution for surgical guidance applications. Errors close to 1 mm may be suitable for scenarios such as surgical simulators in competency-based education and AR-based planning.
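The abstract does not give implementation details, but the fiducial registration error (FRE) computation it describes is conventionally done with a least-squares rigid (Kabsch/Umeyama) alignment between the measured tip positions and the CAD-defined phantom points, with FRE taken as the RMS residual. A minimal sketch, assuming NumPy and point correspondences already established (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid registration (Kabsch) mapping src -> dst.

    src, dst: (N, 3) arrays of corresponding 3D points (e.g., measured
    stylus-tip positions and CAD phantom points, in mm).
    Returns rotation R (3x3), translation t (3,), and the fiducial
    registration error (RMS residual after alignment).
    """
    # Center both point sets on their centroids.
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    # SVD of the cross-covariance gives the optimal rotation.
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    # FRE: RMS of per-point residuals after applying the transform.
    residuals = (src @ R.T + t) - dst
    fre = np.sqrt((residuals ** 2).sum(axis=1).mean())
    return R, t, fre
```

With noise-free correspondences the FRE is numerically zero; with real tracking data it reflects the combined tracking and calibration error, as in the 1.19/0.59/0.51 mm figures reported above.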
Emilie Chamma, Jimmy Qiu, Liis Lindvere-Teene, Kristina Blackmore, Safa Majeed, Robert Weersink, Colleen Dickie, Anthony Griffin, Jay Wunder, Peter Ferguson, Ralph DaCosta
Standard clinical management of extremity soft tissue sarcomas includes surgery with radiation therapy. Wound complications (WCs) arising from treatment may occur due to bacterial infection and tissue breakdown. The ability to detect changes in these parameters during treatment may lead to earlier interventions that mitigate WCs. We describe the use of a new system composed of an autofluorescence imaging device and an optical three-dimensional tracking system to detect and coregister the presence of bacteria with radiation doses. The imaging device visualized erythema using white light and detected bacterial autofluorescence using 405-nm excitation light. Its position was tracked relative to the patient using IR reflective spheres and registration to the computed tomography coordinates. Image coregistration software was developed to spatially overlay radiation treatment plans and dose distributions on the white light and autofluorescence images of the surgical site. We describe the technology, its use in the operating room, and standard operating procedures, as well as demonstrate technical feasibility and safety intraoperatively. This new clinical tool may help identify patients at greater risk of developing WCs and investigate correlations between radiation dose, skin response, and changes in bacterial load as biomarkers associated with WCs.
A prototype mobile C-arm for cone-beam CT (CBCT) has been translated to a prospective clinical trial in head and neck surgery. The flat-panel CBCT C-arm was developed in collaboration with Siemens Healthcare, and demonstrates both sub-mm spatial resolution and soft-tissue visibility at low radiation dose (e.g., <1/5th of a typical diagnostic head CT). CBCT images are available ~15 seconds after scan completion (~1 min acquisition) and reviewed at bedside using custom 3D visualization software based on the open-source Image-Guided Surgery Toolkit (IGSTK). The CBCT C-arm has been successfully deployed in 15 head and neck cases and streamlined into the surgical environment using human factors engineering methods and expert feedback from surgeons, nurses, and anesthetists. Intraoperative imaging is implemented in a manner that maintains operating field sterility, reduces image artifacts (e.g., carbon fiber OR table), and minimizes radiation exposure. Image reviews conducted with surgical staff indicate bony detail and soft-tissue visualization sufficient for intraoperative guidance, with additional artifact management (e.g., metal, scatter) promising further improvements. Clinical trial deployment suggests a role for intraoperative CBCT in guiding complex head and neck surgical tasks, including planning mandible and maxilla resection margins, guiding subcranial and endonasal approaches to skull base tumours, and verifying maxillofacial reconstruction alignment. Ongoing translational research into complementary image-guidance subsystems includes novel methods for real-time tool tracking, fusion of endoscopic video and CBCT, and deformable registration of preoperative volumes and planning contours with intraoperative CBCT.