Purpose: The objective of this study is to evaluate the ability of an augmented reality (AR) system to improve guidance, accuracy, and visualization during the subxiphoid approach for epicardial ablation. Approach: An AR application was developed to project real-time needle trajectories and patient-specific 3D organs using the HoloLens 2. In addition, needle tracking was implemented to offer real-time feedback to the operator, facilitating needle navigation. The AR application was evaluated in three experiments: examining overlay accuracy, assessing puncture accuracy, and performing a pre-clinical evaluation on a phantom. Results: The overlay accuracy of the AR system was 2.36±2.04 mm and the puncture accuracy was 1.02±2.41 mm. In the pre-clinical evaluation on the phantom, needle puncture with AR guidance yielded an error of 7.43±2.73 mm, whereas needle puncture without AR guidance yielded 22.62±9.37 mm. Conclusions: The AR platform has the potential to enhance the accuracy of percutaneous epicardial access for mapping and ablation of cardiac arrhythmias, thereby reducing complications and improving patient outcomes. The significance of this study lies in the potential of AR guidance to enhance both the accuracy and the safety of percutaneous epicardial access.
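The real-time feedback mentioned above amounts to comparing the tracked needle pose against the planned trajectory. The following is only a minimal sketch of such a check, not the authors' implementation; it assumes the tracked needle tip and axis have already been expressed in the same coordinate frame as the planned entry and target points.

```python
import numpy as np

def trajectory_feedback(tip, axis, entry, target):
    """Return (lateral distance from the planned line in mm, angular deviation in degrees)."""
    d = target - entry
    d = d / np.linalg.norm(d)                              # planned direction (unit vector)
    v = tip - entry
    lateral = np.linalg.norm(v - np.dot(v, d) * d)         # point-to-line distance
    a = axis / np.linalg.norm(axis)
    angle = np.degrees(np.arccos(np.clip(abs(np.dot(a, d)), 0.0, 1.0)))
    return lateral, angle

# Example with made-up coordinates (mm): planned path along z, tip slightly off-axis.
entry  = np.array([0.0, 0.0, 0.0])
target = np.array([0.0, 0.0, 80.0])
tip    = np.array([1.5, -0.8, 35.0])
axis   = np.array([0.02, -0.01, 1.0])
print(trajectory_feedback(tip, axis, entry, target))
```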
Percutaneous epicardial access for epicardial ablation and mapping of cardiac arrhythmias is being performed increasingly often. Unfortunately, complications such as injury to surrounding structures have been reported. Despite current imaging techniques, it remains difficult to guarantee sufficient ablation accuracy. Head-mounted display (HMD) augmented reality (AR) overlay and guidance have the potential to reduce the risk of complications. The objective of this study was to evaluate the accuracy and performance of an AR-guided epicardial puncture for catheter ablation of ventricular tachycardia. An AR software tool was designed to render real-time needle trajectories and patient-specific 3D organs. Registration of the preoperative data is realized by attaching four AR patterns to the skin of the patient, and needle tracking by attaching one AR pattern to the end of the needle's base. The ideal trajectory through the pericardial space was planned, and the patient-specific organs were segmented, on preoperative CT. The application's accuracy was evaluated in a phantom study in which seven operators performed needle punctures with and without the AR system. Placement errors were measured on postprocedural CT. With the proposed AR-based guidance, postprocedural CT revealed an error of 3.67±2.78 mm at the puncture site; at the epicardial interface, the error increased to 7.78±2.36 mm. The angle of the actual trajectory deviated on average 4.82±1.48° from the planned trajectory. The execution time was on average 34.0±25.1 s, so AR guidance introduced no significant delay while achieving an overall superior performance compared with puncturing without AR guidance. The proposed AR platform has the potential to facilitate percutaneous epicardial access for epicardial ablation and mapping of cardiac arrhythmias by improving needle insertion accuracy.
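Registering preoperative data to skin-attached fiducials, as described above, reduces to a rigid point-based registration once the marker centroids are known in both spaces. A minimal sketch of the standard SVD (Kabsch) solution follows; the data and the assumption that the four fiducials have already been located in both the CT and the headset frame are illustrative, not the paper's pipeline.

```python
import numpy as np

def rigid_register(ct_points, ar_points):
    """Return R (3x3) and t (3,) mapping CT coordinates into the AR/headset frame."""
    ct_c, ar_c = ct_points.mean(axis=0), ar_points.mean(axis=0)
    H = (ct_points - ct_c).T @ (ar_points - ar_c)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # guard against reflection
    R = Vt.T @ D @ U.T
    t = ar_c - R @ ct_c
    return R, t

# Synthetic check: four skin fiducials in CT space (mm) and the same fiducials
# as "tracked" by the headset after a known rotation and translation.
ct = np.array([[10.0, 0, 0], [0, 10, 0], [-10, 0, 0], [0, -10, 5]])
theta = np.deg2rad(20)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0.0,            0.0,           1]])
t_true = np.array([100.0, 50.0, -20.0])
ar = ct @ R_true.T + t_true
R, t = rigid_register(ct, ar)
fre = np.linalg.norm(ct @ R.T + t - ar, axis=1).mean()   # fiducial registration error
print(fre)
```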
Automated, systematic scoring of breast glandularity on thoracic CT examinations performed for another clinical reason could aid in detecting postmenopausal women with an increased breast cancer risk. We propose a novel method that combines automated deep learning-based breast segmentation from thoracic CT with computation of breast glandularity based on radiodensity and volumetric breast density. Reasonable segmentation Dice scores were obtained, as well as a very strong correlation between the risk measures computed on the ground truth and those computed with the proposed approach. Hence, the proposed method can offer reliable breast cancer risk measures with limited additional workload for the radiologist.
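Given a breast segmentation mask and the CT volume in Hounsfield units, radiodensity- and volume-based glandularity surrogates can be computed directly. The sketch below is only an illustration of that step; the HU threshold separating fibroglandular from fatty tissue is an assumed value for demonstration, not the calibrated one used in the study.

```python
import numpy as np

def glandularity_measures(hu_volume, breast_mask, voxel_volume_mm3, fg_threshold_hu=-30):
    """Mean breast radiodensity (HU), volumetric breast density, dense volume (ml)."""
    breast_hu = hu_volume[breast_mask > 0]
    mean_radiodensity = float(breast_hu.mean())
    fibroglandular = breast_hu > fg_threshold_hu          # voxels counted as dense tissue
    vbd = float(fibroglandular.mean())                    # volumetric breast density (fraction)
    dense_volume_ml = fibroglandular.sum() * voxel_volume_mm3 / 1000.0
    return mean_radiodensity, vbd, dense_volume_ml

# Tiny synthetic example (random HU values, rectangular "breast" mask):
hu = np.random.default_rng(0).normal(-60, 40, size=(4, 64, 64))
mask = np.zeros_like(hu, dtype=np.uint8)
mask[:, 16:48, 16:48] = 1
print(glandularity_measures(hu, mask, voxel_volume_mm3=0.7 * 0.7 * 3.0))
```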
VT ablations could benefit from dynamic 3D (4D) left ventricle (LV) visualization as a roadmap for anatomy-guided procedures. We developed a registration-based method that combines information from several cardiac phases to filter out noise and artifacts in low-dose 3D rotational angiography (3DRA) images. This also enables the generation of accurate multi-phase surface models by semi-automatic segmentation (SAS). The method uses B-spline non-rigid inter-phase registration (IPR) and subsequent averaging of the registered 3DRA images of four cardiac phases, acquired with a slow atrial pacing protocol, and was validated on data from five porcine experiments. IPR parameter settings were optimized against manual delineations of the LVs using a composed similarity score (Q) that depends on the Dice coefficient, root-mean-square distance (RMSD), Hausdorff distance (HD), and the percentage of inter-surface distances ≤3 mm and ≤4 mm; the latter are clinically acceptable error cut-off values. Validation was performed after SAS for varying voxel intensity thresholds (ISO), by comparing models with and without prior use of IPR. Distances to the manual delineations at the optimal ISO were reduced to ≤3 mm for 95.6±2.7% and to ≤4 mm for 97.1±2.0% of the model surfaces. Improved quality was proven by a significant mean Q increase irrespective of ISO (7.6% at the optimal ISO; 95% CI 4.6–10.5%, p<0.0001). The quality improvement was more pronounced at suboptimal ISO values. Significant (p<0.0001) differences were also noted in HD (−20.5%; 95% CI −12.1% to −29.0%), RMSD (−28.3%; 95% CI −21.7% to −35.0%), and Dice (1.7%; 95% CI 0.9% to 2.6%). Generating 4D LV models proved feasible, with sufficient accuracy for clinical applications, opening the perspective of more accurate overlay and guidance during ablation in locations with high degrees of movement.
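The validation metrics named above can be computed as shown in the sketch below. The exact weighting inside the composed score Q is not specified here, so the combination at the end is only an equal-weight placeholder; the symmetric surface distances are assumed to be precomputed per surface vertex, in mm.

```python
import numpy as np

def dice(a, b):
    """Dice overlap of two binary volumes."""
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def surface_metrics(dists_mm):
    """RMSD, Hausdorff distance, and % of surface distances within 3 mm / 4 mm."""
    return {"rmsd": float(np.sqrt((dists_mm ** 2).mean())),
            "hd": float(dists_mm.max()),
            "pct_le3": float((dists_mm <= 3).mean() * 100),
            "pct_le4": float((dists_mm <= 4).mean() * 100)}

def composed_q(dice_val, m):
    """Placeholder composition with equal weights; the paper's exact formula differs."""
    return (dice_val + m["pct_le3"] / 100 + m["pct_le4"] / 100
            + 1.0 / (1.0 + m["rmsd"]) + 1.0 / (1.0 + m["hd"])) / 5.0

# Illustrative surface-distance sample:
d = np.abs(np.random.default_rng(0).normal(0.0, 2.0, size=5000))
print(surface_metrics(d))
```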
Cardiac rotational angiography (RA) is well suited for 3D cardiac imaging during catheter-based interventions, but it has remained limited to static images or has required a high patient radiation dose. We present a new prospective imaging technique that is capable of imaging the dynamics of the cardiac cavities in a single C-arm run during the intervention with a relatively low dose. By combining slow atrial pacing, to obtain a stable heart rhythm, with a single C-arm rotation imaged at a regular interval, a prospective 4D rotational angiography (4DRA) acquisition is established. The pacing interval and imaging frame rate can be adapted such that a single cardiac phase is imaged multiple times and a motion-free state is imaged from different, equiangular positions. A practical implementation of this technique was realized in which the cardiac cavities are imaged while pacing at 105 bpm (574 ms) and imaging at approximately 15 fps. A number of animal experiments were conducted in which the technique was applied and MR imaging was performed subsequently. A quantitative comparison was made by manual contouring of the left ventricle in the RA and MR images of both the end-systolic and end-diastolic phases. Reconstructed images of the individual cardiac phases showed all four chambers and the important vessels in spite of substantial image noise. The 4DRA-to-MR absolute surface distance error amounted to 2.8 ± 0.7 mm, which is acceptable, and no systematic difference could be identified. Finally, the effective dose of a clinical protocol with 381 images is expected to be lower than that of current retrospectively gated RA protocols.
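The bookkeeping behind the prospective acquisition can be made concrete: with a fixed pacing interval and a regular frame interval, every projection receives a cardiac phase and a C-arm angle, and projections falling in the same phase bin are the ones reconstructed together. The timing values below are taken from the abstract; the rotation span and the number of phase bins are illustrative assumptions.

```python
import numpy as np

pacing_ms = 574.0           # slow atrial pacing (~105 bpm), from the abstract
frame_ms  = 1000.0 / 15     # ~15 fps, from the abstract
n_frames  = 381             # projections in the clinical protocol, from the abstract
rot_span  = 200.0           # degrees covered by the C-arm run (assumed)
n_bins    = 8               # number of reconstructed cardiac phases (assumed)

t     = np.arange(n_frames) * frame_ms
phase = (t % pacing_ms) / pacing_ms          # cardiac phase in [0, 1)
angle = t / t[-1] * rot_span                 # C-arm angle of each projection
bins  = (phase * n_bins).astype(int)

for b in range(n_bins):
    a = np.sort(angle[bins == b])
    print(f"phase bin {b}: {len(a)} projections, mean angular gap {np.diff(a).mean():.1f} deg")
```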
KEYWORDS: 3D modeling, Angiography, 3D image processing, Data modeling, Statistical analysis, Detection and tracking algorithms, 3D acquisition, Veins, Atrial fibrillation, Statistical modeling
Catheter-based radio-frequency ablation is used as an invasive treatment of atrial fibrillation. This procedure is often guided by 3D anatomical models obtained from CT, MRI, or rotational angiography. During the intervention the operator accurately guides the catheter to prespecified target ablation lines. The planning stage, however, can be time-consuming and operator-dependent, which is suboptimal from both a cost and a health perspective. Therefore, we present a novel statistical model-based algorithm for locating ablation targets in 3D rotational angiography images. Based on a training data set of 20 patients, consisting of 3D rotational angiography images with 30 manually indicated ablation points, a statistical local appearance and shape model is built. The local appearance model is based on local image descriptors that capture the intensity patterns around each ablation point. The local shape model is constructed by embedding the ablation points in an undirected graph and imposing that each ablation point only interacts with its neighbors. The ablation points are identified on a new 3D rotational angiography image by proposing a set of candidate locations for each ablation point, thereby converting the task into a labeling problem. The algorithm is validated with a leave-one-out approach on the training data set, by computing the distance between the ablation lines obtained by the algorithm and the manually identified ablation points. The distance error is 3.8±2.9 mm. As the ablation lesion size is around 5–7 mm, automated planning of ablation targets by the presented approach is sufficiently accurate.
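A toy illustration of the labeling formulation follows; it is not the paper's model or optimizer. Each ablation point gets a small set of candidate 3D locations, a unary cost standing in for the local appearance model, and a pairwise cost penalising deviation of neighbour distances from the shape model. For a chain-shaped neighbourhood the minimum-cost labeling can be found by dynamic programming.

```python
import numpy as np

def label_chain(candidates, unary, mean_dist, w=1.0):
    """candidates: list of (k_i, 3) arrays; unary: list of (k_i,) costs;
    mean_dist: expected neighbour distances (length n-1). Returns one label per point."""
    n = len(candidates)
    cost, back = [unary[0].copy()], []
    for i in range(1, n):
        d = np.linalg.norm(candidates[i][None, :, :] - candidates[i - 1][:, None, :], axis=2)
        pair = w * (d - mean_dist[i - 1]) ** 2                 # shape-model penalty
        total = cost[-1][:, None] + pair + unary[i][None, :]   # (k_prev, k_i)
        back.append(total.argmin(axis=0))
        cost.append(total.min(axis=0))
    labels = [int(cost[-1].argmin())]
    for i in range(n - 2, -1, -1):                             # backtrack best predecessors
        labels.append(int(back[i][labels[-1]]))
    return labels[::-1]

# Illustrative use: 5 ablation points with 4 candidate locations each.
rng = np.random.default_rng(1)
cands = [rng.normal(i * 10.0, 1.0, size=(4, 3)) for i in range(5)]
unary = [rng.random(4) for _ in range(5)]
print(label_chain(cands, unary, mean_dist=[10.0 * np.sqrt(3)] * 4))
```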
Introducing a patient-specific model in an augmented reality environment for cardiac thermo-ablation procedures requires a calibration strategy that is sufficiently accurate and easily applicable without compromising the image quality that is conventionally expected. We present a two-step calibration method and registration strategy that satisfy these requirements. Relying only on catheter electrode correspondences and knowledge of the rotation between the fluoroscopes, the method retrieves both the parameters of the fluoroscopic devices, including non-linear distortion effects, and the electrode positions. Registration can subsequently be performed by visually matching the preoperative model to the fluoroscopic images inside the augmented reality framework. Simulations under real-life conditions and validation on real images show an accuracy of 5 pixels, which is equivalent to <2 mm in world coordinates. Validation experiments on real images with a calibration jig as gold standard show a mean reconstruction accuracy of 1.33 mm and a mean image-plane error of 5.48 pixels.
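One simplified building block of such a pipeline, ignoring the non-linear distortion the method also estimates, is the linear triangulation of an electrode from two calibrated fluoroscopic views together with the reprojection error used to judge calibration quality. The projection matrices below are purely illustrative.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """DLT triangulation of one 3D point from two 3x4 projection matrices."""
    A = np.vstack([uv1[0] * P1[2] - P1[0],
                   uv1[1] * P1[2] - P1[1],
                   uv2[0] * P2[2] - P2[0],
                   uv2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def reprojection_error(P, X, uv):
    """Pixel distance between the projected 3D point and the measured image point."""
    x = P @ np.append(X, 1.0)
    return np.linalg.norm(x[:2] / x[2] - uv)

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two illustrative pinhole views 90 degrees apart (no distortion):
K = np.array([[1000.0, 0, 256], [0, 1000.0, 256], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), [[0.0], [0.0], [500.0]]])
P2 = K @ np.hstack([np.array([[0.0, 0, 1], [0, 1, 0], [-1, 0, 0]]), [[0.0], [0.0], [500.0]]])
X_true = np.array([10.0, -5.0, 20.0])
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
print(X_est, reprojection_error(P1, X_est, project(P1, X_true)))
```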
The treatment of atrial tachycardia by radio-frequency ablation is a complex, minimally invasive procedure. In most cases the surgeon uses fluoroscopic imaging to guide catheters into the atria. After recording activation potentials from the electrodes on the catheter, which has to be done for different catheter positions, the physiologist has to fuse the activation times derived from these potentials with the fluoroscopic images and mentally extract from them a 3D anatomical model of the atrium. This model provides the necessary information to locate the ablation regions.
To alleviate the problem of mentally reconstructing these different sources of information, we propose a virtual environment that can visualize the electrode information on a patient-specific model of the atria. This 3D atrium surface model is derived from preoperatively acquired MR images. Within the system the model is visualized in three different ways: two views correspond to the two fluoroscopic images, which are shown registered in the background, while the third can be freely manipulated by the physiologist. The system allows the physiologist to annotate measurements onto the 3D model. Since the heart is not a static organ, tools are provided to modify previous annotations interactively. The information contained in the measurements can then be spread across the heart by extrapolation and interpolation and subsequently visualized by color coding the surface model.
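The interpolation and colour-coding step described above can be sketched as follows; the actual scheme used by the system is not specified here, so inverse-distance weighting and a simple two-colour ramp are used purely as an illustration.

```python
import numpy as np

def interpolate_activation(vertices, sample_points, sample_times, power=2.0):
    """Spread sparse activation-time measurements over all mesh vertices (IDW)."""
    d = np.linalg.norm(vertices[:, None, :] - sample_points[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-6) ** power
    return (w * sample_times).sum(axis=1) / w.sum(axis=1)

def to_colour(times):
    """Map activation times onto a simple blue (early) -> red (late) ramp, RGB in [0, 1]."""
    span = max(times.max() - times.min(), 1e-6)
    s = (times - times.min()) / span
    return np.stack([s, np.zeros_like(s), 1.0 - s], axis=1)

# Illustrative use: 100 mesh vertices, 10 annotated measurements in ms.
verts = np.random.default_rng(2).normal(size=(100, 3)) * 30
samples, times = verts[::10], np.linspace(0.0, 120.0, 10)
colours = to_colour(interpolate_activation(verts, samples, times))
```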
Preliminary clinical evaluation on 30 patients indicates that the combined representation of the activation times and the heart model provides a more thorough and accurate insight into the possible causes of, and solutions to, the tachycardia than would be obtained using solely the fluoroscopic images and mental reconstruction.
Unlike other tachycardia visualization software, our approach starts from a patient-specific surface model, which in itself provides extra insight into the problem. Furthermore, the system can be used very interactively by the physiologist as a kind of 3D sketchbook in which different measurements and tissue types can be entered, deleted, and modified. Finally, at any stage of the procedure the system can visualize a model containing all information at hand.
In this paper we present a system for representing electrocardiographic information that allows the physiologist to mark measurements, which can then be visualized on a patient-specific atrium model by color coding. A first clinical evaluation indicates that this approach offers a considerable amount of added value.
In the field of image-guided surgery, registration of pre- and intra-operative images is an important, yet sometimes cumbersome, procedure. Visual matching, consisting of a manual alignment of such images, can improve the efficiency of registration. In this paper we investigate the accuracy of a visual matching system. A number of tests, including mono and stereo visualization of real and virtual objects, were set up. Seven test subjects were asked to register pre- and intra-operative images while accuracy and registration time were measured. The error analysis led to the conclusion that one can accurately match a virtual phantom with a real one in the directions perpendicular to the viewing direction. The depth accuracy, however, is insufficient for registration purposes, even when using stereo perception. We conclude that visual matching can be complementary to more automatic registration methods, or even a valuable alternative in applications where the viewing directions are nearly perpendicular, as in biplane fluoroscopy.
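The distinction between in-plane and depth accuracy drawn in that analysis corresponds to splitting the alignment error into components perpendicular to and along the viewing direction. The following is only a small illustrative sketch of that decomposition, with made-up vectors in mm.

```python
import numpy as np

def decompose_error(registered_pos, true_pos, view_dir):
    """Split an alignment error into in-plane and depth components relative to the view."""
    e = registered_pos - true_pos
    v = view_dir / np.linalg.norm(view_dir)
    depth = float(np.dot(e, v))                       # along the viewing direction
    in_plane = float(np.linalg.norm(e - depth * v))   # perpendicular to it
    return in_plane, abs(depth)

print(decompose_error(np.array([1.0, 0.5, 6.0]),
                      np.array([0.0, 0.0, 0.0]),
                      np.array([0.0, 0.0, 1.0])))
```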