Recent studies have reported that deep learning techniques can achieve high performance in medical image analysis tasks such as computer-aided diagnosis (CADx). However, the diagnostic decisions of deep learning models are difficult to interpret due to their black-box nature. To increase confidence in these decisions, it is necessary to develop a deep neural network with an interpretable structure that can provide a reasonable explanation of its diagnostic decisions. In this study, a novel deep neural network has been devised to provide visual evidence for the diagnostic decisions of CADx. The proposed deep network is designed to include a visual interpreter that highlights important areas as visual evidence of the diagnostic decision. Based on the observation that radiologists usually make a diagnostic decision based on lesion characteristics (the margin and the shape of masses), the visual interpreter provides visual evidence related to the margin and the shape, respectively. To verify the effectiveness of the proposed method, experiments were conducted on mammogram datasets. Experimental results show that the proposed method provides more relevant areas as visual evidence than the conventional visualization method. These results imply that the proposed visual interpretation method could be a promising approach to overcoming the current limitations of deep learning for CADx.
In this study, a novel computer-aided diagnosis (CADx) framework is devised to investigate interpretability in classifying breast masses. Recently, deep learning technology has been successfully applied to medical image analysis, including CADx. Existing deep learning based CADx approaches, however, are limited in explaining their diagnostic decisions. In real clinical practice, clinical decisions should be accompanied by a reasonable explanation, so current deep learning approaches to CADx are limited in real-world deployment. In this paper, we investigate interpretability in CADx with the proposed interpretable CADx (ICADx) framework. The proposed framework is devised as a generative adversarial network, which consists of an interpretable diagnosis network and a synthetic lesion generative network that learn the relationship between malignancy and a standardized description (BI-RADS). The lesion generative network and the interpretable diagnosis network compete through adversarial learning so that both networks improve. The effectiveness of the proposed method was validated on a public mammogram database. Experimental results showed that the proposed ICADx framework could provide interpretability of masses as well as mass classification. This was mainly attributed to the fact that the proposed method was effectively trained to find the relationship between malignancy and interpretations via adversarial learning. These results imply that the proposed ICADx framework could be a promising approach to developing CADx systems.
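The adversarial setup described above can be sketched in miniature. This is an illustrative forward-pass sketch only, not the paper's architecture: the linear "generator" and two-headed "diagnosis net", the dimensions, and the loss combination are all assumptions. It shows the shape of the objective, in which the generator maps a BI-RADS-style description vector to a synthetic lesion patch while the diagnosis net predicts both a malignancy/realness score and the description back from the patch:

```python
import numpy as np

rng = np.random.default_rng(0)
DESC_DIM, PATCH = 8, 256  # illustrative sizes, not from the paper

# Hypothetical linear "generator": description vector -> lesion patch.
W_g = rng.normal(scale=0.1, size=(DESC_DIM, PATCH))

def generate(desc):
    return np.tanh(desc @ W_g)

# Hypothetical "diagnosis net" heads:
# patch -> malignancy/realness score, patch -> predicted description.
W_m = rng.normal(scale=0.1, size=(PATCH, 1))
W_d = rng.normal(scale=0.1, size=(PATCH, DESC_DIM))

def diagnose(patch):
    mal = 1.0 / (1.0 + np.exp(-(patch @ W_m)))  # sigmoid score
    return mal, patch @ W_d

desc = rng.normal(size=(4, DESC_DIM))  # batch of description vectors
fake = generate(desc)                  # synthetic lesion patches
mal, desc_pred = diagnose(fake)

# Generator objective (sketch): fool the diagnosis net (push mal -> 1)
# while keeping the predicted description close to the conditioning one,
# which is what ties malignancy to the BI-RADS-style interpretation.
g_loss = -np.log(mal + 1e-8).mean() + ((desc_pred - desc) ** 2).mean()
```

In a full implementation both networks would be deep models trained alternately; the point here is only that the diagnosis network carries an interpretation head alongside the classification head.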
In this paper, new mass features based on inter-view similarity in DBT projection views are proposed for classifying masses, aiming to effectively reduce false positives (FPs). The proposed features focus on utilizing inter-view information in projection views. FPs induced by overlapping tissues at different depths can appear differently across projection views, whereas masses appear similar. To exploit this observation, an inter-view similarity measure is developed using the normalized cross-correlation between ROIs in projection views. An analysis of the inter-view similarities of masses and FPs shows that the inter-view similarities of FPs are lower than those of masses. Accordingly, new features are proposed to encode this difference in inter-view similarity between FPs and masses. Experimental results show that the proposed features can improve the mass classification performance in projection views in terms of the area under the ROC curve.
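The inter-view similarity measure above can be sketched as follows. This is a minimal illustration, assuming equally sized ROIs extracted at the candidate's corresponding location in each projection view; the function names and the pairwise-mean aggregation are assumptions, not the paper's exact feature definitions:

```python
import numpy as np

def ncc(roi_a: np.ndarray, roi_b: np.ndarray) -> float:
    """Normalized cross-correlation between two equally sized ROIs."""
    a = roi_a - roi_a.mean()
    b = roi_b - roi_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0.0:
        return 0.0  # flat ROI: no correlation defined
    return float((a * b).sum() / denom)

def inter_view_similarity(rois: list) -> float:
    """Mean pairwise NCC of a candidate's ROIs across projection views.
    Per the analysis above, masses tend to score higher than FPs caused
    by overlapping tissues at different depths."""
    scores = [ncc(rois[i], rois[j])
              for i in range(len(rois))
              for j in range(i + 1, len(rois))]
    return float(np.mean(scores))
```

A low value of `inter_view_similarity` would then serve as evidence that a candidate is an overlap-induced FP rather than a true mass.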
KEYWORDS: Digital breast tomosynthesis, Mammography, 3D image processing, Computer aided design, Information fusion, CAD systems, Digital mammography, Computer aided diagnosis and therapy, 3D image reconstruction
In this study, a novel mass detection framework that utilizes information from synthetic mammograms has been developed for detecting masses in digital breast tomosynthesis (DBT). Clinical studies have demonstrated that combining DBT with full-field digital mammography (FFDM) increases reader performance. To reduce the radiation dose of this combination, synthetic mammograms have been developed in previous research, and it has been demonstrated that a synthetic mammogram can substitute for FFDM when used with DBT. In this study, we investigate the feasibility of the combined approach of DBT and synthetic mammogram from the perspective of computer-aided detection (CAD). In our study, the synthetic mammogram is a two-dimensional image generated by selecting conspicuous voxels of the three-dimensional DBT volume. In the combined approach, the mass likelihood scores estimated for each mass candidate in the synthetic mammogram and in DBT are merged to differentiate masses from false positives (FPs). We compared the mass detection performance of the proposed combined approach against DBT alone. A clinical data set of 196 DBT volumes was used to evaluate the different detection schemes. The combined approach achieved sensitivities of 80% and 89% at 1.16 and 2.37 FPs per DBT volume; the DBT-alone approach achieved the same sensitivities at 1.61 and 3.46 FPs per DBT volume. Experimental results show that the combined approach achieves a statistically significant improvement (p = 0.002) over DBT alone. These results imply that the information fusion of synthetic mammogram and DBT is a promising approach to detecting masses in DBT.
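The score-merging step can be sketched as a simple per-candidate fusion. This is a minimal sketch under an assumed weighted-average rule; the weight `w` and the function name are hypothetical, as the abstract does not specify the exact fusion rule:

```python
def fuse_scores(score_dbt: float, score_synth: float, w: float = 0.5) -> float:
    """Merge the mass likelihood scores of a matched candidate from the
    DBT volume and the synthetic mammogram. w is a hypothetical mixing
    weight; w = 1.0 reduces to the DBT-alone score."""
    return w * score_dbt + (1.0 - w) * score_synth
```

A candidate strongly supported in both the DBT volume and the synthetic mammogram keeps a high fused score, while an FP visible in only one of the two is pulled down, which is the intuition behind the reported FP reduction at matched sensitivity.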