Proceedings Article | 10 April 2023
KEYWORDS: Radiology, Education and training, Data modeling, Machine learning, Performance modeling, Deep learning, Neural networks, Statistical modeling, Random forests, Visual process modeling
Follow-up imaging recommendations are often made in radiology reports. Failure to adhere to these recommendations can result in treatment delays, negative patient outcomes, additional medical issues, excessive testing, financial loss, and potential legal repercussions. In this paper, we present a generative-discriminative deep learning approach to classify radiology reports based on the presence of follow-up recommendations. The generative model produces a distributed representation of the words in radiology reports. The discriminative model extracts local features that can be used to identify follow-up recommendations. Three radiology report datasets were collected: one (n = 41417) with examinations extracted from the information system by regular expression operations, one (n = 14993) with examinations performed between January 1, 2015 and January 10, 2015 and annotated by three radiologists, and one (n = 5093) annotated by our radiologists at the time of dictation. Classification performance of the proposed hybrid model was compared with four traditional classification algorithms (random forest, logistic regression, naive Bayes, and support vector machine (SVM)) and three neural network-based models (fastText, convolutional neural network (CNN), and graph convolutional network (GCN)). A visualization algorithm was used to interpret the classification results of the hybrid model. Precision, recall, accuracy, and F1 scores were calculated for all models on the test set. Experimental results show that the hybrid model achieved a statistically significantly higher F1 score (0.942 for Hybrid-random and 0.951 for Hybrid-report) than the four traditional machine learning algorithms (random forest: 0.914, logistic regression: 0.838, naive Bayes: 0.808, and SVM: 0.890) and the best-performing neural network-based model (fastText: 0.918). Our model can accurately identify radiology reports that contain follow-up recommendations.
It can also automate the detection of follow-up recommendations, streamlining workflows in clinical settings and helping ensure timely and appropriate patient care.
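The generative-discriminative pairing described above can be sketched in miniature: a table of distributed word representations (the generative model's output) feeds a convolutional step whose filters respond to local n-gram patterns, and max-pooling keeps the strongest response per filter as a report-level feature. The snippet below is an illustrative sketch only, not the paper's implementation; the tiny vocabulary, random embedding table, and filter count are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vocabulary and embedding table; in the full approach the
# generative model learns these distributed representations from reports.
vocab = {"recommend": 0, "follow": 1, "up": 2, "ct": 3, "in": 4, "months": 5}
emb_dim = 8
embeddings = rng.normal(size=(len(vocab), emb_dim))

def embed(tokens):
    """Map report tokens to their distributed representations."""
    return np.stack([embeddings[vocab[t]] for t in tokens])

def conv_max_features(x, filters, width=2):
    """Discriminative step: 1-D convolutions over the token sequence
    extract local n-gram features; max-pooling keeps each filter's
    strongest response anywhere in the report."""
    n_tokens = x.shape[0]
    feats = []
    for w in filters:  # each filter w has shape (width, emb_dim)
        responses = [np.sum(x[i:i + width] * w)
                     for i in range(n_tokens - width + 1)]
        feats.append(max(responses))
    return np.array(feats)

tokens = ["recommend", "follow", "up", "ct", "in", "months"]
filters = [rng.normal(size=(2, emb_dim)) for _ in range(4)]
features = conv_max_features(embed(tokens), filters)
print(features.shape)  # one pooled feature per filter
```

The resulting fixed-length feature vector could then be passed to any classifier head to produce the follow-up/no-follow-up label.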