Paper
24 March 2016 Variability amongst radiographers in the categorization of clinical acceptability for digital trauma radiography
Abstract
Introduction: Radiographers evaluate anatomical structures to judge the clinical acceptability of a radiograph. Whether a radiograph is deemed acceptable for diagnosis depends on the individual decision of the radiographer, and these individual decisions cause variation in the accepted image quality. To minimise this variation, definitions of acceptability, such as those in RadLex, were developed. The criteria by which radiographers attribute a RadLex category to a radiograph are unknown. Insight into these criteria helps to further optimise the definitions and reduce variability in acceptance between radiographers. Therefore, this work aims to evaluate the correlation between the RadLex classification and the evaluation of anatomical structures, using Visual Grading Analysis (VGA).
Methods: Four radiographers evaluated the visibility of five anatomical structures in 25 lateral cervical spine radiographs on a secondary-class display using VGA. They judged the clinical acceptability of each radiograph using RadLex. Relations between the VGA score (VGAS) and the RadLex category were analysed with Kendall's tau correlation and the Nagelkerke pseudo-R².
Results: The overall VGAS and the RadLex score correlate strongly (rτ = 0.62, p < 0.01, R² = 0.72). The observers' evaluation of the contrast between bone, air (trachea) and soft tissue has low value in predicting the RadLex score (rτ = 0.55, p < 0.01, R² = 0.03). The reproduction of the spinous processes (rτ = 0.67, p < 0.01, R² = 0.31) and the evaluation of the exposure (rτ = 0.65, p < 0.01, R² = 0.56) correlate strongly with, and have high predictive value for, the RadLex score.
Conclusion: RadLex scores and VGAS correlate positively, strongly and significantly. The predictive value of bony structures may support their use in the judgement of clinical acceptability. Considerable inter-observer variation in the VGAS within a given RadLex category suggests that observers apply observer-specific cut-off values.
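The sketch below illustrates the type of analysis described in the abstract: a Kendall's tau correlation between VGA scores and RadLex judgements, and a Nagelkerke pseudo-R² derived from a logistic regression. It is a minimal, hypothetical example, not the study's analysis: the data values are invented, and the RadLex outcome is dichotomised into acceptable/unacceptable for simplicity, whereas the study treated RadLex as an ordinal category.

```python
# Hypothetical sketch only: variable names and data are illustrative, not the study's data.
import numpy as np
import statsmodels.api as sm
from scipy.stats import kendalltau

# Illustrative data: overall VGA score per radiograph and a dichotomised
# RadLex judgement (1 = acceptable, 0 = unacceptable).
vgas = np.array([12, 15, 9, 18, 14, 11, 17, 8, 16, 13, 10, 19, 7, 15, 12], dtype=float)
radlex_acceptable = np.array([1, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0, 1, 0, 1, 0])

# Kendall's tau between the ordinal VGA score and the RadLex judgement.
tau, p_value = kendalltau(vgas, radlex_acceptable)

# Logistic regression of the RadLex judgement on the VGA score.
X = sm.add_constant(vgas)
fit = sm.Logit(radlex_acceptable, X).fit(disp=False)

# Nagelkerke pseudo-R^2 from the fitted and null log-likelihoods:
# Cox-Snell R^2 rescaled to a maximum of 1.
n = len(radlex_acceptable)
r2_cox_snell = 1 - np.exp((2.0 / n) * (fit.llnull - fit.llf))
r2_nagelkerke = r2_cox_snell / (1 - np.exp((2.0 / n) * fit.llnull))

print(f"Kendall tau = {tau:.2f}, p = {p_value:.3f}, Nagelkerke R2 = {r2_nagelkerke:.2f}")
```

With an ordinal RadLex outcome, the same pseudo-R² can be computed by replacing the logistic regression with an ordinal (proportional odds) model and taking its fitted and intercept-only log-likelihoods.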
© (2016) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Robin Decoster, Rachel Toomey, Dirk Smits, Harrie Mol, Filip Verhelle, and Marie-Louise Butler "Variability amongst radiographers in the categorization of clinical acceptability for digital trauma radiography", Proc. SPIE 9787, Medical Imaging 2016: Image Perception, Observer Performance, and Technology Assessment, 97871I (24 March 2016); https://doi.org/10.1117/12.2216487
KEYWORDS
Radiography, Image quality, Diagnostics, Visualization, Visual analytics, Image processing, Medical imaging