Ultraviolet photoacoustic remote sensing microscopy provides label-free optical absorption contrast comparable to hematoxylin staining. This has been combined with 266 nm optical scattering microscopy, which offers eosin-like contrast. Here, we apply unsupervised deep learning-based style transfer (CycleGAN) to render these pseudo-colored virtual histological images of unstained human and murine tissue specimens in a realistic stain style comparable to the H&E gold standard. A multi-pathologist diagnostic concordance study found a sensitivity of 89%, specificity of 91%, and accuracy of 90%. A blinded subjective stain quality survey suggested virtual histology was preferred over frozen sections at the 95% confidence level.
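For reference, the reported concordance figures follow from the standard confusion-matrix definitions. The sketch below shows the calculation; the counts used are illustrative placeholders, not the study's data.

```python
# Hedged sketch: sensitivity, specificity, and accuracy from a confusion matrix.
# The example counts are placeholders chosen only to illustrate the formulas.

def diagnostic_metrics(tp, fp, tn, fn):
    sensitivity = tp / (tp + fn)                 # true positive rate
    specificity = tn / (tn + fp)                 # true negative rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)   # overall agreement
    return sensitivity, specificity, accuracy

sens, spec, acc = diagnostic_metrics(tp=89, fp=9, tn=91, fn=11)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, accuracy={acc:.2f}")
```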
There is currently no method for rapidly obtaining realistic, label-free virtual histopathological images of tissue. We previously introduced ultraviolet photoacoustic remote sensing microscopy as a method to obtain virtual hematoxylin contrast, albeit without the ability to obtain virtual eosin contrast. By utilizing UV scattering as a high-resolution eosin channel, we produce complete H&E-like virtual histology of unstained human breast lumpectomy specimen sections. By further leveraging a novel colormap matching algorithm with this UV scattering channel, we generate H&E-like output that shows strong concordance with true H&E-stained adjacent sections, demonstrating promising diagnostic utility.
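To make the two-channel pseudo-coloring concrete, the sketch below maps an absorption (hematoxylin-like) channel and a scattering (eosin-like) channel to an H&E-like RGB image using a generic Beer-Lambert-style virtual staining model. This is an illustrative stand-in, not the paper's colormap matching algorithm; the stain color vectors and weights are assumed values.

```python
import numpy as np

# Approximate RGB optical-density vectors for hematoxylin and eosin
# (illustrative values, not fitted to the paper's data).
HEMATOXYLIN_RGB = np.array([0.65, 0.70, 0.29])
EOSIN_RGB = np.array([0.07, 0.99, 0.11])

def pseudo_he(absorption, scattering, k_h=1.5, k_e=1.0):
    """Map two normalized grayscale channels to an H&E-like RGB image."""
    # Combine per-channel optical densities weighted by assumed stain strengths.
    od = (k_h * absorption[..., None] * HEMATOXYLIN_RGB
          + k_e * scattering[..., None] * EOSIN_RGB)
    rgb = np.exp(-od)  # Beer-Lambert: transmitted intensity from optical density
    return np.clip(rgb, 0.0, 1.0)

# Usage with random placeholder images in [0, 1]:
absorption = np.random.rand(256, 256)   # UV-PARS (nuclear) channel
scattering = np.random.rand(256, 256)   # UV scattering (cytoplasm) channel
he_like = pseudo_he(absorption, scattering)   # shape (256, 256, 3)
```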
Following resection of cancerous tissues, specimens are excised from the surgical margins and examined post-operatively for the presence of residual cancer cells. Hematoxylin and eosin (H&E) staining is the gold standard of histopathological assessment. Ultraviolet photoacoustic remote sensing (UV-PARS) microscopy, combined with scattering microscopy, provides virtual nuclear and cytoplasmic contrast similar to H&E staining. A generative adversarial network (GAN) deep learning approach, specifically a CycleGAN, was used to perform style transfer and improve the histological realism of UV-PARS-generated images. Post-CycleGAN images are easier for a pathologist to examine and can be input into existing machine learning pipelines for H&E-stained images.
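The following is a minimal sketch of the CycleGAN idea used for unpaired style transfer between a virtual-histology domain (A) and a true H&E domain (B): two generators are trained with adversarial losses plus a cycle-consistency term. Network sizes, loss weights, and the omitted discriminator update are placeholder choices for illustration and do not reproduce the paper's actual architecture or training details.

```python
import torch
import torch.nn as nn

def small_generator():
    # Toy fully convolutional generator (placeholder for a ResNet/U-Net generator).
    return nn.Sequential(
        nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 3, 3, padding=1), nn.Tanh(),
    )

def small_discriminator():
    # Toy PatchGAN-style discriminator producing a map of real/fake scores.
    return nn.Sequential(
        nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.Conv2d(64, 1, 4, padding=1),
    )

G_ab, G_ba = small_generator(), small_generator()   # A->B and B->A mappings
D_a, D_b = small_discriminator(), small_discriminator()

opt_g = torch.optim.Adam(list(G_ab.parameters()) + list(G_ba.parameters()), lr=2e-4)
adv_loss, cyc_loss = nn.MSELoss(), nn.L1Loss()

def generator_step(real_a, real_b, lambda_cyc=10.0):
    """One generator update combining adversarial and cycle-consistency losses."""
    fake_b = G_ab(real_a)   # virtual histology styled as H&E
    fake_a = G_ba(real_b)   # H&E styled as virtual histology
    pred_b, pred_a = D_b(fake_b), D_a(fake_a)
    # Adversarial terms: fool the discriminators of the target domains.
    loss_adv = (adv_loss(pred_b, torch.ones_like(pred_b))
                + adv_loss(pred_a, torch.ones_like(pred_a)))
    # Cycle consistency: A -> B -> A and B -> A -> B should reconstruct the inputs.
    loss_cyc = cyc_loss(G_ba(fake_b), real_a) + cyc_loss(G_ab(fake_a), real_b)
    loss = loss_adv + lambda_cyc * loss_cyc
    opt_g.zero_grad()
    loss.backward()
    opt_g.step()
    return loss.item()

# Example with random placeholder image batches scaled to [-1, 1]:
loss = generator_step(torch.rand(1, 3, 64, 64) * 2 - 1,
                      torch.rand(1, 3, 64, 64) * 2 - 1)
```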