We present a virtual staining framework that can rapidly stain defocused autofluorescence images of label-free tissue, matching the performance of standard virtual staining models that use in-focus unlabeled images. We trained and blindly tested this deep learning-based framework using human lung tissue. Using coarsely focused autofluorescence images acquired with 4× fewer focus points and 2× lower focusing precision, we achieved performance equivalent to that of standard virtual staining using finely focused autofluorescence input images. This yielded a ~32% decrease in the total image acquisition time needed for virtual staining of a label-free whole-slide image, alongside an ~89% decrease in the autofocusing time.
We present a deep learning-based framework that uses cascaded deep neural networks to virtually transfer images of H&E-stained tissue to other stain types. This method, termed C-DNN, was trained in a cascaded manner: label-free autofluorescence images were fed to the first generator as input and transformed into H&E-stained images; these virtually stained H&E images were then transformed into Periodic acid–Schiff (PAS)-stained images by the second generator. We trained and tested the C-DNN on kidney needle-core biopsy tissue, and its output images showed better color accuracy and higher contrast for various histological features compared with other stain-transfer models.
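As a rough illustration of the cascaded inference described above (not the authors' code), the data flow can be sketched with hypothetical placeholder functions standing in for the two trained generators, which in the actual C-DNN are deep convolutional networks:

```python
import numpy as np

# Placeholder for the first trained generator: label-free autofluorescence -> virtual H&E.
def generator_af_to_he(autofluorescence):
    # Stand-in transform: lift a single-channel autofluorescence image to 3-channel RGB.
    return np.repeat(autofluorescence[..., None], 3, axis=-1)

# Placeholder for the second trained generator: virtual H&E -> virtual PAS.
def generator_he_to_pas(virtual_he):
    # Stand-in transform: identity, in place of the learned H&E-to-PAS mapping.
    return virtual_he.copy()

def cascaded_virtual_stain(autofluorescence):
    """Run the two generators in cascade: autofluorescence -> H&E -> PAS."""
    virtual_he = generator_af_to_he(autofluorescence)
    virtual_pas = generator_he_to_pas(virtual_he)
    return virtual_he, virtual_pas

af = np.random.rand(256, 256)       # one label-free autofluorescence band
he, pas = cascaded_virtual_stain(af)
print(he.shape, pas.shape)          # (256, 256, 3) (256, 256, 3)
```

The cascade structure means the second generator only ever sees H&E-domain inputs, whether real or virtually stained, which is the property the abstract's training scheme exploits.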
We present a stain-free, rapid, and automated viral plaque assay using deep learning and holography, which needs significantly less sample incubation time than traditional plaque assays. A portable and cost-effective lens-free imaging prototype was built to record the spatio-temporal features of the plaque-forming units (PFUs) during their growth, without the need for staining. Our system detected the first cell lysing events as early as 5 hours of incubation and achieved >90% PFU detection rate with 100% specificity in <20 hours, saving >24 hours compared to the traditional viral plaque assays that take ≥48 hours.
We present a high-throughput and automated system for the early detection and classification of bacterial colony-forming units (CFUs) using a thin-film transistor (TFT) image sensor. A lens-free imager was built using the TFT sensor with a ~7 cm² field-of-view to collect the time-lapse images of bacterial colonies. Two trained neural networks were used to detect and classify the bacterial colonies based on their spatio-temporal features. Our system achieved an average CFU detection rate of 97.3% at 9 hours of incubation and an average CFU recovery rate of 91.6% at ~12 hours, saving ~12 hours compared to the EPA-approved method.
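The two-network pipeline described above can be sketched as a detect-then-classify loop over a time-lapse image stack. The functions below are hypothetical stand-ins (not the published networks): the "detector" simply thresholds frame-to-frame growth, and the "classifier" returns a deterministic label from the per-pixel time trace.

```python
import numpy as np

def detect_colonies(time_lapse, threshold=0.5):
    # Stand-in detector: flag pixels whose intensity grew between the
    # first and last frames, mimicking detection of growing colonies.
    growth = time_lapse[-1] - time_lapse[0]
    return np.argwhere(growth > threshold)   # (row, col) candidate locations

def classify_colony(time_lapse, location, n_classes=3):
    # Stand-in classifier: derive a label from the spatio-temporal trace
    # at the detected location (a real system would use a trained network).
    r, c = location
    trace = time_lapse[:, r, c]
    return int(trace.sum() * 10) % n_classes

# Synthetic 5-frame time-lapse with one "colony" appearing at (4, 4).
stack = np.zeros((5, 8, 8))
stack[-1, 4, 4] = 1.0

for loc in detect_colonies(stack):
    label = classify_colony(stack, loc)
    print(tuple(int(v) for v in loc), label)   # → (4, 4) 1
```

Splitting detection and classification into two models, as in the abstract, lets the detector run on every frame cheaply while the heavier classifier is invoked only on candidate colonies.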
We present a virtual immunohistochemical (IHC) staining method based on label-free autofluorescence imaging and deep learning. Using a trained neural network, we transform multi-band autofluorescence images of unstained tissue sections into bright-field-equivalent HER2 images, matching the microscopic images captured after standard IHC staining of the same tissue sections. Blind evaluations of HER2 scores by three pathologists, based on virtually stained and IHC-stained whole-slide images, revealed statistically equivalent diagnostic value for the two methods. This virtual HER2 staining method provides a rapid, accurate, and low-cost alternative to standard IHC staining and allows tissue preservation.
Immunohistochemical (IHC) staining of the human epidermal growth factor receptor 2 (HER2) is routinely performed on breast cancer cases to guide immunotherapies and help predict the prognosis of breast tumors. We present a label-free virtual HER2 staining method enabled by deep learning as an alternative digital staining approach. Our blinded, quantitative analysis, based on the evaluations of three board-certified breast pathologists, revealed that HER2 scoring based on virtually stained HER2 whole-slide images (WSIs) is as accurate as scoring based on standard IHC-stained WSIs. This virtual HER2 staining can be extended to other IHC biomarkers to significantly improve disease diagnostics and prognostics.