Chromatic confocal microscopy is an innovative method for acquiring axial information simultaneously, but it suffers from limited imaging depth. We designed a chromatic objective with an 800 µm chromatic shift and manufactured it by precision diamond turning. With this objective, our chromatic confocal system captures axial information over a large depth range simultaneously, without stage scanning, while maintaining a lateral resolution of 780 µm. Biological tissues were imaged with this microscope to evaluate its performance.
KEYWORDS: Education and training, Tumors, Image classification, Deep learning, Data modeling, Multiphoton microscopy, Machine learning, Second harmonic generation, RGB color model, Tissues
Pancreatic neuroendocrine tumors (PNETs) present significant diagnostic and therapeutic challenges due to their heterogeneity and complex nature as a subtype of pancreatic cancer. The treatment approach varies considerably with the tumor's location, grading, and focality. Accurate prognosis and management typically require a pathologist to evaluate histological slides of the tissue, a process that is often time-consuming and labor-intensive. Point-of-care techniques for automatic classification of PNETs would greatly improve the ability to treat and manage this disease by providing real-time decision-making information. In response to these challenges, our study introduces an efficient and versatile diagnostic strategy that integrates label-free multiphoton microscopy with fine-tuned, pre-trained deep learning models optimized for performance even with limited data. We fine-tuned four pre-trained convolutional neural networks on a dataset of only 49 images comprising both two-photon excitation fluorescence and second-harmonic generation imaging. This approach achieved an average classification accuracy of over 95% on the development dataset and more than 90% on the test dataset. These results compare favorably with conventional diagnostic modalities such as ultrasound (US) and computed tomography (CT), whose preoperative misdiagnosis rates stand at 81.8% and 61.5%, respectively. This methodology represents a significant advancement in the diagnostic process for PNETs, promising a more streamlined, rapid, and accurate path to treatment. Furthermore, it has substantial potential for the automated classification of other tumor types from multiphoton microscopic imaging, even in scenarios with limited data availability.
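The abstract does not specify the networks, data splits, or training hyperparameters, so the following is only a minimal sketch of the fine-tuning workflow it describes, assuming PyTorch/torchvision, a ResNet-18 backbone as one illustrative pre-trained CNN, and a hypothetical ImageFolder-style directory of RGB multiphoton (TPEF/SHG) images; all paths and hyperparameters are placeholders, not values from the paper.

```python
# Sketch: fine-tune an ImageNet-pre-trained CNN on a small multiphoton image set.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms
from torch.utils.data import DataLoader

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Standard ImageNet preprocessing so the pre-trained weights transfer cleanly.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical layout: data/train/<class_name>/*.png (one folder per tumor class).
train_set = datasets.ImageFolder("data/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=8, shuffle=True)

# Load ImageNet weights and replace the classifier head for the tissue classes.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Short fine-tuning loop; with only tens of images, a few epochs are typical.
for epoch in range(10):
    model.train()
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

In practice the same loop would be repeated for each of the four pre-trained architectures, with held-out development and test images used to report the accuracies quoted above.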