Image segmentation is a primary area in which deep learning has made a major contribution to medical image analysis. The automatic and precise segmentation of cells in cytopathology, or cytology for short, can significantly reduce the diagnostic workload of pathologists. Biomedical image segmentation routinely employs an encoder-decoder structure, e.g., U-Net, in which the receptive field is fixed. However, we empirically found that, to achieve better morphological segmentation performance, the receptive field should be correlated with cell size through differentiated structures. In this paper, we propose a novel deep-learning-based cytology image segmentation model, namely CellSegNet. The model dynamically catalogs cells by their size and routes them to corresponding lightweight structures, characterized by weighted multiple receptive fields, for better feature extraction. The proposed model outperforms other state-of-the-art biomedical image segmentation networks with observable improvements. Moreover, owing to its high interpretability, the proposed model can be flexibly extended to other cytology datasets. The source code used in the experiments and part of our collection of cervical images are publicly available at https://github.com/SJTU-AI-GPU/CellSegNet.
Federated learning (FL) has popularized the multi-centric, collaborative integration of decentralized data across multiple institutions to train a robust model. Yet, data heterogeneity among the distributed clients remains a major challenge in FL. Specifically, in the case of digital histology, stain variation is commonplace. In a collaborative setting, color normalization and data augmentation are adopted to alleviate the variation, but they cannot be directly applied in the FL paradigm, as they require data sharing or the availability of proxy samples. To address this issue, we propose a novel personalized federated learning (PFL) approach: personalized stain transfer layers learn to project stain-variant input images into a homogeneous space before they are fed to the FL backbone. The proposed method is simple yet efficient. Empirical results on publicly available large patient cohorts demonstrate an observable classification accuracy improvement on popular neural network architectures, including ResNet, VGG, and Wide ResNet.
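
As an illustration of the personalization idea, the following PyTorch sketch keeps a small stain-transfer head local to each client while only the shared backbone participates in federated averaging. The module names and the FedAvg loop are hypothetical assumptions for exposition, not the paper's implementation.

import torch
import torch.nn as nn
import torchvision.models as models

class PersonalizedClient(nn.Module):
    def __init__(self, num_classes):
        super().__init__()
        # Per-client stain transfer head: stays local and is never averaged.
        # 1x1 convolutions project stain-variant RGB input into a (roughly)
        # stain-homogeneous space.
        self.stain_transfer = nn.Sequential(
            nn.Conv2d(3, 3, kernel_size=1), nn.ReLU(inplace=True),
            nn.Conv2d(3, 3, kernel_size=1),
        )
        # Shared backbone: its weights participate in federated averaging.
        self.backbone = models.resnet18(num_classes=num_classes)

    def forward(self, x):
        return self.backbone(self.stain_transfer(x))

def federated_average(clients):
    # FedAvg over backbone weights only; stain-transfer layers stay local,
    # so each client keeps its own mapping for its own staining protocol.
    states = [c.backbone.state_dict() for c in clients]
    avg = {}
    for k in states[0]:
        stacked = torch.stack([s[k].float() for s in states])
        avg[k] = stacked.mean(dim=0).to(states[0][k].dtype)
    for c in clients:
        c.backbone.load_state_dict(avg)

In a training round, each client would fit both parts on its local slides; the server then calls federated_average, touching only the shared backbone.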
Computational histopathology algorithms can interpret very large volumes of data, guiding pathologists to assess slides promptly and aiding in the localization and quantification of abnormal cells or tissues. In recent years, deep learning has replaced conventional image processing methods as the mainstream methodology for interpreting cancer pathology images. However, as with conventional computer vision methods, stain normalization remains essential for diagnostic accuracy in tissue identification with convolutional neural networks (CNNs). Traditional prior-knowledge-oriented color matching, as well as purely learning-based style transfer with generative adversarial networks, may suffer a decrease in accuracy when many data centers are involved. In this paper, we propose a novel color normalization method based on a conditional generative adversarial network (cGAN). It is a learning-based interpolation approach over the probability distribution space, trained on multiple datasets. The target template is designed to be label-dependent to overcome the improper color mapping caused by data heterogeneity. Tests performed on histopathology datasets from The Cancer Genome Atlas (TCGA) show that the proposed method outperforms previous works in classification accuracy. This approach has potential in clinical practice for better recognition of cancer in digital pathology and can be implemented in a decentralized setting.
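
For intuition on label-dependent conditioning, the sketch below shows a hypothetical minimal conditional generator in PyTorch that concatenates a class-label embedding with the input image, so each tissue class can be mapped toward its own color template. This is an assumed toy architecture for illustration, not the paper's cGAN.

import torch
import torch.nn as nn

class ConditionalColorGenerator(nn.Module):
    def __init__(self, num_classes, embed_dim=16):
        super().__init__()
        # The tissue label conditions the color mapping, steering each
        # class toward its own label-dependent target template.
        self.embed = nn.Embedding(num_classes, embed_dim)
        self.net = nn.Sequential(
            nn.Conv2d(3 + embed_dim, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 3, 3, padding=1), nn.Tanh(),  # output in [-1, 1]
        )

    def forward(self, img, label):
        # Broadcast the label embedding over the spatial dimensions and
        # concatenate it with the image along the channel axis.
        b, _, h, w = img.shape
        cond = self.embed(label).view(b, -1, 1, 1).expand(b, -1, h, w)
        return self.net(torch.cat([img, cond], dim=1))

# Example: normalize two 96x96 patches with class labels 0 and 1.
g = ConditionalColorGenerator(num_classes=2)
normalized = g(torch.randn(2, 3, 96, 96), torch.tensor([0, 1]))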