Real-time localization and classification for digital microscopy using single-shot convolutional neural networks
20 August 2020
Martin Fränzl, Frank Cichos
Abstract
We present an adapted single-shot neural network architecture (YOLO) for the real-time localization and classification of particles in optical microscopy. Our work is aimed at the real-time manipulation of microscopic objects via a feedback loop. The network is implemented in Python/Keras using the TensorFlow backend. The trained model is then exported to a GPU-supported C library for real-time inference that is readily integrable into other programming languages such as C++ and LabVIEW. It is capable of localizing and classifying several hundred microscopic objects even at very low signal-to-noise ratios, processing images as large as 416 × 416 pixels with an inference time of about 10 ms. We demonstrate real-time detection for tracking and manipulating active particles of different types. Symmetric active particles, as well as Janus particles propelled by self-thermophoretic, laser-induced processes, are identified and controlled via a photon-nudging procedure developed in our group.
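To make the described workflow more concrete, the sketch below shows how a single-shot, grid-based detection network of this kind might be set up in Keras/TensorFlow and exported for inference from compiled code. The input size (416 × 416) follows the abstract; the grid size, layer widths, number of particle classes, and prediction layout are illustrative assumptions, not the architecture used in the paper.

```python
# Minimal sketch of a YOLO-style single-shot detector for particle images.
# Assumptions (not from the paper): single-channel 416 x 416 input, a
# 13 x 13 output grid, and each cell predicting (x, y, objectness, class scores).
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 2                   # e.g. symmetric vs. Janus particles (assumed)
PREDICTIONS = 3 + NUM_CLASSES     # x, y, objectness + class scores per grid cell

def build_detector():
    inputs = layers.Input(shape=(416, 416, 1))
    x = inputs
    # Simple convolutional backbone: five stride-2 reductions, 416 -> 13.
    for filters in (16, 32, 64, 128, 256):
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
        x = layers.MaxPooling2D(2)(x)
    # Detection head: one prediction vector per grid cell -> (13, 13, PREDICTIONS).
    outputs = layers.Conv2D(PREDICTIONS, 1, activation=None)(x)
    return models.Model(inputs, outputs)

model = build_detector()
model.summary()

# Export the trained model (SavedModel format in TF 2.x); the saved graph can
# then be loaded from C/C++ (e.g. via the TensorFlow C API) or wrapped for
# LabVIEW to run real-time inference in a feedback loop.
model.save("particle_detector")
```

In such a layout, each grid cell is responsible for particles whose centers fall inside it, which is what allows a single forward pass to localize and classify many objects at once.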
© (2020) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Martin Fränzl and Frank Cichos "Real-time localization and classification for digital microscopy using single-shot convolutional neural networks", Proc. SPIE 11469, Emerging Topics in Artificial Intelligence 2020, 114691G (20 August 2020); https://doi.org/10.1117/12.2568368
KEYWORDS
Microscopy, Convolutional neural networks, Image processing, LabVIEW, Signal-to-noise ratio, Computer programming languages, Feedback control