Microscopic cell image analysis is indispensable to cell biology. Cell images are easily degraded by optical diffraction or focus shift, which lowers the signal-to-noise ratio (SNR) and image quality and thus affects the accuracy of cell analysis and identification. For quantitative analysis of cell images, restoring blurred images to improve the SNR is the first step. A parameter estimation method for defocused microscopic cell images is proposed, based on the power-law properties of the power spectrum of cell images. The circular Radon transform (CRT) is used to identify the zero mode of the power spectrum. The parameter of the CRT curve is first estimated by an improved differential evolution algorithm and then refined by gradient descent. Synthetic experiments confirmed that the proposed method effectively increases the peak SNR (PSNR) of the recovered images with high accuracy. Furthermore, experiments on real microscopic cell images verified the superiority of the proposed parameter estimation method over other methods in terms of qualitative visual quality as well as quantitative gradient and PSNR.
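The core idea can be illustrated with a much simpler stand-in: the sketch below replaces the CRT plus differential-evolution/gradient-descent search with a plain circular average of the power spectrum and a first-local-minimum scan, and all function names are hypothetical. It estimates the radius of a disk (defocus) blur from the first zero ring of the power spectrum.

```python
import numpy as np

def circular_average(spec):
    # average the spectrum over circles of integer radius about the center
    h, w = spec.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - h // 2, x - w // 2).astype(int)
    sums = np.bincount(r.ravel(), weights=spec.ravel())
    counts = np.bincount(r.ravel())
    return sums / np.maximum(counts, 1)

def estimate_defocus_radius(img):
    # radially averaged log power spectrum
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    prof = np.log1p(circular_average(spec))
    # the first local minimum beyond DC marks the first zero ring of the defocus OTF
    k = 3
    while k < len(prof) - 1 and not (prof[k] < prof[k - 1] and prof[k] <= prof[k + 1]):
        k += 1
    # for a disk PSF of radius R, the first OTF zero sits near 0.61 / R cycles
    # per pixel (first zero of 2*J1(x)/x), so R ~ 0.61 * N / k for an N-pixel image
    return 0.61 * min(img.shape) / k
```

For a 128×128 impulse blurred by a disk of radius 8, the first zero ring falls near frequency index 10, so the estimate lands close to the true radius; the paper's DE initialization and gradient-descent refinement would then polish such a coarse estimate.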
Machine vision measurement systems are being adopted rapidly in industry for their non-contact, high-speed, and automated operation. However, nonlinear distortions in the images are critical to measurement precision, because object dimensions are determined from image properties. This problem has attracted wide interest, and several correction methods based on physical models have been proposed and are widely applied in engineering. However, these methods are difficult to realize on the shop floor, because the images suffer non-repetitive interference from coupled dynamic factors; real imaging is therefore a stochastic process. A new nonlinear distortion correction method based on a VNAR model (a Volterra-series-based nonlinear auto-regressive time series model) is proposed to describe the distorted image edge series. The model parameter vectors are estimated from the data. Distortion-free edges are obtained after model filtering, and the image dimensions are converted to measured dimensions. Experimental results show that the method is reliable and applicable to engineering practice.
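A second-order Volterra NAR predictor of the kind the abstract names can be sketched as below; the model order `p`, the plain least-squares fit, and the function names are illustrative assumptions, not the authors' estimation procedure.

```python
import numpy as np

def volterra_features(lags):
    # linear lag terms plus all second-order lag products (Volterra kernels up to order 2)
    p = len(lags)
    quad = [lags[i] * lags[j] for i in range(p) for j in range(i, p)]
    return np.concatenate([lags, quad])

def fit_vnar(y, p=4):
    # least-squares estimate of the VNAR parameter vector from an edge series y
    X = np.array([volterra_features(y[n - p:n][::-1]) for n in range(p, len(y))])
    theta, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return theta

def filter_vnar(y, theta, p=4):
    # one-step-ahead model output; deviations from y flag distortion
    out = list(y[:p])
    for n in range(p, len(y)):
        out.append(float(volterra_features(y[n - p:n][::-1]) @ theta))
    return np.array(out)
```

On a smooth drifting edge series (here a ramp plus a sinusoid, which an order-4 model captures exactly), the fitted model reproduces the series, so large one-step residuals on real data would indicate distortion to be filtered out.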
Machine vision is used to assess the surface roughness of workpieces under different ambient light conditions. Unlike the traditional stylus technique, the machine vision method is a non-contact, non-destructive surface inspection method and therefore holds promise for manufacturing applications. The effect of ambient light on the gray-level distribution parameters was first analyzed. Then a new method was proposed to represent different roughness values: it counts the pixels whose gray value exceeds the sum of the Otsu threshold and a given constant. Finally, experiments were conducted to compare the proposed method with two others on surfaces of different roughness under different conditions: the first is based on the standard deviation and root-mean-square height of the gray-level distribution, as proposed by Luk, and the second is based on the gray-level co-occurrence matrix. The parameters, such as the constant of the proposed method and the distance and angle of the gray-level co-occurrence matrix, were optimized to obtain better inspection performance. Results show that the sum of variances between the inspected values of the proposed method and the corresponding real surface roughness is less than half that of the other two methods.
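The counting scheme can be sketched as follows; the Otsu implementation is the standard histogram-based one, while `roughness_index` and the offset `c` are illustrative stand-ins for the paper's tuned constant.

```python
import numpy as np

def otsu_threshold(img):
    # classic Otsu: pick the gray level that maximizes between-class variance
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    w_b = sum_b = 0.0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_b += hist[t]
        w_f = total - w_b
        if w_b == 0:
            continue
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b
        m_f = (sum_all - sum_b) / w_f
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def roughness_index(img, c=10):
    # fraction of pixels brighter than (Otsu threshold + c); rougher surfaces
    # scatter more light into bright speckle, raising this count
    return float(np.mean(img > otsu_threshold(img) + c))
```

On a synthetic two-level image (half gray value 50, half 200), the threshold separates the two modes and exactly the bright half exceeds threshold + c.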