Field-programmable gate arrays (FPGAs) are widely used as the primary processing platform for real-time image processing in binocular vision systems. Distortion correction is an important component of binocular stereo vision systems. When a real-time image distortion correction algorithm is implemented on an FPGA, problems such as insufficient on-chip storage and the high complexity of the coordinate correction calculation arise. These problems are analyzed in detail in this study. On the basis of the reverse mapping method, a distortion correction algorithm that uses a lookup table (LUT) is proposed. A compression-with-restoration scheme is established for this LUT to reduce its storage footprint, and corresponding caching methods for the LUT and the image data are designed. The algorithm is verified on our binocular stereo vision system based on the Xilinx Zynq-7020. Experiments show that the proposed algorithm achieves real-time, high-precision distortion correction of grayscale images while significantly reducing on-chip resource consumption, which is sufficient to meet the requirements of an accurate binocular stereo vision system.
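A minimal software sketch of the reverse-mapping LUT idea is given below, assuming a standard radial-tangential distortion model; the coarse-grid compression with bilinear restoration, the grid spacing, and all function names are illustrative assumptions and do not reproduce the paper's FPGA compression scheme or caching design.

```python
# Hypothetical software sketch of reverse-mapping distortion correction with a
# compressed lookup table (LUT). Grid spacing, distortion coefficients, and
# function names are illustrative assumptions, not the paper's FPGA design.
import numpy as np

def build_lut(h, w, K, dist):
    """For every corrected (undistorted) pixel, compute the distorted source
    coordinate via the standard radial/tangential model (reverse mapping)."""
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    k1, k2, p1, p2 = dist
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) / fx                      # normalized ideal coordinates
    y = (v - cy) / fy
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 ** 2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return np.stack([xd * fx + cx, yd * fy + cy], axis=-1).astype(np.float32)

def compress_lut(lut, step=8):
    """Keep only every `step`-th LUT entry (coarse grid) to cut storage."""
    return lut[::step, ::step].copy()

def restore_lut(coarse, h, w, step=8):
    """Restore a full-resolution LUT from the coarse grid by bilinear
    interpolation -- the compression-with-restoration idea in software."""
    gh, gw = coarse.shape[:2]
    vi = np.minimum(np.arange(h) / step, gh - 1.001)
    ui = np.minimum(np.arange(w) / step, gw - 1.001)
    v0, u0 = vi.astype(int), ui.astype(int)
    fv, fu = (vi - v0)[:, None, None], (ui - u0)[None, :, None]
    c00 = coarse[v0][:, u0]; c01 = coarse[v0][:, u0 + 1]
    c10 = coarse[v0 + 1][:, u0]; c11 = coarse[v0 + 1][:, u0 + 1]
    return (c00 * (1 - fv) * (1 - fu) + c01 * (1 - fv) * fu +
            c10 * fv * (1 - fu) + c11 * fv * fu)

def correct(img, lut):
    """Sample the distorted grayscale image at the LUT coordinates
    (nearest-neighbour interpolation for simplicity)."""
    h, w = img.shape
    src = np.rint(lut).astype(int)
    src[..., 0] = np.clip(src[..., 0], 0, w - 1)
    src[..., 1] = np.clip(src[..., 1], 0, h - 1)
    return img[src[..., 1], src[..., 0]]
```

Storing only the coarse grid and restoring rows on the fly is one way the on-chip LUT footprint can be reduced in software; the paper's hardware-oriented compression and cache organization may differ.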
The accuracy of ultrasound/computed tomography (CT) image registration is key to ultrasound-guided intervention. The aim of this study is therefore to address the limited ability of current image similarity measures to evaluate the accuracy of ultrasound/CT image registration correctly. An ultrasound/CT image registration method based on simulated transformation optimization is presented. The approach first preprocesses the ultrasound/CT images with the tensor principal component analysis method to reduce the influence of noise on registration accuracy, and a multiscale enhancement algorithm is adopted to enhance the tubular structures in the CT images. Simulated transformation optimization based on the CT images is then performed: given estimates of the ultrasonic imaging parameters, the method extracts a CT section and simulates the corresponding ultrasonic image. This ultrasonic simulation is incorporated into the image similarity measure to establish the Correlation of Simulation Transformation measure, and the transformation matrix is optimized with a conjugate direction acceleration algorithm to achieve fast and accurate ultrasound/CT registration. Experimental results demonstrate that when the Correlation of Simulation Transformation is employed as the similarity measure, the variation range of the six parameters in the transformation matrix is ±0.01, and the proposed method can rapidly and accurately register ultrasound/CT images. Accurate registration enables real-time ultrasonic images to be combined with preoperative CT images, so the method has the potential to be used for ultrasound-guided surgical navigation in clinical practice.
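The sketch below illustrates the simulated-transformation registration loop under stated assumptions: `extract_ct_slice` and `simulate_ultrasound` are hypothetical placeholders for the CT resampling and ultrasound simulation steps, normalized cross-correlation stands in for the paper's Correlation of Simulation Transformation measure, and SciPy's Powell (conjugate-direction) optimizer stands in for the conjugate direction acceleration algorithm. The real ultrasound image is assumed to be resampled to the same grid as the extracted CT section.

```python
# Hedged sketch of registration by simulated-transformation correlation.
import numpy as np
from scipy.optimize import minimize
from scipy.ndimage import affine_transform, gaussian_filter

def rigid_matrix(params):
    """Six rigid parameters (3 rotations in radians, 3 translations in voxels)
    -> 3x3 rotation matrix and offset vector."""
    rx, ry, rz, tx, ty, tz = params
    cx_, sx = np.cos(rx), np.sin(rx)
    cy_, sy = np.cos(ry), np.sin(ry)
    cz_, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx_, -sx], [0, sx, cx_]])
    Ry = np.array([[cy_, 0, sy], [0, 1, 0], [-sy, 0, cy_]])
    Rz = np.array([[cz_, -sz, 0], [sz, cz_, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx, np.array([tx, ty, tz])

def extract_ct_slice(ct_volume, params):
    """Resample the CT volume under the rigid transform and take its central
    slice as the section imaged by the ultrasound probe (simplification)."""
    R, t = rigid_matrix(params)
    moved = affine_transform(ct_volume, R, offset=t, order=1)
    return moved[moved.shape[0] // 2]

def simulate_ultrasound(ct_slice):
    """Crude stand-in for ultrasound simulation: gradient magnitude (strong
    reflections at tissue interfaces) plus a smoothed tissue term."""
    gy, gx = np.gradient(ct_slice.astype(np.float64))
    return gaussian_filter(np.hypot(gx, gy), 1.0) + 0.1 * ct_slice

def ncc(a, b):
    """Normalized cross-correlation between two images of equal shape."""
    a = a - a.mean(); b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
    return (a * b).sum() / denom

def register(us_image, ct_volume, init=np.zeros(6)):
    """Maximize correlation between the real ultrasound image and the
    ultrasound simulated from the transformed CT, using Powell's
    conjugate-direction method."""
    cost = lambda p: -ncc(us_image,
                          simulate_ultrasound(extract_ct_slice(ct_volume, p)))
    return minimize(cost, init, method="Powell").x
```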
Accurate segmentation of the vertebrae in the spine is an essential prerequisite for many applications, including image-based spine assessment, surgical planning, clinical diagnosis and treatment, and biomechanical modeling. In this paper, we present a stacked sparse autoencoder (SSAE) model for the segmentation of vertebrae from CT images. After a preprocessing step, we extracted overlapping patches from the vertebral CT images as the inputs to the proposed model. The SSAE model was trained in an unsupervised way to learn high-level features from the pixels of the unlabeled image patches. To improve the discriminability of the learned features, we further refined the feature representation in a supervised fashion and fine-tuned the whole model with a feedforward neural network that classifies the overlapping patches. We then validated our model on the publicly available MICCAI CSI2014 dataset and found that it outperforms other state-of-the-art methods.
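A minimal sketch of greedy layer-wise SSAE pretraining followed by supervised fine-tuning is shown below, assuming TensorFlow/Keras; the patch size, layer widths, L1 sparsity penalty, and two-class (vertebra vs. background) head are illustrative assumptions rather than the paper's exact configuration.

```python
# Minimal sketch of a stacked sparse autoencoder (SSAE) patch classifier.
# Patch size, layer widths, and the L1 activity penalty are illustrative;
# the paper's exact architecture and sparsity term are not reproduced here.
import numpy as np
from tensorflow.keras import layers, models, regularizers

PATCH = 16 * 16          # flattened grey-level patch
H1, H2 = 400, 100        # hidden widths of the two stacked encoders

def extract_patches(image, size=16, stride=4):
    """Overlapping patches from a 2-D slice, flattened and scaled to [0, 1]
    (assumes grey levels already mapped to 8 bits)."""
    ps = [image[r:r + size, c:c + size].ravel()
          for r in range(0, image.shape[0] - size + 1, stride)
          for c in range(0, image.shape[1] - size + 1, stride)]
    return np.asarray(ps, dtype=np.float32) / 255.0

def sparse_autoencoder(input_dim, hidden_dim):
    """One autoencoder layer with an L1 activity penalty to encourage sparsity."""
    inp = layers.Input((input_dim,))
    code = layers.Dense(hidden_dim, activation="sigmoid",
                        activity_regularizer=regularizers.l1(1e-4))(inp)
    out = layers.Dense(input_dim, activation="sigmoid")(code)
    ae = models.Model(inp, out)
    ae.compile(optimizer="adam", loss="mse")
    return ae, models.Model(inp, code)

def train_ssae(patches, labels, epochs=20):
    """Greedy layer-wise unsupervised pretraining, then supervised fine-tuning
    of the stacked encoders plus a softmax patch classifier."""
    ae1, enc1 = sparse_autoencoder(PATCH, H1)
    ae1.fit(patches, patches, epochs=epochs, batch_size=256, verbose=0)
    codes1 = enc1.predict(patches, verbose=0)

    ae2, enc2 = sparse_autoencoder(H1, H2)
    ae2.fit(codes1, codes1, epochs=epochs, batch_size=256, verbose=0)

    # Stack the two pretrained encoders, add a vertebra/background head,
    # and fine-tune the whole model end to end on the labeled patches.
    clf = models.Sequential([enc1, enc2, layers.Dense(2, activation="softmax")])
    clf.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                metrics=["accuracy"])
    clf.fit(patches, labels, epochs=epochs, batch_size=256, verbose=0)
    return clf
```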