Single-pixel imaging (SI) methods have been widely studied in recent years. However, most existing research has presented SI as an attractive prospect only when multipixel sensors are not preferable due to cost or technological constraints. Parallel single-pixel imaging (PSI) was introduced recently to show that SI techniques are also attractive when multipixel sensors are readily available, and it offers new possibilities for traditionally challenging problems. By treating each pixel of the camera as an independent SI unit, PSI captures light transport coefficients (LTCs) in a highly efficient manner and separates direct illumination from global illumination, thereby enabling 3D reconstruction under global illumination. The localization stage, in which approximate information about the visible region is obtained, is of great importance for PSI. In this paper, we analyze the robustness of PSI with respect to the localization stage. This robustness property states that the accuracy of 3D data reconstructed by PSI is insensitive to errors incurred in the localization stage. First, we establish this property theoretically. Then, we conduct two experiments to test it. Finally, we show that satisfactory 3D reconstruction results can be obtained when only partial frequencies are captured in the localization stage, enabling a more efficient data capture procedure.
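As a rough illustration of the partial-frequency idea (not the authors' implementation), the sketch below assumes a Fourier-basis SI scheme, where each single-pixel measurement corresponds to one 2D Fourier coefficient of the scene; keeping only a small low-frequency subset still localizes the visible region approximately.

```python
import numpy as np

# Hypothetical illustration: localization from partial Fourier coefficients.
# A single-pixel measurement with a Fourier basis pattern equals one 2D FFT
# coefficient of the scene, so the capture is emulated here with np.fft.fft2.
scene = np.zeros((64, 64))
scene[20:40, 25:45] = 1.0           # visible region to localize

coeffs = np.fft.fft2(scene)         # the full set of "measurements"

# Keep only a small low-frequency block (partial capture).
k = 8
mask = np.zeros_like(coeffs)
mask[:k, :k] = mask[:k, -k:] = mask[-k:, :k] = mask[-k:, -k:] = 1
coarse = np.real(np.fft.ifft2(coeffs * mask))

# The coarse image still localizes the bright region approximately.
est = np.unravel_index(np.argmax(coarse), coarse.shape)
print(est)  # a point inside the true region
```

Only 4k² of the 64² coefficients are used, yet the peak of the coarse reconstruction falls inside the true region, which is the intuition behind capturing partial frequencies in the localization stage.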
Fringe projection profilometry is an efficient, fast, and non-contact 3D measurement technique widely used in the measurement of industrial parts. However, invalid phases are common when measuring step edges such as holes, ribs, and steps in industrial parts, which leads to outliers in the reconstructed point cloud and ultimately causes a reduction in valid points and dimensional measurement errors. In this paper, an error compensation method for the 3D measurement of step edges is proposed. The 3D measurement space is first divided into multiple subspaces based on the binocular camera system parameters. A calibration method is then proposed to calculate the amount of compensation for each subspace and to obtain the parameters of an edge error model. After calibration, the point cloud of an object with a step edge is reconstructed by projecting phase-shifting structured-light fringe patterns. Normal estimation of the point cloud is implemented using Principal Component Analysis (PCA), and the edge feature is extracted by combining the eigenvalue variation of the covariance matrix with first- and second-order fitting based on two-dimensional projection. At the same time, the corresponding deviation of each edge point is solved on the basis of the aforementioned edge error model. Finally, the accurate step edge is obtained after error compensation along the normal direction by the computed deviation. Experimental results show that the proposed method can effectively improve the 3D measurement accuracy of step edges.
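The PCA step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: for each point, the covariance of its k nearest neighbors is formed, the smallest-eigenvalue direction serves as the normal estimate, and the relative size of the smallest eigenvalue ("surface variation") acts as an edge indicator.

```python
import numpy as np

# Hedged sketch of PCA-based normal estimation and edge detection.
# k-NN search is brute force here; a k-d tree would be used in practice.
def pca_features(points, k=10):
    normals, variation = [], []
    for p in points:
        d = np.linalg.norm(points - p, axis=1)
        nbrs = points[np.argsort(d)[:k]]
        cov = np.cov(nbrs.T)
        w, v = np.linalg.eigh(cov)          # eigenvalues in ascending order
        normals.append(v[:, 0])             # normal = smallest-eigenvalue axis
        variation.append(w[0] / w.sum())    # near 0 on planes, larger at edges
    return np.array(normals), np.array(variation)

# Synthetic test data: two perpendicular planes meeting in a step edge.
g = np.linspace(0, 1, 15)
xy = np.array([(x, y) for x in g for y in g])
plane1 = np.c_[xy[:, 0], xy[:, 1], np.zeros(len(xy))]   # plane z = 0
plane2 = np.c_[np.zeros(len(xy)), xy[:, 1], xy[:, 0]]   # plane x = 0
pts = np.vstack([plane1, plane2])

normals, var = pca_features(pts)
flat_idx = 7 * 15 + 7   # interior point of plane1
edge_idx = 7            # point on the crease x = 0, z = 0
```

Thresholding the variation values separates crease points from flat interior points; the paper additionally refines these candidates with first- and second-order fitting in a 2D projection.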
Fringe projection profilometry is widely used in manufacturing, and accuracy analysis is key to promoting this technology in engineering applications. Prior research has analyzed influencing factors, including the gamma effect, intensity noise, and defocus, and has proposed methods to improve measurement accuracy. However, an analytical study is difficult to perform, and the surface shape of the measured object influences the fringe images, which must also be considered. In this paper, a ray-tracing algorithm and a back-propagation network are used to study the relationship between surface shape and measurement accuracy. The fringe projection profilometry system is simulated on a computer using the ray-tracing algorithm, and the light transport coefficients are measured to improve the accuracy of the camera defocus simulation. The impact of surface shape on the fringe images is analyzed, and the projection and observation angles are used as the input of the network. Since the ground-truth surface is known in the simulation model, the coordinate error can be obtained after the simulated measurement and used as the output of the network. Experiments show that a high correlation exists between surface shape and coordinate error.
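The learning step can be sketched as a small back-propagation network mapping angle pairs to a coordinate error. Everything below is illustrative: the target function is a synthetic stand-in, whereas the real targets would come from the ray-traced simulation described above.

```python
import numpy as np

# Hypothetical sketch: one-hidden-layer back-propagation network mapping
# (projection angle, observation angle) -> coordinate error.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (500, 2))                 # normalized angle pairs
y = 0.3 * np.sin(2 * X[:, :1]) + 0.1 * X[:, 1:]  # synthetic error surface

W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)                     # forward pass
    pred = h @ W2 + b2
    g = 2 * (pred - y) / len(X)                  # gradient of mean squared error
    W2 -= lr * h.T @ g; b2 -= lr * g.sum(0)      # backprop to output layer
    gh = (g @ W2.T) * (1 - h**2)                 # backprop through tanh
    W1 -= lr * X.T @ gh; b1 -= lr * gh.sum(0)    # backprop to hidden layer

mse = np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)
```

After training, the network's fit error is far below the variance of the targets, which is the sense in which such a network can capture the shape-dependent error relationship.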
Blur in the optical system causes inevitable degradation of acquired images. In this paper we present a novel method to measure the spatially varying blur of a camera lens. We obtain the Discrete Cosine Transform (DCT) coefficients of the blur kernels by applying DCT single-pixel imaging to all camera pixels. The spatially varying blur kernels are then reconstructed by applying the inverse DCT to the acquired coefficients. Experimental results show that the proposed method acquires a more accurate blur kernel than the traditional Gaussian kernel assumption.
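The reconstruction step can be illustrated numerically. This is a sketch, not the authors' optical setup: the 2D DCT coefficients of a known kernel stand in for the per-pixel measurements, and an inverse DCT recovers the kernel exactly.

```python
import numpy as np

# Illustrative sketch: recover a blur kernel from its DCT coefficients
# via the inverse DCT, emulating the per-pixel measurements described above.
def dct_matrix(n):
    # Orthonormal DCT-II basis as an n x n matrix.
    k, i = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    C = np.sqrt(2 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    C[0] /= np.sqrt(2)
    return C

n = 15
C = dct_matrix(n)

# Ground-truth kernel: an anisotropic Gaussian stand-in for a real lens
# kernel (which, per the paper, need not be Gaussian at all).
yy, xx = np.mgrid[-7:8, -7:8]
kernel = np.exp(-(xx**2 / 8 + yy**2 / 3))
kernel /= kernel.sum()

coeffs = C @ kernel @ C.T          # "measured" 2D DCT coefficients
recon = C.T @ coeffs @ C           # inverse 2D DCT reconstruction

err = np.max(np.abs(recon - kernel))   # zero up to floating-point error
```

Because the DCT basis is orthonormal, the forward and inverse transforms are exact inverses, so the kernel is recovered up to floating-point precision once its coefficients are measured.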