KEYWORDS: 3D modeling, 3D image reconstruction, Cameras, 3D image processing, Integral imaging, Data modeling, Clouds, Imaging systems, Image quality, 3D displays
An integral imaging system using a polygon model of a real object is proposed. After the depth and color data of the real object are acquired by a depth camera, the initially reconstructed point cloud model is converted into a gridded polygon model. The elemental image array is generated from the polygon model and directly reconstructed. Because the polygon model fills the gaps between the points of a point cloud model, the quality of the reconstructed 3-D image is significantly improved. The theory is verified experimentally, and higher-quality images are obtained.
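Since a depth camera samples the scene on a regular pixel grid, the conversion from point cloud to polygon model described above can be sketched by splitting each grid cell into two triangles. This is a minimal illustration under that regular-grid assumption, not the paper's actual meshing procedure; the function name and layout are hypothetical.

```python
import numpy as np

def depth_grid_to_triangles(depth):
    """Convert a regular depth-camera grid into a triangle mesh by
    splitting each grid cell into two triangles, filling the empty
    areas a raw point cloud leaves between samples."""
    H, W = depth.shape
    idx = np.arange(H * W).reshape(H, W)  # vertex index per grid sample
    tris = []
    for i in range(H - 1):
        for j in range(W - 1):
            a, b = idx[i, j], idx[i, j + 1]
            c, d = idx[i + 1, j], idx[i + 1, j + 1]
            tris.append((a, b, c))  # upper-left triangle of the cell
            tris.append((b, d, c))  # lower-right triangle of the cell
    return np.array(tris)
```

In practice invalid depth samples (holes, occlusions) would be masked out before triangulation, but the grid connectivity itself is this simple.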
KEYWORDS: Clouds, 3D image processing, Cameras, 3D displays, Adaptive optics, 3D modeling, Image quality, Digital micromirror devices, Image resolution, Mirrors
A novel 360-degree integral-floating display based on a real object is proposed. The overall procedure of the system is similar to that of conventional 360-degree integral-floating displays; unlike previously presented 360-degree displays, however, the proposed system displays a 3D image generated from a real object across the 360-degree viewing zone. To display a real object over the full 360 degrees, multiple depth cameras are used to acquire depth information all around the object. The 3D point cloud representations of the real object are then reconstructed from the acquired depth information. Using a dedicated point cloud registration method, the multiple virtual 3D point clouds captured by the individual depth cameras are combined into a single synthetic 3D point cloud model, and the elemental image arrays are generated for the newly synthesized model at the angular step of the anamorphic optic system. The theory has been verified experimentally, showing that the proposed 360-degree integral-floating display is an effective way to present a real object in the 360-degree viewing zone.
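Once the registration step has estimated each depth camera's pose, combining the per-camera point clouds into the single synthetic model reduces to transforming every cloud into a common world frame and concatenating. A minimal sketch, assuming the camera-to-world extrinsics are already known (the registration method itself is not reproduced here):

```python
import numpy as np

def merge_point_clouds(clouds, extrinsics):
    """Merge per-camera point clouds into one synthetic cloud.

    clouds     : list of (N_i, 3) arrays, each in its camera's frame
    extrinsics : list of 4x4 camera-to-world transforms (assumed to come
                 from a prior point cloud registration step)
    """
    merged = []
    for pts, T in zip(clouds, extrinsics):
        # Lift to homogeneous coordinates and map into the world frame
        homo = np.hstack([pts, np.ones((pts.shape[0], 1))])
        merged.append((homo @ T.T)[:, :3])
    return np.vstack(merged)
```

A real pipeline would also de-duplicate overlapping points and blend colors where camera views overlap.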
KEYWORDS: Integral imaging, Displays, LCDs, Cameras, Parallel processing, Parallel computing, Image processing, 3D image processing, 3D image reconstruction, 3D displays
A depth camera is used to capture the depth and color data of real-world objects. As integral imaging display systems are broadly used, the elemental image array for the captured data needs to be generated and displayed on a liquid crystal display. We propose a real-time integral imaging display system that uses image processing to simplify the optical arrangement and graphics processing unit (GPU) parallel processing to reduce computation time. The proposed system generates elemental images at a rate of more than 30 fps with a resolution of 1204×1204 pixels, where each display panel pixel is 0.1245 mm, using an array of 30×30 lenses, each 5×5 mm.
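The core of elemental image array generation is projecting each captured 3D point through every lens center (treated as a pinhole) onto the panel region behind that lens. The following is a serial sketch of that geometry using the panel and lens dimensions quoted above; the lens-to-panel gap and the centering conventions are assumptions, and the paper's GPU version would parallelize the per-lens/per-point loops.

```python
import numpy as np

# Display parameters loosely following the quoted setup; GAP is hypothetical.
LENS_PITCH = 5.0       # mm, 5 x 5 mm lenses
PIXEL_PITCH = 0.1245   # mm, display panel pixel size
GAP = 5.0              # mm, assumed lens-to-panel gap
N_LENS = 30            # 30 x 30 lens array
EI_RES = int(LENS_PITCH / PIXEL_PITCH)  # pixels per elemental image

def generate_eia(points, colors):
    """Project colored 3D points through each pinhole (lens center)
    onto the display plane to form the elemental image array."""
    eia = np.zeros((N_LENS * EI_RES, N_LENS * EI_RES, 3))
    for i in range(N_LENS):
        for j in range(N_LENS):
            # Lens center (mm), array centered on the optical axis
            cx = (j - N_LENS / 2 + 0.5) * LENS_PITCH
            cy = (i - N_LENS / 2 + 0.5) * LENS_PITCH
            for (x, y, z), c in zip(points, colors):
                if z <= 0:
                    continue
                # Pinhole projection onto the panel behind this lens
                u = cx - GAP * (x - cx) / z
                v = cy - GAP * (y - cy) / z
                px = int((u - cx) / PIXEL_PITCH + EI_RES / 2)
                py = int((v - cy) / PIXEL_PITCH + EI_RES / 2)
                if 0 <= px < EI_RES and 0 <= py < EI_RES:
                    eia[i * EI_RES + py, j * EI_RES + px] = c
    return eia
```

Because every (lens, point) pair is independent, this maps naturally onto one GPU thread per pair, which is what makes the 30 fps rate plausible.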
KEYWORDS: 3D image processing, 3D displays, Mirrors, Integral imaging, Projection systems, Fresnel lenses, Digital micromirror devices, 3D vision, Diffusers, Image resolution
We propose a full-parallax integral imaging display with a 360-degree horizontal viewing angle. Two-dimensional (2D) elemental images are projected by a high-speed DMD projector and integrated into a three-dimensional (3D) image by a lens array. An anamorphic optic system tailors the horizontal and vertical viewing angles of the integrated 3D images to obtain a high angular ray density in the horizontal direction and a large viewing angle in the vertical direction. Finally, a mirror screen that rotates in synchronization with the DMD projector presents the integrated 3D images in the desired direction. The proposed method achieves full-parallax, 360-degree horizontal-viewing-angle 3D images with both monocular and binocular depth cues.
We propose a new method for synthesizing holograms of 3D objects using multiple orthographic view images captured through a lens array. The 3D objects are captured through the lens array under normal incoherent illumination, and their multiple orthographic view images are generated from the captured image. Each orthographic view image is numerically multiplied by the plane wave propagating in the direction of the corresponding projection angle and integrated into a single complex value, which constitutes one pixel of the synthesized hologram. By repeating this process for all orthographic view images, we can generate the Fourier hologram of the 3D objects. Since the proposed method generates the hologram from the multiple view images rather than from interference with a reference beam, no coherent system is required. Manipulation of the 3D information of the objects is also easily achieved: by shifting the coordinate information of each orthographic view image according to its corresponding view angle, the depth order of the reconstructed 3D object can be controlled.
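The per-view step described above, multiply by the plane wave of the projection angle and integrate to a single complex pixel, can be sketched directly. The wavelength and sample pitch below are illustrative assumptions, not values from the paper:

```python
import numpy as np

WAVELENGTH = 532e-9  # m, assumed reconstruction wavelength
PITCH = 10e-6        # m, assumed sample pitch of the view images

def fourier_hologram(views, angles):
    """Synthesize a Fourier hologram pixel per orthographic view.

    views  : (K, H, W) array of real-valued orthographic view images
    angles : (K, 2) array of (theta_x, theta_y) projection angles, radians
    Returns a length-K complex vector; element k is the hologram pixel
    corresponding to view k.
    """
    K, H, W = views.shape
    y, x = np.mgrid[0:H, 0:W] * PITCH
    holo = np.empty(K, dtype=complex)
    for k in range(K):
        tx, ty = angles[k]
        # Plane wave propagating at the view's projection direction
        carrier = np.exp(
            1j * 2 * np.pi * (np.sin(tx) * x + np.sin(ty) * y) / WAVELENGTH
        )
        # Integrate view x carrier into one complex hologram pixel
        holo[k] = np.sum(views[k] * carrier)
    return holo
```

For a view at normal incidence the carrier is unity, so that hologram pixel is simply the sum of the view image, which matches the intuition that each pixel of a Fourier hologram encodes one spatial frequency (one view direction) of the scene.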