With the rise of social media, synthesized or composited photos are becoming increasingly widespread, and image relighting is a key technology for creating convincingly realistic composites. Our study proposes a framework for relighting a portrait subject when superimposing it onto a 360-deg image. In most image compositions, it is difficult to acquire the 3-D shapes of subjects directly in order to rerender them in a virtual environment. Instead, a well-diffused color portrait image with a corresponding normal map is generated in our photo booth using a photometric method. In addition, a virtual environment based on a principled bidirectional scattering distribution function (BSDF) shader and an environmental 360-deg texture in the Blender software is used to create the composite images. To cover different situations, including gender, posture, indoor or outdoor scenes, and color or color-free subjects, 128 composite images were each played as a 4-s video clip, and various scenarios were conducted for subjective assessment. According to the evaluation scores of the 30 participants, overall satisfaction with the image composition based on the proposed framework was above average (5-point Likert scale > 3 points), and the color-free subject in the 360-deg image was significantly preferred.
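The photometric step above can be illustrated by a standard Lambertian photometric-stereo solve; the sketch below is only a minimal example under assumed calibrated light directions and grayscale captures, and the function name and synthetic data are hypothetical rather than the photo booth's actual pipeline.

```python
# Minimal sketch of Lambertian photometric stereo, one way to produce a
# per-pixel normal map from multiple images under known light directions.
# Light directions, shapes, and the synthetic example are assumptions.
import numpy as np

def photometric_normals(images, light_dirs):
    """Estimate per-pixel unit normals and albedo.

    images     : (K, H, W) grayscale intensities, one image per light
    light_dirs : (K, 3) unit light direction vectors
    Returns an (H, W, 3) normal map and an (H, W) albedo map.
    """
    K, H, W = images.shape
    I = images.reshape(K, -1)                  # (K, H*W)
    L = np.asarray(light_dirs, dtype=float)    # (K, 3)

    # Lambertian model: I = L @ (albedo * n); solve least squares per pixel.
    G, *_ = np.linalg.lstsq(L, I, rcond=None)  # (3, H*W)
    albedo = np.linalg.norm(G, axis=0)
    normals = G / np.maximum(albedo, 1e-8)     # unit normals

    return normals.T.reshape(H, W, 3), albedo.reshape(H, W)

# Toy usage with synthetic data (4 lights, 64x64 image, flat surface):
L = np.array([[0, 0, 1], [1, 0, 1], [0, 1, 1], [-1, -1, 1]], dtype=float)
L /= np.linalg.norm(L, axis=1, keepdims=True)
true_n = np.array([0.0, 0.0, 1.0])
imgs = np.clip(L @ true_n, 0, None)[:, None, None] * np.ones((4, 64, 64))
n_map, rho = photometric_normals(imgs, L)
```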
Structured-light systems consisting of a camera and a projector are powerful and cost-effective tools for three-dimensional (3-D) shape measurement. However, most commercial projectors cannot generate distinct patterns because of defocusing and shallow-focus issues. We propose a hybrid method for enhancing the calibration and scanning capabilities of a defocusing structured-light 3-D scanning system. Rather than using a purely binary sequential pattern set, we replace the highest-level binary pattern with a high-order sinusoidal pattern. In our proposed system, a pan-tilt stage carrying a checkerboard is used to assist the simultaneous calibration of the camera and projector. The camera is first calibrated to obtain the extrinsic positions of the stage. In addition, we use the multiplication of vertical and horizontal stripe patterns to enhance the feature correspondences between the camera and projector. The projector is then calibrated using the extrinsic features determined from the calibrated camera. The experimental results show that the use of the high-order sinusoidal pattern significantly reduces the reprojection error. Our proposed method can easily be incorporated into a defocusing projector for scanning various types of objects.
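A common way to decode a high-order sinusoidal pattern is N-step phase shifting followed by unwrapping with the coarser binary (Gray-code) levels; the sketch below illustrates that general idea under an assumed four-step shift and hypothetical array names, and is not the paper's exact decoding implementation.

```python
# Minimal sketch: wrapped phase from N phase-shifted sinusoidal images and
# absolute phase via a fringe order decoded from coarser binary patterns.
import numpy as np

def wrapped_phase(images):
    """Wrapped phase from N shifted sinusoids I_n = A + B*cos(phi + 2*pi*n/N)."""
    N = len(images)
    n = np.arange(N).reshape(-1, 1, 1)
    delta = 2.0 * np.pi * n / N
    num = -np.sum(images * np.sin(delta), axis=0)
    den = np.sum(images * np.cos(delta), axis=0)
    return np.arctan2(num, den)                # wrapped into (-pi, pi]

def unwrap_with_graycode(phi, fringe_order):
    """Absolute phase: add 2*pi times the fringe order decoded from the
    remaining (coarser) binary/Gray-code patterns."""
    return phi + 2.0 * np.pi * fringe_order

# Synthetic example: four shifted images of a single fringe period.
H, W = 4, 8
x = np.linspace(0, 2 * np.pi, W, endpoint=False)
phi_true = np.tile(x, (H, 1))
imgs = np.stack([0.5 + 0.4 * np.cos(phi_true + 2 * np.pi * k / 4) for k in range(4)])
phi = wrapped_phase(imgs)                      # recovers phi_true up to wrapping
absolute = unwrap_with_graycode(phi, np.zeros_like(phi))  # single period -> order 0
```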
KEYWORDS: 3D modeling, 3D displays, 3D image processing, Visualization, Stereoscopic cameras, Cameras, Algorithm development, Image fusion, Eye, Visual process modeling
This paper addresses how to define and control perceived depth when rendering a single stereoscopic three-dimensional (3-D) model. In most 3-D manipulation software, view navigation is the most important control for visualizing 3-D scenes with good quality; in a stereoscopic 3-D setting, however, this task becomes more complex. We used two factors, parallax range and average parallax, to quantify the 3-D effect of rendering a 3-D model. In an experiment using subjective questionnaires, the fusional limit and depth perception of 22 subjects were regressed as paraboloid functions of parallax range and average parallax. The comfort region defined by these parameters was then used to develop an auto-adjustment algorithm for stereoscopic view navigation. This algorithm iteratively adjusts the parameters of a virtual stereo camera and simultaneously constrains the parallax range and average parallax within the comfort region. Finally, using questionnaires and critical fusion frequency tests, we verified that this algorithm can significantly improve a user's comfort index during customary operations.
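The two statistics, parallax range and average parallax, and an iterative camera-adjustment loop of the kind described can be sketched as follows; the comfort bounds, step sizes, and toy disparity model below are placeholders for illustration and do not reproduce the paper's regressed paraboloid comfort region or its actual algorithm.

```python
# Minimal sketch of parallax statistics and an iterative stereo-camera
# adjustment that keeps them inside an assumed comfort region.
import numpy as np

def parallax_stats(disparity):
    """Parallax range and average parallax of a screen-space disparity map (pixels)."""
    return disparity.max() - disparity.min(), disparity.mean()

def adjust_stereo_camera(render_disparity, separation, convergence,
                         max_range=40.0, avg_bounds=(-10.0, 10.0),
                         step=0.9, max_iters=50):
    """Iteratively shrink the baseline and shift convergence until both
    parallax range and average parallax lie in the comfort region."""
    for _ in range(max_iters):
        d = render_disparity(separation, convergence)
        p_range, p_avg = parallax_stats(d)
        if p_range <= max_range and avg_bounds[0] <= p_avg <= avg_bounds[1]:
            break
        if p_range > max_range:
            separation *= step                 # smaller baseline -> smaller range
        if p_avg < avg_bounds[0] or p_avg > avg_bounds[1]:
            convergence += 0.1 * p_avg         # pull the mean parallax toward zero
    return separation, convergence

# Toy usage: disparity proportional to separation, offset by convergence.
depths = np.random.default_rng(1).uniform(1.0, 5.0, size=(10, 10))
render = lambda sep, conv: 100.0 * sep / depths - conv
print(adjust_stereo_camera(render, separation=1.0, convergence=0.0))
```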