Many industrial processes call for determining not only the front but also the rear surface shape, i.e., the material thickness. This is crucial for identifying weak points or potential material savings, for instance, in ampoules. Existing methods for the simultaneous measurement of surface shape and material thickness (e.g., computed tomography) are complex, expensive, slow, and cannot be integrated into production lines. Container glass manufacturers, for example, are therefore actively seeking an alternative solution.
We aim to provide such a solution by enhancing our current process. Instead of a CO2 laser line at λ = 10.6 μm, which is absorbed at the object’s surface and does not penetrate the material, we use a wavelength in the short-wave infrared (SWIR). At this shorter wavelength, the laser radiation travels through commercially available glasses. At the rear surface, the radiation is partly reflected and reaches the front surface again. Along its path, the radiation is absorbed and leaves a heat trace behind. Whereas common glasses are translucent in the SWIR, they are generally opaque in the LWIR range. Consequently, while some SWIR radiation penetrates the object, LWIR cameras detect heat only at its front surface: (1) at the entering laser line and (2) at the position of the exiting line. Our goal is to use these two thermal signal positions to determine both the front and rear 3D surface shape, and thus the material thickness. In this paper, we investigate our approach theoretically using a simulation model. The model is used to generate thermal points on static measurement objects and to determine appropriate parameters such as laser power, angle of incidence, and irradiation time. Furthermore, we analyze the temporal and spatial behavior of the thermal points, taking the material parameters into account. Based on the simulated results, we subsequently demonstrate an initial experimental setup in which the two thermal signals are evaluated on a glass plate at different angles of incidence to determine the material thickness.
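For an idealized plane-parallel plate, the relation between the two thermal spot positions and the material thickness follows directly from Snell's law: the beam refracts at the front surface, reflects once at the rear surface, and exits laterally displaced by twice the thickness times the tangent of the internal refraction angle. The sketch below illustrates this geometry; the function name and the refractive index value are illustrative assumptions, not part of the setup described above.

```python
import math

def plate_thickness(delta_x, alpha_deg, n):
    """Thickness of an ideal plane-parallel plate from the lateral
    separation delta_x between the entry and exit thermal spots on
    the front surface.

    The beam refracts at the front surface (Snell's law), reflects
    once at the rear surface, and exits at a lateral distance of
    2 * t * tan(beta) from the entry point, where beta is the
    refraction angle inside the glass.
    """
    alpha = math.radians(alpha_deg)            # angle of incidence
    beta = math.asin(math.sin(alpha) / n)      # internal refraction angle
    return delta_x / (2.0 * math.tan(beta))

# Example: spots 4 mm apart, 45 degree incidence, assumed n = 1.5
t_mm = plate_thickness(4.0, 45.0, 1.5)
```

In a real measurement the rear surface is generally not parallel to the front surface, so this closed-form relation only serves as a first-order sanity check for the simulated thermal point positions.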
In fully automated optical inspection of circuit boards at an assembly line, the relative speed of movement between the measurement object and the 3D sensor system must be known to the motion-compensation algorithms. Optimally, this relative speed is constant over the whole measurement process and consists of only one motion direction, to avoid sensor vibrations. The quantitative evaluation of these two assumptions and the impact of their violation on the 3D accuracy are the subject of the research project described in this paper.
For our experiments we use a glass etalon with non-transparent circles and transmitted light. Focusing on the circle borders with a set of search rays is one of the most reliable methods to determine subpixel positions. The intersection point of all rays characterizes the center of each circle. Based on these circle centers, determined with a precision of approximately 1/50 pixel, the motion vector between two images can be calculated and compared with the input motion vector. Overall, the results are used to optimize the weight distribution of the 3D sensor head and to reduce non-uniform vibrations. The resulting dynamic 3D measurement system determines motion vectors with an error of about 4 µm. Based on this outcome, simulations yield a 3D standard deviation of 6 µm at planar object regions; without the optimized weight distribution, the same system yields a 3D standard deviation of 9 µm.
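The ray-intersection step can be formulated as a least-squares problem: the circle center is the point minimizing the sum of squared perpendicular distances to all search rays. A minimal sketch in NumPy, assuming each ray is given by a point and a direction (the function name and data layout are illustrative, not taken from the paper):

```python
import numpy as np

def ray_intersection(points, dirs):
    """Least-squares intersection point of 2D search rays.

    Each ray is given by a point p_i and a direction d_i; the returned
    point minimizes the sum of squared perpendicular distances to all
    rays, which serves as the circle center estimate.
    """
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, d in zip(points, dirs):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)
        P = np.eye(2) - np.outer(d, d)   # projector onto the ray's normal
        A += P
        b += P @ np.asarray(p, dtype=float)
    return np.linalg.solve(A, b)
```

The motion vector between two images then follows as the difference of corresponding circle centers, which is the quantity compared against the input motion vector above.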
In this contribution, we present new 3D sensor technologies based on three different near-infrared projection techniques in combination with a stereo setup of two cameras. We explain the optical principles of an NIR GOBO projector, an array projector, and a modified multi-aperture projection method, and compare their performance parameters. Furthermore, we show experimental measurement results of applications in which we realized fast, accurate, and irritation-free measurements of human faces.
For these concepts, non-contact measurement with image-processing sensors offers significant benefits for data acquisition: high speed and a high degree of flexibility. New, effective measurement strategies can be developed for quality control in the machining area. These include classical geometric 2D measurement applications as well as 3D measurement tasks such as determining roughness, and other typical image-processing applications.
This paper presents the challenges of integrating an optical sensor system into the machining area of milling centers. First, a suitable location in the machining area must be found and an associated strategy developed. The integrated optical image sensor system must be protected against contamination without degrading its functionality. For full integration into a quality control loop, the results must be fed back into the machine control; thus, a further interface between the measurement program and the machine control is necessary.
Another major field of research concerns the optical components. In particular, the illumination, image sensor, and lens are selected and adapted to the measurement tasks in light of the basic requirements mentioned above.
The presented research provides a suitable solution for making CNC manufacturing more efficient. Quality control of the workpiece can be executed within the CNC process, and potential post-processing can be performed simultaneously.
We propose a novel approach to phase unwrapping that requires no additional pattern projection. Based on a stereo camera setup, each view is segmented into areas without height jumps larger than one fringe period. Within these segments, phase unwrapping is potentially error-free. The phase maps of the two views are aligned by identifying a single correspondence point.
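The segment-wise idea can be illustrated in one dimension: each segment is unwrapped independently, leaving only a per-segment 2π-multiple offset to be fixed by the correspondence point. A minimal 1D sketch (function name and data layout are hypothetical; the actual method operates on 2D phase maps of a stereo setup):

```python
import numpy as np

def unwrap_segments(wrapped, labels):
    """Unwrap a wrapped 1D phase signal independently inside each segment.

    `wrapped` holds phase values in (-pi, pi]; `labels` assigns a segment
    id to every sample. Segments are assumed to contain no height jump
    larger than one fringe period, so unwrapping inside a segment is
    error-free. A remaining 2*pi*k offset per segment would be resolved
    via a correspondence point between the stereo views.
    """
    out = np.empty_like(wrapped)
    for lab in np.unique(labels):
        mask = labels == lab              # samples of one contiguous segment
        out[mask] = np.unwrap(wrapped[mask])
    return out
```

Because the unwrapping never crosses a segment border, height discontinuities between segments cannot propagate errors into the interior of a segment; only the global offsets remain ambiguous.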