Open Access Paper
17 September 2019
System calibration and characterization of an ultra-compact multispectral snapshot imaging system
M. Hubold, R. Berlich, R. Brüning, R. Brunner
Proceedings Volume 11144, Photonics and Education in Measurement Science 2019; 111440V (2019) https://doi.org/10.1117/12.2532317
Event: Joint TC1 - TC2 International Symposium on Photonics and Education in Measurement Science 2019, 2019, Jena, Germany
Abstract
In this work, we present a calibration procedure for a multispectral snapshot camera, which is based on a multi-aperture system approach combined with a slanted linear variable spectral filter. The ultra-compact multispectral imaging system exploits state-of-the-art micro-optical manufacturing techniques on wafer level, which leads to a size of only 60 × 60 × 28 mm³. The setup enables the single-shot acquisition of 66 spectral channels with a linear spectral sampling over an extended wavelength range of 450–850 nm and a spatial sampling of 400 × 400 pixels per channel at a large field of view of 68°. In particular, we propose a spectral and spatial calibration procedure in order to extract hyperspectral data cubes from the acquired raw image and, further, to analyze characteristic system parameters. Finally, we demonstrate the system's capabilities for advanced object classification using characteristic spectral indices by means of a customized multispectral analysis.

1. INTRODUCTION

Multispectral camera systems are widely studied and applied to a large range of applications such as remote sensing for ecology and geoscience [1], biomedical imaging [2], and surveillance [3]. The main benefit of multispectral imaging systems is the acquisition of spatially resolved spectral information of the measurement scene. This leads to the detection of three-dimensional datasets (data cubes) consisting of two spatial and one spectral dimension. The spectral detection typically includes wavelength bands in the ultraviolet, visible, and/or infrared domain. Commonly, classical spectrometer concepts with dispersive optical elements such as prisms and gratings are applied in combination with complex scanning mechanisms like whiskbroom and pushbroom scanners or staring systems [4]. In all scanning systems, however, either the object or the imaging system has to be moved, which requires complex and bulky setups and increases the acquisition time. The latter limits the field of application to static or only slowly varying measurement scenes in order to avoid motion artifacts. Therefore, non-scanning or snapshot imaging systems, which provide the ability to acquire the entire data cube simultaneously, are of particular interest. The lack of scanning artifacts and a remarkable compactness [5] represent the major advantages of such systems.

An ultra-compact realization of such a snapshot system was achieved using a micro-optical, multi-aperture system approach combined with a slanted linear variable spectral filter [6]. Here, we detail the required calibration procedure, which is customized to the multi-aperture imaging approach and the filter-based wavelength separation. Finally, we demonstrate the capabilities of our snapshot multispectral camera. An advanced object classification is performed exemplarily for a captured lab scene.

2. WORKING PRINCIPLE AND OPTICAL DESIGN

Our proposed snapshot multispectral imaging system comprises a stack of three micro-structured optical layers integrated close to an image sensor, as shown in Fig. 1. A customized microlens array (MLA) constitutes the main layer, which enables the parallel imaging of the object scene using a single image sensor. Several aperture layers are included at the front and back side of the MLA to minimize stray light in our multiplexed micro-objective approach. In order to avoid crosstalk between neighboring imaging channels, a baffle array, which is a customized three-dimensional aperture structure, is integrated as the second layer. At the front side, this baffle structure is characterized by an array of circular cross-sections that match the size of the individual MLA apertures. The circular shape merges into a square cross-section at the back side in order to maximize the field of view of the individual optical channels and efficiently utilize the available sensor area.

Figure 1.

(a) Cross-section of the optical concept for the ultra-compact multispectral camera system with a linear variable filter (LVF), a microlens array (MLA), and a customized baffle array. (b) Image of the final multispectral camera demonstrator.


In this basic configuration of our multi-aperture imaging approach, all parallel channels image the same object information without spectral discrimination. A third layer is therefore added in order to tailor each channel to a dedicated wavelength range. A linear variable band-pass filter (LVF, type LF103245 [7]) from Delta Optical Thin Film A/S is used, which is placed close to the aperture surface of our MLA. Due to the spatial variation of the pass band of the LVF in combination with the discrete filter sampling by the MLA, each micro-optical channel images a different wavelength interval. Since several spectrally individualized imaging channels are integrated on one sensor, our system approach is capable of single-shot multispectral imaging.

If the wavelength gradient of the LVF were oriented parallel to the MLA axis, several optical channels would detect identical wavelengths, as shown by the blue graph in Fig. 2(b). Therefore, the LVF is slightly tilted with respect to the array axis, as shown in Fig. 2(a). The tilt angle α depends on the number of MLA channels n in the y-direction and the pitches p_x and p_y of the microlens centers in x and y, respectively. It is calculated by

α = arctan( p_x / (n · p_y) )    (1)

The partial images yield the best fill factor on the image sensor if p_x and p_y are equal. The equation for the tilt angle can thus be simplified to

α = arctan( 1 / n )    (2)

The LVF has a transmission wavelength range of 450–880 nm over a field of 25 × 36 mm², which directly corresponds to the size of the full-frame format CCD image sensor. In order to achieve a good compromise between spectral sampling and high spatial resolution over the extended wavelength range, an array of 11 × 6 microlens channels is used. Thus, the tilt angle α between the LVF and the MLA is approximately 9.5° according to Eq. (2). With this adapted configuration, the multispectral camera achieves a uniform spectral sampling of the entire free spectral range provided by the LVF, as depicted in Fig. 2(b) and marked by the red graph.
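To illustrate why the tilted arrangement produces the uniform sampling marked by the red graph in Fig. 2(b), the following sketch computes per-channel center wavelengths by projecting the microlens centers onto the filter gradient. The pitch value and the assumption of a strictly linear 450–880 nm gradient over a 36 mm filter length are placeholders for illustration, not parameters of the actual system.

```python
import numpy as np

# Sketch: center wavelength of each channel, taken as the LVF pass band at the
# projection of the microlens center onto the filter's wavelength gradient.
# Pitch and gradient values are illustrative placeholders, not system parameters.
n_cols, n_rows = 11, 6
pitch = 3.2                                  # assumed microlens pitch in mm (p_x = p_y)
alpha = np.arctan(1.0 / n_rows)              # Eq. (2): approx. 9.46 degrees

i, j = np.meshgrid(np.arange(n_cols), np.arange(n_rows), indexing="ij")
s = pitch * (i * np.cos(alpha) + j * np.sin(alpha))   # position along the LVF gradient

# Assumed linear gradient: 450-880 nm over a 36 mm long filter
gradient_nm_per_mm = (880.0 - 450.0) / 36.0
center_wl = 450.0 + s.ravel() * gradient_nm_per_mm

steps = np.diff(np.sort(center_wl))
print(len(center_wl), steps.min(), steps.max())  # 66 channels, all steps equal (~6 nm)
```

Because tan α = 1/n, the projected positions are proportional to 6i + j, which runs through every integer from 0 to 65 exactly once; the sorted channel wavelengths are therefore equidistant, consistent with the approximately 6 nm sampling reported below.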

Figure 2.

(a) Front-view of the multispectral imaging concept based on a multi-aperture system approach with a slanted linear variable spectral filter. (b) Spectral sampling for a tilted (red) and non-tilted (blue) orientation of the LVF.


A raw image of our final system is depicted in Fig. 3(a), where the 66 individual sub-images are arranged in an 11 × 6 array. Each image channel has a field of view of 68° and an image resolution of up to 400 × 400 pixels. In the chosen configuration, the object scene, shown in Fig. 3(b), is spectrally sampled with a step size of 6 nm and covers the spectral range between 450 nm and 850 nm.
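Before any fusion can take place, the raw frame has to be partitioned into these 66 sub-images. The snippet below is a minimal sketch of such a partitioning; the pitch and offset values are placeholders, since the actual sub-image positions follow from the geometric layout and calibration of the system.

```python
import numpy as np

N_COLS, N_ROWS = 11, 6      # channels along x and y
SUB_SIZE = 400              # spatial sampling per channel in pixels

def split_into_channels(raw, pitch_x, pitch_y, offset=(0, 0)):
    """Cut the raw sensor frame into a stack of shape (66, SUB_SIZE, SUB_SIZE)."""
    oy, ox = offset
    channels = []
    for row in range(N_ROWS):
        for col in range(N_COLS):
            y0, x0 = oy + row * pitch_y, ox + col * pitch_x
            channels.append(raw[y0:y0 + SUB_SIZE, x0:x0 + SUB_SIZE])
    return np.stack(channels)

# Example with a synthetic raw frame; the pitch of 450 pixels is a placeholder.
raw_frame = np.zeros((N_ROWS * 450, N_COLS * 450), dtype=np.uint16)
print(split_into_channels(raw_frame, pitch_x=450, pitch_y=450).shape)   # (66, 400, 400)
```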

Figure 3.

(a) Raw image of the multispectral camera with an array of 11 × 6 sub-images. (b) RGB image of the object scene captured with a conventional camera for comparison purposes. (c) Magnified image of one separated channel.


3. SYSTEM CALIBRATION AND CHARACTERIZATION

The described multi-aperture camera approach enables imaging of an object scene at 66 different wavelengths simultaneously. However, an image fusion is required to generate the multispectral data cube and to analyze the spatially resolved spectral information for multispectral imaging applications. In addition to conventional image calibration and correction schemes, such as fixed pattern noise or distortion correction, our multispectral camera approach requires two additional calibration steps. These steps affect the spatial and the spectral image fusion and compensate for the disparity of the MLA imaging scheme and the angle-dependent transmission properties of the LVF, respectively.

First, a spatial calibration of all channels with respect to the optical axis of the overall system is performed to compensate for the disparity introduced by the lateral arrangement of the imaging channels. The non-coaxial arrangement of the channels leads to a slightly different viewing direction for each micro-objective and, consequently, to a relative displacement of the spatial coordinates within the coordinate systems of the sub-images. A distance-dependent relative calibration of the sub-images is performed using a point source in order to compensate for this behavior. To this end, the point source is imaged at various distances by the multispectral camera system. Afterwards, the barycenter of the spot image is determined individually for each sub-image and is used as the origin of the local coordinate system in order to overlay all 66 sub-frames into the multispectral data cube. For large distances above 2 m, as is common for remote sensing applications, the disparity between neighboring channels becomes smaller than one pixel and a static image fusion scheme can be used instead of a distance-dependent one.
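A minimal sketch of this centroid-based registration step is shown below. It assumes the point-source recording has already been split into its 66 sub-images; the function names are illustrative and do not refer to the authors' implementation.

```python
import numpy as np

def spot_centroid(sub_image, background=0.0):
    """Intensity-weighted centroid (y, x) of the point-source spot in one sub-image."""
    img = np.clip(sub_image.astype(float) - background, 0.0, None)
    ys, xs = np.indices(img.shape)
    total = img.sum()
    return (ys * img).sum() / total, (xs * img).sum() / total

def channel_origins(point_source_channels):
    """Spot centroids of all channels, stack of shape (66, H, W) -> array of shape (66, 2)."""
    return np.array([spot_centroid(ch) for ch in point_source_channels])

# The centroids measured at a given object distance serve as the origins of the local
# coordinate systems and define the relative shifts applied before the 66 sub-frames
# are overlaid into the multispectral data cube.
```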

In a second step, the spectral data fusion has to be calibrated. This is necessary due to the blue shift of the interference filter for increasing angles of incidence. Since the filter is located in the aperture plane of the system, the incidence angle on the filter is directly correlated with the radial image height. Thus, each sub-frame exhibits a small radial wavelength gradient, as illustrated in Fig. 4. To calibrate this effect, the spectral response of each pixel is characterized by imaging a diffusing screen placed close to the camera and illuminated by a tunable monochromatic light source. The monochromator was tuned over the wavelength range between 430 nm and 871 nm with a sampling of 3 nm. At each wavelength step, a camera image was taken, recording which pixels respond to the current wavelength. Using the whole image stack, a Gaussian function is fitted to the spectral response of each pixel in order to derive its central wavelength and full width at half maximum (FWHM). By this procedure, an extensive look-up table is obtained, which contains the actual spectral response for each fused image position. The linear spectral sampling between corresponding pixels is approximately 6 nm, and the FWHM varies between 10 nm and 14 nm depending on the wavelength. Since this spectral sampling is denser than the transmission width of the filter, comparable spectral information can be reconstructed for each point over the full field of view.
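The per-pixel Gaussian fit can be sketched as follows. The use of scipy.optimize.curve_fit and the synthetic test response are illustrative choices and not a description of the authors' actual implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

wavelengths = np.arange(430.0, 872.0, 3.0)      # monochromator scan, 430-871 nm in 3 nm steps

def gaussian(wl, amplitude, center, sigma, offset):
    return amplitude * np.exp(-0.5 * ((wl - center) / sigma) ** 2) + offset

def fit_pixel_response(response):
    """Fit one pixel's measured spectral response; return (center wavelength, FWHM) in nm."""
    p0 = (response.max() - response.min(),       # amplitude guess
          wavelengths[np.argmax(response)],      # center guess
          5.0,                                   # sigma guess in nm
          response.min())                        # offset guess
    popt, _ = curve_fit(gaussian, wavelengths, response, p0=p0)
    center, sigma = popt[1], abs(popt[2])
    return center, 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma

# Synthetic check: a 12 nm FWHM response centered at 600 nm is recovered by the fit.
sigma_true = 12.0 / (2.0 * np.sqrt(2.0 * np.log(2.0)))
demo = gaussian(wavelengths, 1.0, 600.0, sigma_true, 0.02)
print(fit_pixel_response(demo))                  # approx. (600.0, 12.0)
```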

Figure 4.

(a) Spatial dependency of the transmitted wavelength shown for six imaging channels of the multispectral array camera system. (b) Wavelength fit of channel 27 (false color representation of transmitted wavelength).


After calibrating the spatial and spectral data fusion of the 66 individual channels, a spectral calibration of the total system, including the illumination of the test scene, is finally required for the measurements. For this purpose, a spectrally homogeneous white reflectance standard is imaged. This white reference image contains the spectral variations of the source, the filter, and the sensor sensitivity, and is subsequently used to normalize the measured data so that only the relative spectral reflectance of the objects in the measurement scene remains.
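A minimal sketch of this normalization is given below; the optional dark-frame subtraction is an additional assumption included for completeness and is not explicitly described above.

```python
import numpy as np

def to_reflectance(cube, white_cube, dark_cube=None, eps=1e-6):
    """Relative reflectance for data cubes of shape (channels, H, W)."""
    cube = cube.astype(float)
    white = white_cube.astype(float)
    if dark_cube is not None:                    # assumed optional dark correction
        cube = cube - dark_cube
        white = white - dark_cube
    # Dividing by the white reference removes the spectra of source, filter and sensor.
    return cube / np.clip(white, eps, None)
```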

4. EXPERIMENTAL RESULTS

After the aforementioned calibration procedure, the multispectral snapshot imaging system is applied to test measurements in order to demonstrate its combined high spatial and spectral sampling capability. We used a typical test scene consisting of different green leaves of the same plant at different stages of aging. Their respective color varies from yellowish to different shades of green (see Fig. 3(b)). Note that a black absorptive background is utilized to minimize stray light effects. The imaging distance was about 50 cm, and a halogen lamp with a color temperature of about 2900 K was used for illumination.

After capturing the raw image and applying the data fusion, a monochrome image of the test scene is reconstructed, as shown in Fig. 5(a). Here, the third dimension of the multispectral data is collapsed to emulate the appearance of a classical image without any spectral filtering. As can be seen from the sharp edges between the leaves and the background, the disparity effect of our system is completely compensated by the calibration procedure. The resulting spatially resolved spectral data is illustrated in Fig. 5(b) for five different image positions corresponding to different leaves in our test scene. It can be observed that the local green reflectance maxima of the individual spectra are centered around 550 nm, which corresponds to the visual appearance of the plant leaves. The well-known red edge [8] around 700 nm is also clearly observable. This steep increase of reflected light towards the near-infrared domain is characteristic of the chlorophyll in the leaves. Additionally, the reflectance in the range between 500 nm and 700 nm decreases, as expected, with increasing chlorophyll content from yellowish-green to dark-green leaves [9]. Furthermore, the wavelength position of the red-edge inflection point shifts slightly with increasing chlorophyll content of the leaves [9]. Due to this behavior, the red edge can be used for advanced spectral image classification, as depicted in Fig. 5(c), where the normalized difference vegetation index (NDVI) was calculated. The dead leaf appears red in the respective false-color plot, which indicates the lack of chlorophyll, whereas healthy leaves appear in full green. The reduced amount of chlorophyll in very young or old leaves is visualized by a reduced NDVI.
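The NDVI map in Fig. 5(c) can be computed directly from the calibrated reflectance cube. The helper below is a hedged sketch that simply selects the calibrated channels closest to 624 nm and 791 nm, the two bands named in the figure caption, and assumes the per-channel center wavelengths from the spectral calibration are available as a one-dimensional array.

```python
import numpy as np

def ndvi(reflectance_cube, center_wavelengths, red_nm=624.0, nir_nm=791.0, eps=1e-6):
    """Normalized difference vegetation index map of shape (H, W)."""
    red_idx = int(np.argmin(np.abs(center_wavelengths - red_nm)))
    nir_idx = int(np.argmin(np.abs(center_wavelengths - nir_nm)))
    red = reflectance_cube[red_idx].astype(float)
    nir = reflectance_cube[nir_idx].astype(float)
    return (nir - red) / np.clip(nir + red, eps, None)

# Values close to +1 indicate strong chlorophyll absorption in the red and high
# near-infrared reflectance (healthy leaves); the dead leaf yields a markedly lower NDVI.
```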

Figure 5.

(a) Reconstructed gray-scale image of the lab scene showing different leaves of one plant at various stages of development. The image is cropped to 220 × 220 pixels. (b) Reconstructed reflection spectra of the different leaves; the colors of the graphs correspond to the position marks in (a). (c) Advanced object classification by the normalized difference vegetation index (NDVI) determined at 624 nm and 791 nm.


5. CONCLUSION

We have demonstrated an innovative concept for an ultra-compact snapshot multispectral camera, realized by the combination of micro-optical components and a linear variable band-pass filter. Due to the parallelization of multiple imaging channels in a lateral array and the angle-of-incidence dependency of the used LVF, customized calibration procedures were performed for the multispectral data fusion. Additionally, we have demonstrated the suitability of our multispectral camera approach together with the discussed calibration procedure by an example measurement. The high spatial and spectral resolution of our system was demonstrated by the investigation of green leaves at various stages of development, which could be distinguished by the shift of the characteristic chlorophyll red edge around 700 nm.

In conclusion, the presented calibration procedure enables the complete image fusion for a multi-aperture camera array with an interference filter integrated into the aperture plane for multispectral data acquisition.

ACKNOWLEDGMENTS

This work is part of the project "micro- and nano-structured optics for infrared (MIRO)", which is funded by the German Federal Ministry of Education and Research (BMBF), the Thuringian Ministry of Economy, Science and Digital Society, as well as the Fraunhofer Foundation. The authors would like to thank Bernd Höfer for the mechanical system layout and integration, Simone Thau, who participated in the fabrication of the micro-optics module, and Elisabeth Montag for performing characterization measurements of the individual components.

REFERENCES

[1] Benediktsson, J. A., Palmason, J. A., and Sveinsson, J. R., "Classification of hyperspectral data from urban areas based on extended morphological profiles," IEEE Trans. Geosci. Remote Sensing 43(3), 480–491 (2005). https://doi.org/10.1109/TGRS.2004.842478
[2] Vo-Dinh, T., Stokes, D. L., Wabuyele, M. B., Martin, M. E., Song, J. M., Jagannathan, R., Michaud, E., Lee, R. J., and Pan, X., "A hyperspectral imaging system for in vivo optical diagnostics," IEEE Eng. Med. Biol. Mag. 23(5), 40–49 (2004). https://doi.org/10.1109/MEMB.2004.1360407
[3] Gowen, A., O'Donnell, C., Cullen, P., Downey, G., and Frias, J., "Hyperspectral imaging – an emerging process analytical tool for food quality and safety control," Trends in Food Science & Technology 18(12), 590–598 (2007). https://doi.org/10.1016/j.tifs.2007.06.001
[4] The SAGE Handbook of Remote Sensing, SAGE, Los Angeles, Calif. (2009). https://doi.org/10.4135/9780857021052
[5] Hagen, N., Kester, R. T., Gao, L., and Tkaczyk, T. S., "Snapshot advantage: a review of the light collection improvement for parallel high-dimensional measurement systems," Optical Engineering 51(11) (2012).
[6] Hubold, M., Berlich, R., Gassner, C., Brüning, R., and Brunner, R., "Ultra-compact micro-optical system for multispectral imaging," Proc. SPIE 10545, 105450V (2018).
[8] Filella, I. and Penuelas, J., "The red edge position and shape as indicators of plant chlorophyll content, biomass and hydric status," International Journal of Remote Sensing 15(7), 1459–1470 (1994). https://doi.org/10.1080/01431169408954177
[9] Buschmann, C. and Lichtenthaler, H. K., "Contribution of Chlorophyll Fluorescence to the Reflectance of Leaves in Stressed Plants as Determined with the VIRAF-Spectrometer," Z. Naturforsch. 54c, 849–855 (1999). https://doi.org/10.1515/znc-1999-9-1035
© (2019) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
M. Hubold, R. Berlich, R. Brüning, and R. Brunner "System calibration and characterization of an ultra-compact multispectral snapshot imaging system", Proc. SPIE 11144, Photonics and Education in Measurement Science 2019, 111440V (17 September 2019); https://doi.org/10.1117/12.2532317
KEYWORDS: Imaging systems, Calibration, Cameras, Multispectral imaging, Image fusion, Optical filters, Image filtering
