The multispectral camera reported herein consists of a single-aperture optic that splits the incoming light onto three independent sensors. Because of the common beam path, a motion-blur-compensating de-scanner unit can be employed, allowing the generation of high-speed panoramic stitches for quick look-arounds. The fast optic has an F-number spanning from F/2.8 in the wide field of view to F/3.8 in the narrow field of view. Additionally, the optic is corrected for chromatic errors over the whole wavelength range. For each sensor we managed to maintain focus and exactly match horizontal fields of view over the complete zoom range. The individual imaging sensors employed are an SXGA SWIR sensor, a 4K VIS CMOS sensor and an SXGA low-light-level sensor for night-vision capability. Thus, it is possible to see under almost all atmospheric and light conditions. As all sensors can be read out simultaneously with subsequent on-board processing, we can combine image analysis with image fusion, such as navigation light detection within SWIR images. Additionally, the SWIR sensor is able to detect laser light using asynchronous laser pulse detection (ALPD).
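As a rough illustration of the kind of on-board processing mentioned above, the following minimal sketch flags bright point-like sources such as navigation lights in a SWIR frame; the threshold, frame size and synthetic data are purely illustrative assumptions, not product values. Because all sensors share one entrance optic, such detections can be overlaid directly on the VIS image after scaling for the different pixel counts.

```python
# Hypothetical sketch: flag bright point-like sources (e.g. navigation lights)
# in a SWIR frame. Threshold and frame size are illustrative assumptions.
import numpy as np

def detect_point_sources(swir: np.ndarray, k_sigma: float = 6.0) -> np.ndarray:
    """Return (row, col) coordinates of pixels far above the global background."""
    background = np.median(swir)
    noise = np.std(swir)
    mask = swir > background + k_sigma * noise
    return np.argwhere(mask)

# Synthetic SXGA-like frame with one bright "navigation light".
swir_frame = np.random.poisson(100, (1024, 1280)).astype(float)
swir_frame[512, 640] = 5000.0
print(detect_point_sources(swir_frame))
```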
For appropriate reconnaissance in military and security applications, comprehensive imaging is the basic prerequisite. On the one hand, this requires the use of a wide range of wavelength bands. On the other hand, a higher detector resolution results in a gain of information that can be provided to the user. On the way to highly integrated sensor technology, the use of compact and powerful optics is necessary. This presentation provides an overview of existing Hensoldt Optronics technology and gives an outlook on innovative approaches to modern imaging.
The advantages of SWIR imaging compared to imaging in the VIS range are twofold: First, the observation range is longer due to better atmospheric transmission, especially under hazy weather conditions. Second, the target contrast is higher in many relevant cases because the reflection behavior of objects differs significantly from what is known from the visible. In this lecture, a method is presented which, similar to the tristimulus theory of the eye, creates a SWIR image from four individual images. The recordings are made with bandpass filters at wavelengths of 1000 nm, 1200 nm, 1400 nm and 1600 nm. The complete SWIR spectrum is spectrally resolved from the overlap of the filter curves.
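To make the four-channel idea concrete, the following minimal sketch integrates a scene spectrum against four overlapping bandpass curves at the stated center wavelengths. The Gaussian filter shape and the 100 nm width are illustrative assumptions, not the actual filter data.

```python
# Minimal sketch of the four-channel SWIR "tristimulus" idea: each color value
# is the spectral radiance integrated against one bandpass filter curve.
import numpy as np

wavelengths = np.arange(900, 1701, 5)                # nm, SWIR band
centers = np.array([1000, 1200, 1400, 1600])         # nm, as in the text
width = 100.0                                        # nm, assumed filter width
filters = np.exp(-0.5 * ((wavelengths[None, :] - centers[:, None]) / width) ** 2)

def swir_color_values(spectral_radiance: np.ndarray) -> np.ndarray:
    """Integrate the scene spectrum against the four overlapping filter curves."""
    return np.trapz(filters * spectral_radiance[None, :], wavelengths, axis=1)

# Example: spectrum with the liquid-water absorption dip around 1400 nm.
radiance = 1.0 - 0.7 * np.exp(-0.5 * ((wavelengths - 1400) / 60.0) ** 2)
print(swir_color_values(radiance))
```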
The advantages of shortwave infrared (SWIR) imaging for reconnaissance tasks and the detection of camouflaged objects are known and have been demonstrated. The realization with an integrated visual and shortwave camera, and the discrimination of camouflage objects with a four-filter shortwave colorimetry, were shown. Multispectral images are captured using four appropriate bandpass filters in a filter wheel, and contrast is optimized with a tunable generic bandpass filter based on principal component analysis (PCA). Based on this experience, it is demonstrated that SWIR multispectral imaging can provide longer observation ranges than color imaging, although the pixels of color cameras are significantly smaller, resulting in higher spatial resolution.
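The abstract does not detail the PCA step; the sketch below illustrates only one plausible reading, in which the leading principal component of multispectral SWIR samples yields a spectral weighting that enhances target-background contrast. The two-class sample data are invented for illustration and do not represent the tunable hardware filter actually used.

```python
# Hedged sketch: derive a contrast-enhancing spectral weighting via PCA
# from four-channel SWIR samples of background and a camouflaged target.
import numpy as np

rng = np.random.default_rng(0)
background = rng.normal([0.40, 0.50, 0.20, 0.30], 0.02, size=(500, 4))
target     = rng.normal([0.40, 0.50, 0.35, 0.30], 0.02, size=(500, 4))
samples = np.vstack([background, target])

# Leading eigenvector of the covariance = direction of largest variance;
# projecting pixels onto it yields a single-band image with enhanced contrast.
cov = np.cov(samples - samples.mean(axis=0), rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
weights = eigvecs[:, -1]
contrast = target.mean(0) @ weights - background.mean(0) @ weights
print("channel weights:", weights, "projected contrast:", contrast)
```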
Reconnaissance tasks and the detection of camouflaged targets can be improved if the different surface reflection properties of natural and artificial surfaces can be discriminated in multispectral images. Analogous to the visible, colorimetry in the shortwave infrared spectral range has been presented in theory and practice in [1]. In this presentation we discuss the influence of a four-filter set on the capability of the system to discriminate the different spectral characteristics of two objects. An optimal choice of filters is presented in the sense of homogeneous discrimination characteristics across the SWIR band, taking into account the SWIR solar spectrum. Based on this choice, two development issues were investigated: the accuracy of the color values with respect to measurement noise and the display of SWIR color images. As a figure of merit for color value accuracy, the Noise Equivalent Wavelength Difference is introduced, describing the minimum color difference that can be resolved or discriminated from the noise floor. Due to the lack of a physiological counterpart as in visual colorimetry, where the eye serves as the reference for the three color channels, we showcase a model for transforming and displaying SWIR colors in the R-G-B color space.
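Since no physiological reference exists, one simple display option is a fixed linear mapping from the four SWIR color values to R-G-B followed by normalization. The sketch below illustrates this idea only; the matrix entries are assumptions and not the model actually proposed in the presentation.

```python
# Hypothetical linear mapping of four SWIR color values to display RGB.
import numpy as np

M = np.array([[0.8, 0.2, 0.0, 0.0],    # R <- mostly the 1000 nm channel
              [0.0, 0.7, 0.3, 0.0],    # G <- the 1200/1400 nm channels
              [0.0, 0.0, 0.3, 0.7]])   # B <- mostly the 1600 nm channel

def swir_to_rgb(color_values: np.ndarray) -> np.ndarray:
    """Map a (..., 4) array of SWIR color values to display RGB in [0, 1]."""
    rgb = color_values @ M.T
    return np.clip(rgb / rgb.max(), 0.0, 1.0)

print(swir_to_rgb(np.array([0.2, 0.5, 0.1, 0.7])))
```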
The typically used shortwave infrared spectral range (SWIR) between 900 nm and 1700 nm is spectrally broader than the visible range. Available SWIR cameras generate a gray-level image using the intensity over the entire spectral band. However, objects can exhibit completely different spectral behavior within this range: plants, for example, have a high reflection at the lower end of the SWIR range, and liquid water has a strong absorption band around 1400 nm. We propose to divide the SWIR range into an appropriate number of spectral channels to extract more detail from a captured image.
To extract this information, the proposal follows a concept similar to the color vision of the human eye. Analogous to the three types of color receptors of the eye, four spectral channels are defined for the SWIR. Each point of the image is then attributed four "color values" instead of a single gray level.
For a comprehensive characterization of an object, a special SWIR colorimetry is possible by selecting appropriate filters with suitable bandwidth and spectral overlap. The spectral sensitivity, the algorithms for calculating SWIR color values, the discrimination of SWIR color values by the Noise Equivalent Wavelength Difference (NEWD), and spectrally coded false-color image display are discussed, and first results with an existing SWIR camera are presented.
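The exact NEWD definition is not given in the abstract; the sketch below shows one plausible realization, in which identical, independent channel noise is propagated to the intensity-weighted centroid wavelength of the four color values. It is a sketch under these assumptions, not the authors' formula.

```python
# One plausible NEWD realization: noise-equivalent shift of the centroid
# wavelength computed from the four SWIR color values.
import numpy as np

centers = np.array([1000.0, 1200.0, 1400.0, 1600.0])   # nm

def newd(color_values: np.ndarray, sigma: float) -> float:
    """First-order propagation of channel noise sigma into the
    intensity-weighted centroid wavelength (result in nm)."""
    s = color_values.sum()
    lam_c = (centers * color_values).sum() / s
    grad = (centers - lam_c) / s          # d(lambda_c)/d(c_i)
    return float(np.sqrt(np.sum((grad * sigma) ** 2)))

print(newd(np.array([0.2, 0.5, 0.1, 0.7]), sigma=0.01), "nm")
```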
The reconnaissance capability of a military observation and targeting platform is mainly driven by the performance of its sensors. In general, the MWIR thermal imager is the primary sensor, and the use of a visible camera increases the identification capability of the platform during the day at very long observation ranges. In addition to the imaging sensors, a laser pointer, a laser rangefinder (LRF), and a combined laser rangefinder/designator (LRF/D) complete the sensor suite. As LRF, a single-pulse eye-safe rangefinder based on an OPO-shifted Nd:YAG transmitter can be used. The alternative LRF/D uses a diode-laser-pumped dual-wavelength OPO/Nd:YAG transmitter and can be operated either at 1570 nm or at 1064 nm with a pulse rate of up to 25 pps [1].
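For reference, the single-pulse rangefinder principle reduces to the round-trip time of the returned pulse; the timing value in the example below is only illustrative.

```python
# Time-of-flight ranging: range follows from half the round-trip delay.
C = 299_792_458.0          # speed of light in m/s

def range_from_time_of_flight(delta_t_s: float) -> float:
    return C * delta_t_s / 2.0

print(range_from_time_of_flight(33.4e-6))   # roughly a 5 km target
```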
A MWIR thermal imager [2] with a 1280x1024 MWIR detector and an optical zoom range between 1.2° and 20° horizontal field of view provides an HD-SDI video stream in the 720p or 1080p standard. A built-in software image stabilizer and a smart tone-mapping algorithm improve the reconnaissance results for the observer.
A combined camera covers the visible, NIR and SWIR spectral ranges [3] using common entrance optics. The resolution of the color camera's Si-CMOS chip is 1920x1080, and that of the InGaAs focal plane array is 640x512 detector pixels. The combined VIS/NIR/SWIR camera provides improved range under hazy and misty atmospheric conditions and also improved detection of laser spots, e.g. of the integrated laser designator, thanks to its high sensitivity in the spectral range between 450 nm and 1700 nm; most military lasers operate in the NIR and SWIR spectral bands [3]. The combination of the sensors in the platform significantly improves the operational use. The application of the described platform is not limited to military scout vehicles; the available sensors are also integrated in a targeting platform with similar performance but different environmental demands.
The possibilities, improvements compared to existing platforms, and potential upgrades are discussed.
Modern reconnaissance strategies are based on gathering information from as many spectral bands as possible. Besides the well-known atmospheric windows at VIS, MWIR and LWIR wavelengths suitable for long-range observation, progress in detector technology has also provided access to the atmospheric window from 1.0 to 1.7 μm known as SWIR. Independent of the chosen spectral band, all applications strive to achieve the largest observation range possible. Thus, a concept for comparing the sensors in different wavelength bands is desirable. Achievable ranges are influenced in part by the atmospheric conditions and in part by the capability of the imaging sensor; only the latter is under the control of the instrument manufacturer. In range simulation, the contribution of the sensor can be efficiently characterized using the MRC and MRTD concepts. The minimum resolvable contrast (MRC) as a function of spatial frequency is a decisive figure of merit for the VIS and SWIR; the minimum resolvable temperature difference (MRTD) as a function of spatial frequency plays the same role for the MWIR and LWIR. All relevant sensor data are covered by the MRC and MRTD, respectively, and can thus be introduced into range calculation simply by measuring the MRC or MRTD curves. Based on measured MRC data, range calculations for three imaging sensors (VIS, NIR and SWIR) are presented for selected atmospheric conditions, together with significant captured images.
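A sketch of how a measured MRC curve can feed such a range estimate, using a Johnson-style cycle criterion: the target remains resolvable as long as its apparent contrast at the sensor exceeds the MRC at the spatial frequency the target subtends. The curve, the target parameters and the cycle count below are assumed values, not measured data, and atmospheric contrast attenuation is taken as already included in the apparent contrast.

```python
# Hedged sketch: maximum range from an MRC curve and a cycle criterion.
import numpy as np

freq = np.array([0.1, 0.3, 0.6, 1.0, 1.5, 2.0])        # cycles/mrad (measured grid)
mrc  = np.array([0.004, 0.01, 0.03, 0.08, 0.2, 0.5])   # measured MRC values

def max_range(target_size_m: float, apparent_contrast: float,
              n_cycles: float = 3.0) -> float:
    """Largest range (m) at which n_cycles still fit across the target."""
    # Highest spatial frequency at which the MRC is still below the contrast.
    f_limit = np.interp(apparent_contrast, mrc, freq)   # cycles/mrad
    # n_cycles across a target of angular size target_size_m / R (in mrad).
    return target_size_m * f_limit * 1000.0 / n_cycles

print(max_range(target_size_m=2.3, apparent_contrast=0.05))
```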
In an electro-optical sensor suite for long-range surveillance tasks, the optics for the visible (450 nm – 700 nm) and the SWIR spectral range (900 nm – 1700 nm) are combined with the receiver optics of an integrated laser rangefinder (LRF). The incoming signal from the observed scene and the returned laser pulse are collected within the common entrance aperture of the optics. The common front part of the optics is a broadband-corrected lens design for the 450 – 1700 nm wavelength range. The visible spectrum is split off by a dichroic beam splitter and focused on an HDTV CMOS camera. The returned laser pulse is spatially separated from the scene signal by a special prism and focused on the laser receiver diode of the integrated LRF. The achromatic lens design has a zoom factor of 14 and F/2.6 in the visible path; in the SWIR path the F-number is adapted to the corresponding chip dimensions. The alignment of the LRF with respect to the SWIR camera line of sight can be controlled by adjustable integrated wedges. The two images in the visible and the SWIR spectral range match in focus and field of view (FOV) over the full zoom range between 2° and 22° HFOV. The SWIR camera has a resolution of 640×512 pixels; the HDTV camera provides a resolution of 1920×1080. The design and the performance parameters of the multispectral sensor suite are discussed.
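A small geometry sketch of the field-of-view matching: the focal length required for a given HFOV scales with the sensor width, so with assumed pixel pitches of 5 µm (VIS) and 15 µm (SWIR) the two image widths coincide and a common zoom can serve both channels. The pitches and resulting focal lengths are illustrative, not the actual design data.

```python
# Focal length needed to image a given HFOV onto a sensor of given width.
import math

def focal_length_mm(sensor_width_mm: float, hfov_deg: float) -> float:
    return sensor_width_mm / (2.0 * math.tan(math.radians(hfov_deg) / 2.0))

vis_width  = 1920 * 0.005      # 1920 px at an assumed 5 um pitch  -> 9.6 mm
swir_width = 640 * 0.015       # 640 px at an assumed 15 um pitch  -> 9.6 mm

for hfov in (2.0, 22.0):       # the stated ends of the zoom range
    print(hfov, focal_length_mm(vis_width, hfov), focal_length_mm(swir_width, hfov))
```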
Cameras for the SWIR wavelength range are becoming more and more important because of the better observation range for daylight operation under adverse weather conditions (haze, fog, rain). In order to choose the most suitable SWIR camera, or to qualify a camera for a given application, characterization of the camera by means of the Minimum Resolvable Contrast (MRC) concept is favorable, as the MRC comprises all relevant properties of the instrument. With the MRC known for a given camera device, the achievable observation range can be calculated for every combination of target size, illumination level or weather conditions. MRC measurements in the SWIR wavelength band can be performed largely along the guidelines of the MRC measurements of a visual camera. Typically, measurements are performed with a set of resolution targets (e.g. USAF 1951 target) manufactured with different contrast values from 50% down to less than 1%. For a given illumination level, the achievable spatial resolution is then measured for each target. The resulting curve shows the minimum contrast that is necessary to resolve the structure of a target as a function of spatial frequency. To perform MRC measurements for SWIR cameras, first the irradiation parameters have to be given in radiometric instead of photometric units, which are limited in their use to the visible range. To do so, SWIR illumination levels for typical daylight and twilight conditions have to be defined. Second, a radiation source with appropriate emission in the SWIR range (e.g. an incandescent lamp) is necessary, and the irradiance has to be measured in W/m² instead of lux (lumen/m²). Third, the contrast values of the targets have to be recalibrated for the SWIR range because they typically differ from the values determined for the visual range. Measured MRC values of three cameras are compared to the specified performance data of the devices, and the results of a multi-band, in-house-designed VIS-SWIR camera system are discussed.
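The following sketch illustrates why the switch to radiometric units matters: illuminance weights the spectrum with the photopic eye response V(λ), which vanishes in the SWIR, whereas irradiance in W/m² integrates the spectral irradiance directly. The blackbody lamp model and the Gaussian stand-in for V(λ) are assumptions for illustration only.

```python
# Relative SWIR irradiance vs. photometrically weighted content of a lamp-like source.
import numpy as np

wl = np.arange(380, 1701, 1) * 1e-9                       # wavelength in m
h, c, kB, T = 6.626e-34, 2.998e8, 1.381e-23, 2856.0       # constants, lamp temperature (K)
spectral = 2 * h * c**2 / wl**5 / (np.exp(h * c / (wl * kB * T)) - 1.0)
spectral /= spectral.max()                                 # relative spectral irradiance

# Crude Gaussian stand-in for the photopic response V(lambda), peaking at 555 nm.
V = np.exp(-0.5 * ((wl * 1e9 - 555) / 45.0) ** 2)

swir_band = (wl >= 900e-9) & (wl <= 1700e-9)
print("relative SWIR irradiance:", np.trapz(spectral[swir_band], wl[swir_band]))
print("relative luminous content:", 683 * np.trapz(spectral * V, wl))
```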
The shortwave infrared spectral range (SWIR) has certain advantages for observation during the day under foggy and hazy weather conditions. Due to the longer wavelength compared to the visible spectrum, the range performance in the SWIR is considerably extended here. In addition, cooled SWIR focal plane arrays now reach sensitivities usable for night viewing under twilight or moonlight conditions. The presented SWIR camera system combines color imaging in the visible spectrum with imaging in the SWIR spectrum. The 20x zoom optics are fully corrected between 440 nm and 1700 nm. A dichroic beam splitter projects the visible spectrum onto a color chip with HDTV resolution and the SWIR spectrum onto a 640x512 InGaAs focal plane array. The open architecture of the camera system allows the use of different SWIR sensors and CMOS sensors. Universally designed interface electronics operate the cameras and provide standard video outputs and compressed video streams on an Ethernet interface. The camera system is designed to be integrated in various stabilized platforms. The camera concept is described, and the comparison with pure SWIR or combined SWIR/MWIR dual-band cameras is discussed from an application and system point of view.
High-resolution digital images with high refresh rates produce an enormous amount of data that must be forwarded from the source to the recipient. This is where wireless transmission as an RF technology quickly reaches its limits. With its high bandwidth, laser-based data transmission avoids this problem. An added benefit is a higher level of security against eavesdropping, which can be further increased through the use of quantum-optical encryption techniques. For military applications, several scenarios will be considered. Especially for the navy, free-space communication between a ship and land-based remote forces at the eye-safe laser wavelength of 1550 nm is necessary. Data transfer at this wavelength between ships is also important for an exchange of tactical images of the local situation. In the future, direct communication between a ship and a submarine through water will be required; eavesdropping-proof, broadband transmission of reconnaissance data will be necessary over distances of several hundred meters at the laser wavelength of 532 nm. This paper will show how experience gained through the development of optical data links from satellites to ground stations can be used as an enabling technology for additional applications in the development of stable data connections under atmospheric conditions.
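A back-of-the-envelope sketch of the geometric part of such a free-space optical link, estimating received power from beam divergence, receiver aperture and atmospheric transmission. All numbers are illustrative assumptions, not system data.

```python
# Simplified free-space optical link: geometric capture plus atmospheric loss.
import math

def received_power_w(p_tx_w: float, divergence_rad: float, range_m: float,
                     rx_aperture_m: float, transmission: float) -> float:
    beam_diameter = divergence_rad * range_m                 # far-field spot size
    geometric_gain = (rx_aperture_m / beam_diameter) ** 2    # aperture/beam area ratio
    return p_tx_w * min(geometric_gain, 1.0) * transmission

print(received_power_w(p_tx_w=0.5, divergence_rad=1e-3, range_m=10_000,
                       rx_aperture_m=0.1, transmission=0.3))
```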
KEYWORDS: 3D displays, Visualization, 3D image processing, Eye, Laser based displays, Raster graphics, Mirrors, 3D volumetric displays, Glasses, Modulation
In this paper, an innovative approach to true 3D image presentation in a space-filling, volumetric laser display is described. The introduced prototype system is based on a moving target screen that sweeps the display volume. The net result is the optical equivalent of a 3D array of image points illuminated to form a model of the object, which occupies a physical space. Wireframe graphics are presented within the display volume, which a group of people can walk around and examine simultaneously from nearly any orientation and without any visual aids. In addition to the detailed vector scanning mode, a raster-scanned system and a combination of both techniques are under development. The volumetric 3D laser display technology for true reproduction of spatial images can tremendously improve the viewer's ability to interpret data and to reliably determine distance, shape and orientation. Possible applications for this development range from air traffic control, where moving blips of light represent individual aircraft in a true-to-scale projected airspace of an airport, to various medical applications (e.g. electrocardiography, computed tomography), to entertainment and education visualization, as well as imaging in the field of engineering and computer-aided design.
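A hypothetical sketch of how a voxel might be addressed in such a swept-screen display: the depth coordinate selects the instant within the sweep period at which the laser must illuminate the corresponding (x, y) point. The linearized sweep, depth and rate below are assumed values, not prototype parameters.

```python
# Convert a desired voxel depth into the laser firing time within one sweep.
def fire_time_s(z_mm: float, depth_mm: float = 300.0, sweep_hz: float = 20.0) -> float:
    """Time within one (linearized) sweep at which the screen passes depth z."""
    period = 1.0 / sweep_hz
    return (z_mm / depth_mm) * period

print(fire_time_s(z_mm=75.0))   # screen reaches 75 mm a quarter of the way through the sweep
```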