KEYWORDS: Nonuniformity corrections, Projection systems, High dynamic range imaging, Mid-IR, Temperature metrology, Optical resolution, Resistance, Infrared radiation, Light emitting diodes, Control systems
Achieving very high apparent temperatures is a persistent goal in infrared scene projector (IRSP) design. Several
programs are currently under way to develop technologies for producing high apparent temperatures. Producing a
useful system capable of reproducing high fidelity scenes across a large range of apparent temperatures requires more
than just a high temperature source. The entire scene projection system must support the extended dynamic range of
the desired scenarios. Supporting this extended range places requirements on the rest of the system. System
resolution and non-uniformity correction (NUC) are two areas of concern in the development of a high dynamic range
IRSP. We report the results of some initial investigations into the resolution required for acceptable system
performance and the effects that moving to a higher dynamic range may have on existing NUC procedures.
Polarization is increasingly being considered as a method of discrimination in passive sensing applications. In this paper
the degree of polarization of the thermal emission from the emitter arrays of two new Santa Barbara Infrared (SBIR)
micro-bolometer resistor array scene projectors was characterized at ambient temperature and at 77 K. The emitter
arrays characterized were from the Large Format Resistive Array (LFRA) and the Optimized Arrays for Space-Background Infrared Simulation (OASIS) scene projectors. This paper reports the results of this testing.
Testing of two-color imaging sensors often requires precise spatial alignment, including correction of distortion in the optical paths, beyond what can be achieved mechanically. Testing, in many cases, also demands careful radiometric calibration, which may be complicated by overlap in the spectral responses of the two sensor bands. In this paper, we describe calibration procedures used at the Air Force Research Laboratory hardware-in-the-loop (HWIL) facility at Eglin AFB, and present some results of recent two-color testing in a cryo-vacuum test chamber.
Spatial distortion effects in infrared scene projectors, and methods to correct them, have been studied and reported in several recent papers. Such effects may be important when high angular fidelity is required of a projection test. The modeling and processing methods previously studied, though effective, have not been well suited for real-time implementation. However, the “spatial calibration” must be achieved in real-time for certain testing requirements. In this paper we describe recent efforts to formalize and implement real-time spatial calibration in a scene projector test. We describe the effect of the scene generation software, “distortion compensation”, the projector, the sensor, and sensor processing algorithms on the transfer of spatial quantities through the projection system. These effects establish requirements for spatial calibration. The paper describes the hardware and software recently developed at KHILS to achieve real-time spatial calibration of a projection system. The technique extends previous efforts in its consideration of implementation requirements, and also in its explicit treatment of the spatial effects introduced by each of the distinct components of the overall system, as mentioned above.
One proven technique for nonuniformity correction (NUC) of a resistor array infrared scene projector requires careful measurement of the output-versus-input response for every emitter in a large array. In previous papers, we have discussed methods and results for accomplishing the projector NUC. Two difficulties that may limit the NUC results are residual nonuniformity in the calibration sensor, and nonlinearity in the calibration sensor's response to scene radiance. These effects introduce errors in the measurement of the projector elements' output, which lead to residual nonuniformity. In this paper we describe a recent effort to mitigate both of these problems using a procedure that combines sensor nonuniformity correction and sensor calibration, detector by detector, so that these problems do not contaminate the projector NUC. By recording a set of blackbody flood-field images at a dozen or so different temperatures, the output-versus-input radiance response of each individual detector can be measured. As with the projector NUC, we use a curve-fitting routine to model the response of each detector. Using this set of response curves, a post-processing algorithm corrects and calibrates the images measured by the sensor. We have used this approach to reduce several sensor error sources by a factor of 10 to 100. The resulting processing is then applied, as one step in the projector NUC, to correct and calibrate all of the sensor images used in that NUC. The procedure appears to be useful for any application where sensor nonuniformity or response nonlinearities are significant.
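A minimal sketch of this detector-by-detector approach, assuming a low-order polynomial response model and inversion by interpolation; the array names, polynomial order, and synthesized flood-field data below are illustrative assumptions, not the actual routine used for the work described above.

```python
import numpy as np

# Flood-field measurements at a dozen known blackbody radiances. Here the
# measured data are synthesized with random per-detector gains and offsets;
# in practice counts[k, i] is detector i's output in flood-field image k.
n_det = 1000
L_bb = np.linspace(0.1, 1.0, 12)
gain = np.random.uniform(0.8, 1.2, n_det)
offset = np.random.uniform(0.0, 0.05, n_det)
counts = np.outer(L_bb, gain) + offset            # shape (12, n_det)

# Fit a low-order polynomial response, counts = f_i(L), for every detector.
coeffs = np.polyfit(L_bb, counts, deg=3)          # shape (4, n_det)

def correct_and_calibrate(raw, L_grid=np.linspace(0.05, 1.05, 512)):
    """Invert each detector's fitted response curve, mapping raw counts to
    calibrated radiance (assumes each response is monotonic over L_grid)."""
    out = np.empty(n_det)
    for i in range(n_det):
        f = np.polyval(coeffs[:, i], L_grid)      # tabulated response, detector i
        out[i] = np.interp(raw[i], f, L_grid)     # invert by interpolation
    return out
```

Applied to every sensor image used in the projector NUC, this kind of inversion removes both the fixed-pattern nonuniformity and the response nonlinearity in a single step.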
Infrared projection systems based on resistor arrays typically produce radiometric outputs with wavelengths that range from less than 3 microns to more than 12 microns. This makes it possible to test infrared sensors with spectral responsivity anywhere in this range. Two resistor-array projectors optically folded together can stimulate the two bands of a 2-color sensor. If the wavebands of the sensor are separated well enough, it is possible to fold the projected images together with a dichroic beam combiner (perhaps also using spectral filters in front of each resistor array) so that each resistor array independently stimulates one band of the sensor. If the wavebands are independently stimulated, it is simple to perform radiometric calibrations of both projector wavebands. In some sensors, the wavebands are strongly overlapping, and driving one of the resistor arrays stimulates both bands of the unit-under-test (UUT). This “coupling” of the two bands causes errors in the radiance levels measured by the sensor if the projector bands are calibrated one at a time. If the coupling between the bands is known, it is possible to preprocess the driving images to effectively decouple the bands. This requires transformations that read both driving images (one in each of the two bands) and judiciously adjust both projectors to give the desired radiance in both bands. With this transformation included, the projection system acts as if the bands were decoupled - varying one input radiance at a time only produces a change in the corresponding band of the sensor. This paper describes techniques that have been developed to perform radiometric calibrations of spectrally coupled, 2-color projector/sensor systems. Also presented in the paper are results of tests performed to demonstrate the performance of the calibration techniques. Possible hardware and algorithms for performing the transformation in real-time are also presented.
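As a hedged illustration of the decoupling idea: suppose the coupling is well modeled as linear, so the sensed band radiances are m = C d for drive radiances d and a 2 x 2 coupling matrix C measured during calibration. The matrix values below are invented for illustration; the paper's actual transformation is not reproduced here.

```python
import numpy as np

# Assumed linear coupling model: sensed = C @ drive. Off-diagonal entries give
# the fraction of each projector's output sensed by the "wrong" band
# (values illustrative, not measured).
C = np.array([[1.00, 0.15],
              [0.08, 1.00]])
C_inv = np.linalg.inv(C)

def decouple(desired_band1, desired_band2):
    """Preprocess the two driving images so that each sensor band measures
    only its own desired radiance scene."""
    d = np.stack([desired_band1.ravel(), desired_band2.ravel()])  # (2, npix)
    drive = C_inv @ d
    # Inverted drives can go slightly negative; a physical projector cannot
    # emit negative radiance, so clip (one source of residual error).
    drive = np.clip(drive, 0.0, None)
    return (drive[0].reshape(desired_band1.shape),
            drive[1].reshape(desired_band2.shape))
```

Because the transformation is a fixed per-pixel 2 x 2 multiply, it is a natural candidate for the real-time hardware implementation the paper mentions.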
For many types of infrared scene projectors, differences in the outputs of individual elements are one source of error in projecting a desired radiance scene. This is particularly true of resistor-array based infrared projectors. Depending on the sensor and application, the desired response uniformity may prove difficult to achieve. The properties of the sensor used to measure the projector outputs critically affect the procedures that can be used for nonuniformity correction (NUC) of the projector, as well as the final accuracy achievable by the NUC. In this paper we present a description of recent efforts to perform NUC of an infrared projector under “adverse” circumstances. For example, the NUC sensor may have some undesirable properties, including: significant random noise, large residual response nonuniformity, temporal drift in bias or gain response, vibration, and bad pixels. We present a procedure for reliably determining the output versus input response of each individual emitter of a resistor array projector. This NUC procedure has been demonstrated in several projection systems at the Kinetic Kill Vehicle Hardware-In-the-Loop Simulator (KHILS) including those within the KHILS cryogenic chamber. The NUC procedure has proven to be generally robust to various sensor artifacts.
An unexpected effect was observed in a data set recently measured at the Kinetic Kill Vehicle Hardware-in-the-loop Simulator (KHILS) facility. A KHILS projector was driven to illuminate a contiguous block of emitters, with all other emitters turned off. This scene was measured with a two-color IR sensor. A sequence of 100 images was recorded, and certain statistics were computed from the image sequence. After measuring and analyzing these images, a “border” with a particularly large standard deviation was observed around the bright rectangular region. The pixels on the border of the region were much noisier than those either inside or outside of the bright region. Although several explanations were possible, the most likely seemed to be a small vibration of either the sensor or projector. The sensor, for example, uses a mechanical cryo-cooler, which produces a vibration that can be felt by hand. Further analyses revealed an erratic motion of the position of objects in the image, with an amplitude of a few tenths of the detector pitch. This small motion is sufficient to produce large fluctuations in the image pixel values in regions that have a large radiance gradient, such as the border of the bright block. These results suggest that the standard deviation of a “block image” sequence is easy to compute and will show the characteristic effect in the presence of image motion as small as a fraction of the detector pitch.
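The suggested diagnostic amounts to a per-pixel temporal standard deviation; a minimal sketch (NumPy, names illustrative):

```python
import numpy as np

def std_map(frames):
    """Per-pixel standard deviation over a sequence of nominally static frames
    (shape: n_frames x rows x cols). Sub-pixel jitter of the sensor or the
    projector shows up as a bright border of large standard deviation along
    high-gradient edges, e.g. the edge of an illuminated block of emitters."""
    return np.asarray(frames).std(axis=0)
```

Applied to the 100-frame block sequence described above, this map would reveal the noisy border without any explicit image-registration step.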
In some of its infrared projection systems, the Kinetic Kill Vehicle Hardware-In-the-Loop Simulator (KHILS) facility uses two 512 x 512 Wideband Infrared Scene Projector (WISP) resistor arrays to stimulate two different camera wavebands at the same time. The images from the two arrays are combined with a dichroic beam combiner, allowing the two camera bands to be independently stimulated. In early tests it was observed that the projector bands were not completely independent. When one array was projecting, the projected pattern could be seen in the opposite camera band. This effect is caused by spectral “crosstalk” in the camera/projector system. The purpose of this study was to build a mathematical model of the crosstalk, validate the model with measurements of a 2-color projection system, and then use the model as a tool to determine the spectral characteristics of filters that would reduce the crosstalk. Measurements of the crosstalk were made in the KHILS 2-color projector with two different 2-color cameras. The KHILS Quantum Well Infrared Photodetector (QWIP) Mid-Wave (MW)/Long-Wave (LW) camera and the Army Research Laboratory HgCdTe (HCT) MW/LW camera were used in the tests. The model was used to analyze the measurements, thus validating the model at the same time. The model was then used to describe conceptual designs of new 2-color projection configurations, enabling a prediction of crosstalk in the system, and selection of filters that would eliminate the crosstalk.
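One plausible form for such a crosstalk model, shown only as a sketch: the response of camera band b to projector p is the wavelength integral of the emission reaching that path times the path transmission times the band responsivity. The Gaussian spectra below are placeholders, not the measured curves used in the study.

```python
import numpy as np

lam = np.linspace(2.0, 14.0, 1200)                    # wavelength grid (um)

def gaussian_band(center, width):
    return np.exp(-0.5 * ((lam - center) / width) ** 2)

# Placeholder spectra: projector emission delivered through each optical path,
# and the two camera band responsivities (illustrative shapes only).
proj = [gaussian_band(4.0, 2.5), gaussian_band(10.0, 2.5)]   # MW path, LW path
resp = [gaussian_band(4.0, 0.8), gaussian_band(10.0, 1.5)]   # MW band, LW band

def crosstalk_matrix(filters=(1.0, 1.0)):
    """X[b, p]: signal in camera band b when projector p is driven, with an
    optional spectral filter transmission inserted in front of each path."""
    X = np.empty((2, 2))
    for b in range(2):
        for p in range(2):
            X[b, p] = np.trapz(proj[p] * filters[p] * resp[b], lam)
    return X
```

Candidate filters are then evaluated by substituting their transmission curves for the unit factors and checking how small the off-diagonal (crosstalk) terms become, which is the filter-selection use of the model described above.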
The effects of distortion in the complex optical system of an IR scene projector have motivated the development of methods for spatial calibration of scene projectors. A typical method utilizes the projection of a set of test images, with careful measurement of the location of points in the image. Given the projected and measured positions, a parametric model is used to describe the spatial “distortion” of the projection system. This distortion model can then be used for a variety of purposes, including pre-processing the images to be projected so that the distortion of the projection system is pre-compensated and thereby negated. This application and specific method have been demonstrated, and can compensate for a variety of distortion and alignment effects in the projector / sensor configuration. Personnel at the Kinetic Kill Vehicle Hardware-in-the-loop Simulator (KHILS) facility have demonstrated compensation and co-alignment of 2-color projection systems with sub-pixel precision using this technique. This paper describes an analysis of a situation in which pre-compensated images are translated (either mechanically or optically) to simulate motion of a target object or adjust alignment of the sensor and projector. The effect of physically translating images that had been pre-compensated for a different projector/sensor alignment was analyzed. We describe the results of a study of the translation and distortion effects, and characterize the expected performance of a testing procedure that requires translation of the pre-compensated images.
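A minimal sketch of fitting such a parametric distortion model, assuming a 2-D polynomial and ordinary least squares; the function names and polynomial order are illustrative assumptions.

```python
import numpy as np

def fit_distortion(xy_cmd, xy_meas, order=2):
    """Least-squares fit of a 2-D polynomial mapping commanded point positions
    (projector coordinates) to measured positions (sensor coordinates).
    xy_cmd, xy_meas: (N, 2) arrays of matched points from the test images."""
    x, y = xy_cmd[:, 0], xy_cmd[:, 1]
    cols = [x**i * y**j for i in range(order + 1) for j in range(order + 1 - i)]
    A = np.column_stack(cols)                 # terms 1, y, y^2, x, xy, x^2, ...
    cx, *_ = np.linalg.lstsq(A, xy_meas[:, 0], rcond=None)
    cy, *_ = np.linalg.lstsq(A, xy_meas[:, 1], rcond=None)
    return cx, cy
```

Pre-compensation then resamples each drive image through the inverse of the fitted map, so that the projection optics restore the intended geometry; the analysis in this paper concerns what happens when such a pre-compensated image is subsequently translated.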
As discussed in a previous paper presented to this forum, optical components such as collimators that are part of many infrared projection systems can lead to significant distortions in the sensed position of projected objects versus their true position. The previous paper discussed the removal of these distortions in a single waveband through a polynomial correction process. This correction was applied during post-processing of the data from the infrared camera-under-test. This paper extends the correction technique to two-color infrared projection. The extension of the technique allows the distortions in the individual bands to be corrected, as well as providing for alignment of the two color channels at the aperture of the camera-under-test. The co-alignment of the two color channels is obtained through the application of the distortion removal function to the object position data prior to object projection.
This paper describes a simulation and analysis of a sensor viewing a 'pixelized' scene projector like the KHILS Wideband Infrared Scene Projector (WISP). The main objective of this effort is to understand and quantify the effects of different scene projector configurations on the performance of several sensor signal processing algorithms. We present simulation results that quantify the performance of two signal processing algorithms used to estimate the sub-pixel position and irradiance of a point source. The algorithms are characterized for different signal-to-noise ratios, different projector configurations, and two different methods for preparing images that drive the projector. We describe the simulation, the algorithms, and the projector properties in detail, present numerous results obtained by processing simulated images, and draw conclusions.
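For concreteness, one common estimator of the kind characterized here is the background-subtracted, intensity-weighted centroid; a hedged sketch (not necessarily either of the two algorithms studied in the paper):

```python
import numpy as np

def point_source_estimate(window):
    """Estimate sub-pixel position and total irradiance of a point source
    from a small sensor window containing its blur spot."""
    w = window - np.median(window)        # crude background subtraction
    w = np.clip(w, 0.0, None)
    total = w.sum()                       # irradiance estimate (counts)
    rows, cols = np.indices(w.shape)
    return (rows * w).sum() / total, (cols * w).sum() / total, total
```

Running such an estimator on simulated sensor images of projected point sources, over a grid of sub-pixel positions and noise levels, is exactly the kind of characterization the simulation supports.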
This paper discusses the implementation and evaluation of several different algorithms for image superresolution (SR). Such processing is of interest in many imaging situations where resolution is limited by range, wavelength, aperture size, detector size, or other physical or practical constraints. A relevant example is the application of improved resolution to passive millimeter wave imaging sensors for munitions systems. In this paper, we refer to superresolution as processing which recovers spatial frequency components of a measured image that are completely suppressed by the image formation process. We demonstrate performance of several iterative algorithms, and discuss several aspects of the implementation and evaluation of SR processing.
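As one example of the class of iterative algorithms discussed, the Richardson-Lucy iteration enforces non-negativity, the nonlinear constraint that lets completely suppressed spatial frequencies be partially recovered; a hedged NumPy sketch, assuming a centered PSF the same size as the image:

```python
import numpy as np

def richardson_lucy(measured, psf, n_iter=50):
    """Iterative restoration: est <- est * K^T(measured / K(est)), where K is
    convolution with the PSF. psf must be centered and measured-sized."""
    psf = psf / psf.sum()
    otf = np.fft.rfft2(np.fft.ifftshift(psf))
    shape = measured.shape
    est = np.full(shape, measured.mean())
    for _ in range(n_iter):
        blurred = np.fft.irfft2(np.fft.rfft2(est) * otf, s=shape)
        ratio = measured / np.maximum(blurred, 1e-12)
        est *= np.fft.irfft2(np.fft.rfft2(ratio) * np.conj(otf), s=shape)
    return est
```

Purely linear iterations cannot restore frequency components the image formation nulls out entirely, which is why constraint-based nonlinear iterations of this kind are the ones of interest for superresolution as defined above.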
This paper presents an analysis of spatial blurring and sampling effects for a sensor viewing a pixelized scene projector. It addresses the ability of a projector to simulate an arbitrary continuous radiance scene using a field of discrete elements. The spatial fidelity of the projector as seen by an imaging sensor is shown to depend critically on the width of the sensor MTF or spatial response function, and the angular spacing between projector pixels. Quantitative results are presented based on a simulation that compares the output of a sensor viewing a reference scene to the output of the sensor viewing a projector display of the reference scene. The dependence on the blur of the sensor and projector, on the scene content, and on the alignment of both scene features and sensor samples with the projector pixel locations is addressed. We attempt to determine the projector characteristics required to perform hardware-in-the-loop testing with adequate spatial realism to evaluate seeker functions like autonomous detection, measuring radiant intensities and angular positions of unresolved objects, or performing autonomous recognition and aimpoint selection for resolved objects.
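A hedged sketch of the comparison the simulation performs, with Gaussian blurs standing in for the sensor and projector spatial responses and all parameter names illustrative:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def sensor_view(radiance, sensor_blur, det_pitch):
    """Sensor output: blur a finely sampled radiance field with the sensor
    spatial response, then sample at the detector pitch (in grid units)."""
    return gaussian_filter(radiance, sensor_blur)[::det_pitch, ::det_pitch]

def projector_view(radiance, proj_pitch, proj_blur, sensor_blur, det_pitch):
    """Sensor output when viewing a pixelized display of the same scene:
    sample onto the projector grid, give each sample a blurred pixel
    footprint, then apply the sensor blur and sampling as above."""
    field = np.zeros_like(radiance)
    field[::proj_pitch, ::proj_pitch] = (
        radiance[::proj_pitch, ::proj_pitch] * proj_pitch**2)
    field = gaussian_filter(field, proj_blur)     # projector pixel footprint
    return sensor_view(field, sensor_blur, det_pitch)
```

A fidelity metric is then, for example, the RMS difference between the two outputs, swept over the ratio of sensor blur width to projector pixel spacing and over sub-pixel shifts of scene features relative to the projector grid.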
In a series of measurements made to characterize the performance of a Wideband Infrared Scene Projector (WISP) system, timing artifacts were observed in one set of tests in which the projector update was synchronized with the camera readout. The projector was driven with images that varied from frame to frame, and the measured images were examined to determine if they varied from frame to frame in a corresponding manner. It was found that regardless of the relative time delay between the projector update and sensor readout, each output image was a result of two input images. By analyzing the timing characteristics of the camera integration scheme and the WISP update scheme it was possible to understand effects in the measured images and simulate images with the same effects. This paper describes the measurements and the analyses. Although the effects were due to the unique camera integration and readout scheme, the effects could show up when testing other sensors. Thus also presented in this paper are techniques for testing with resistive array projectors, so that the timing artifacts observed with various kinds of cameras are minimized or eliminated.
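A simple way to state the observed artifact, offered as an assumed model rather than the paper's exact analysis: if the camera integration window straddles a projector update, each measured frame is a mixture of two consecutive drive images,

$$ M_k \;=\; \alpha\, I_k \;+\; (1-\alpha)\, I_{k+1}, \qquad 0 < \alpha < 1, $$

where $\alpha$ is the fraction of the integration time falling before the update. The observation that the mixing persisted for every relative delay indicates that, for this particular integration and readout scheme, no choice of synchronization phase drove $\alpha$ to 0 or 1.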
A challenging problem associated with performing hardware-in-the-loop tests of imaging infrared seekers is projecting images that are spatially realistic. The problem is complicated by the fact that the targets may be small and unresolved at acquisition and grow to fill the field of view during the final guidance updates. Although characteristics of the projection system are usually thought of as determining the spatial realism, the imagery used to drive the projector is also important. For a pixelized projector, the driving imagery must be sampled at a rate determined by the sample spacing of the pixels in the projector. If the scenes contain important information that is small compared to the projector pixel spacing (that is, if they have important information at high spatial frequencies), then information may be lost in the sampling process if the images are not adequately bandlimited. This bandlimiting can be accomplished by prefiltering the scenes. At acquisition, targets are usually small; thus, prefiltering is necessary to preserve information about the target. Without such prefiltering, for example, infinitesimally small targets would never be seen unless they just happened to be at the exact location where the scene is sampled for a projector pixel. This paper reports the results of a study of various filters that might be used for prefiltering synthetic imagery generated to drive projectors in the KHILS facility. Projector and seeker characteristics typical of the KHILS facility were adopted for the study. Since the radiance produced by projectors is always positive, filters that can produce negative values were not considered. Figures of merit were defined based on sensor-measured quantities such as radiant intensity, centroid, and spot size. The performance of prefilters of various shapes and sizes, for typical projector and seeker characteristics, is reported.
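A minimal sketch of non-negative prefiltering before projector sampling, using a Gaussian kernel as one filter shape satisfying the positivity constraint noted above (parameter values illustrative):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def prefilter_and_sample(scene, proj_pitch, sigma):
    """Bandlimit a finely sampled synthetic scene with a strictly non-negative
    Gaussian kernel, then sample at the projector pixel spacing (grid units).
    Without the prefilter, a target smaller than proj_pitch can fall between
    sample points and disappear entirely from the projected scene."""
    return gaussian_filter(scene, sigma)[::proj_pitch, ::proj_pitch]
```

The figures of merit described above (sensor-measured radiant intensity, centroid, and spot size) are then computed with and without the prefilter, for each candidate kernel shape and width, to score the filters.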
The degree to which hardware-in-the-loop tests can be used to replace more expensive flight tests is dependent on how well the tests resemble real flight tests. One of the most challenging problems associated with making realistic hardware-in-the-loop tests is the projection of realistic imagery to the seeker. Since a seeker is limited in its ability to `see' a real scene, projection systems do not have to perfectly replicate real scenes. They only have to produce scenes which appear the same as the real scenes when measured with spatial, spectral, and temporal resolutions that are at least as poor as those of the seekers to be tested. Unfortunately, this means that in order to determine the realism of a given test or class of tests, it is necessary to include in the analysis characteristics of the seekers as well as characteristics of both the real scenes and the projected scenes. For many reasons, the conventional Fourier transform techniques are not adequate for performing these analyses. In this paper, a formalism is given for analyzing spatial, spectral, and temporal effects in a hardware-in-the-loop system involving a pixelized projector and a passive imaging sensor. The fundamental equations are presented describing the measurement of either a real scene or a pixelized projector with a passive imaging sensor. The equations are kept in the space, wavelength, and time domains to avoid the unnecessary restrictions that are encountered when transforming to the Fourier domain. An example is given of an application of the formalism to evaluate the effects of projector pixel spacing and blur effects.
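A representative form of such a measurement equation, written here as a sketch consistent with the description (the paper's actual notation may differ): the output of detector $i$ in frame $k$ is

$$ d_{i,k} \;=\; \int_{t_k}^{t_k+\tau}\!\int_0^{\infty}\!\int_{\mathcal{A}} L(\mathbf{x},\lambda,t)\, h_i(\mathbf{x})\, R(\lambda)\, g(t)\; d\mathbf{x}\, d\lambda\, dt, $$

where $L$ is the scene spectral radiance, $h_i$ the spatial response of detector $i$, $R$ the spectral responsivity, $g$ the temporal integration weighting, and $\tau$ the integration time. For the projector case, $L$ is replaced by a sum over projector pixels of each pixel's commanded radiance times its spatial footprint and temporal illumination profile, which keeps the entire analysis in the space, wavelength, and time domains as described.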
Hardware-in-the-loop (HWIL) simulation combines functional hardware with digital models. This technique has proven useful for test and evaluation of guided missile seekers. In a nominal configuration, the seeker is stimulated by synthetic image data. Seeker outputs are passed to a simulation control computer that simulates guidance, navigation, control, and airframe response of the missile. The seeker can be stimulated either by a projector or by direct signal injection (DSI). Despite recent advancements in scene projection technology, there are practical limits to the scenes produced by a scene projector. Thus, the test method of choice is often DSI. This paper discusses DSI techniques for HWIL. In this mode, sensor hardware is not used; scene signature data, provided directly to the seeker signal processor, is computed and sensor measurement effects are simulated. The computed images include sensor effects such as blurring, sampling, detector response characteristics, and noise. This paper discusses DSI methods for HWIL, with specific applications at the Air Force Kinetic Kill Vehicle Hardware-in-the-loop Simulator facility.
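A hedged sketch of the kind of sensor-effects pipeline DSI requires; the specific effects chain, parameter names, and noise models below are illustrative assumptions, not the facility's implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

def dsi_frame(radiance, blur_sigma, det_pitch, gain, read_noise):
    """Turn a computed radiance scene into the digital image handed directly
    to the seeker signal processor: optics/detector blur, detector sampling,
    a linear response, shot noise, and read noise."""
    blurred = gaussian_filter(radiance, blur_sigma)        # sensor blur
    sampled = blurred[::det_pitch, ::det_pitch]            # detector sampling
    electrons = gain * np.clip(sampled, 0.0, None)         # linear response
    shot = rng.poisson(electrons).astype(float)            # signal-dependent noise
    return shot + rng.normal(0.0, read_noise, shot.shape)  # additive read noise
```

Because no projector hardware sits between the scene and the seeker, every one of these effects must be modeled explicitly; that is both the burden and the flexibility of the DSI approach described above.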
This paper addresses the process of measuring the output of individual elements of a pixelized scene projector. The in-band scene projector is a key component of a sensor/seeker test facility such as the Kinetic Kill Vehicle Hardware-in-the-Loop Simulator (KHILS) at Eglin AFB, Florida. Analyses are presented which quantify errors associated with measuring the radiant intensity of individual pixels on a scene projector. The errors are broken down into sampling errors, truncation errors, and random measurement noise. The magnitude of each error source is determined as a function of projector and sensor parameters such as the element spacings and blur. Guidelines for using this information to accurately and efficiently perform nonuniformity correction of a scene projector are presented.
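The basic measurement and two of its systematic error sources can be sketched as follows; window summation is an assumption for illustration, and the paper's estimator may differ.

```python
import numpy as np

def emitter_intensity(img, center, half_width):
    """Estimate one emitter's output by summing sensor counts in a finite
    window around its blur spot. Too small a window cuts off the tails of
    the spot (truncation error); too large a window both accumulates random
    noise from many detectors and admits light from neighboring emitters."""
    r, c = center
    w = img[r - half_width : r + half_width + 1,
            c - half_width : c + half_width + 1]
    return w.sum()
```

Sampling error enters separately, through where the blur spot falls relative to the detector grid; sweeping the window size against these competing error sources is the kind of trade the guidelines above address.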
KEYWORDS: Sensors, Projection systems, Signal to noise ratio, Interference (communication), Signal detection, Sensor performance, Error analysis, Signal processing, Staring arrays, Analytical research
This paper examines the relative significance of and dependencies between different noise sources which affect a sensor viewing a scene projector. An analysis is presented which compares the effect of various signal-dependent and signal-independent sensor noises, as well as the effect of projector nonuniformity (NU) on the sensor output. A key result of this analysis is a quantitative means to assess the importance of projector NU on sensor performance for different scene levels and operating conditions. It provides an analytical means to address questions such as: How and how much does projector NU influence the sensor response? At what level does the projector NU become a limiting factor in testing sensor performance? What is the penalty incurred in a particular test if a specified projector NU is not achieved? What effort should be expended to reduce projector NU in relation to other errors for a particular application? Discussion, results, and conclusions for a specific application are presented in addition to the analyses.
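One simple way to frame the comparison (a sketch, not the paper's formulation): model the sensor output variance at mean signal $S$ as

$$ \sigma_{\mathrm{tot}}^2(S) \;=\; \underbrace{k\,S}_{\text{shot}} \;+\; \sigma_{\mathrm{read}}^2 \;+\; \underbrace{(u\,S)^2}_{\text{projector NU}}, $$

where $u$ is the fractional projector nonuniformity. Because the NU term grows as $S^2$ while shot noise grows only as $S$, NU dominates above the crossover signal found by solving $u^2 S^2 = kS + \sigma_{\mathrm{read}}^2$, namely $S^\ast = \bigl(k + \sqrt{k^2 + 4u^2\sigma_{\mathrm{read}}^2}\bigr)/(2u^2)$. Below $S^\ast$, further NUC effort buys little; above it, projector NU limits the test, which is the kind of quantitative question the analysis is designed to answer.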
Simulation methods offer a time- and cost-effective approach to the evaluation and testing of guided missiles. These include hardware-in-the-loop as well as all-digital simulations which provide information about how a particular existing or proposed missile system might perform in hypothetical situations which may not be practically duplicated in reality. This paper describes an all-digital simulation developed using available components. The paper describes the functional flow of the simulation, and identifies the information which is passed between the independent modules. Several applications are described, and results such as intercept miss distances, line-of-sight pointing error statistics, and sensitivity to certain system parameters are demonstrated.
Hardware-in-the-loop (HWIL) testing can be used as an efficient and effective means for analyzing the performance of guided missile systems. Due to the limits of current technologies, components of the simulation are limited in their capability to simulate real-world conditions for certain test articles. One component which is critical in an HWIL system for strategic guided missiles is the scene projection or delivery device. To stimulate imaging IR sensors, this scene projector (SP) typically consists of a pixelized in-band source which can be modulated both spatially and temporally to simulate the radiance scene which would be observed during an actual engagement. The SP is driven by a scene generator which provides scene radiance information to the SP under control of a simulation computer, which determines the field-of-view (FOV) composition based on a simulated engagement. In using such a system, a primary concern is that the SP is able to create a scene which produces the proper response in the observing sensor. Another effect which bears examination is the SP's projection method, such as scanning an in-band source to cover the projection FOV. The detailed interaction between the modulated source and the timing of the sensor's detection, integration, and readout processes may cause unrealistic or unexpected sensor behavior. In order to assess the compatibility of a specific sensor viewing a specific SP, a detailed simulation has been developed by Nichols Research Corporation under the direction of the Guided Interceptor Technology Branch (WL/MNSI) of the USAF Wright Laboratory Armament Directorate. This simulation was designed primarily to address issues related to scene projector usage in the Kinetic Kill Vehicle Hardware-in-the-Loop Simulator (KHILS) facility at Eglin AFB, Florida. The simulation allows the user to define: the spatial response of the sensor; the spatial properties of the SP (i.e. the radiance distribution arising from a commanded impulse); the illumination timing of the SP, such as scan format, persistence, etc.; and the integration and readout timing of the sensor. Given sampled values of these response functions, and sampled values of the desired radiance scene, the SP simulation computes the detector outputs in the form of a sensed image. This output image can help to assess the suitability of using the modeled SP for testing the modeled sensor by illustrating potential mismatches. It also provides a means to predict the performance to be expected from this module of the HWIL simulation for a particular test scenario. This paper derives equations which express the sensor output as a function of the input scene, the spatial and temporal response functions of the sensor and the SP, and the spectral response functions of the sensor and SP. Assumptions which affect the implementation and the generality of application are stated and discussed. Results and conclusions are presented for a specific application which illustrate the utility of the simulation.
This paper describes the capabilities, models, and implementation of the SSW sensor model software, and illustrates its utility in processing computer-generated signatures. Sample images illustrate the results of processing computed images with different components of the SSW sensor model. Synthetic scene modeling and signature generation have become important tools used in the development of complex sensor systems for smart weapons. Simulated signatures have proven useful by providing realistic data to support system development, performance prediction, validation, and trade studies for signal processing applications and entire systems. In addition, comparisons between computed signatures and measured data can provide insight into signature phenomenology and modeling. Standard signature prediction codes do not account for effects caused by the sensor. These effects can cause measured signatures to differ significantly from predictions, and may critically affect the performance of applications which use the data. The Strategic Scene Workstation (SSW), developed by Nichols Research Corporation for the USAF Wright Laboratory, Armament Directorate, includes a computer model designed to simulate sensor effects in computed signatures. The SSW sensor model simulates spatial effects, noise, and detector characteristics typical of passive sensors used in strategic applications. This function is necessary to customize predicted signatures, and has been used effectively to enhance the realism and accuracy of simulated signatures for applications including hardware-in-the-loop simulation at the USAF KHILS facility at Eglin AFB, FL.