A one-day workshop was held on December 10, 2009 at the National Institute of Standards and
Technology to address the issue of data gaps in the time series of satellite measurements. Such gaps
can occur due to launch delays, launch failures, or inconsistencies and jumps in radiometric scales
between satellites. The presence of such gaps limits the ability to use the measurements to detect the
small changes in key environmental variables that result from climate change. Leading experts in
the Earth Observation community from the National Aeronautics and Space Administration
(NASA), the National Oceanic and Atmospheric Administration (NOAA), the United States Geological
Survey (USGS), and academia attended the meeting to prioritize the calibration strategies for
bridging and mitigating satellite data gaps for climate change detection. These strategies include
establishing SI traceability for satellite sensor calibration and measurements; continuing
improvements in prelaunch, onboard, and vicarious calibrations and transfer standards; establishing
celestial standards and procedures for intercomparisons; establishing SI traceability for alternative
measurement strategies, such as in-situ networks and airborne sensor campaigns; and leveraging
international satellite assets. This paper summarizes the workshop and recommendations.
The Low Background Infrared (LBIR) facility has developed and tested the components of a new detector for calibration
of infrared powers greater than 1 pW with 0.1 % uncertainty. Calibration of such low powers could be valuable for the
quantitative study of weak astronomical sources in the infrared. The pW-ACR is an absolute cryogenic radiometer
(ACR) employing a high-resolution transition edge sensor (TES) thermometer, an ultra-weak thermal link, and a miniaturized
receiver to achieve a noise level of around 1 fW at a temperature of 2 K. The novel thermometer employs the
superconducting transition of a tin (Sn) core and has demonstrated a temperature noise floor less than 3 nK/Hz^(1/2). Using
an applied magnetic field from an integrated solenoid to suppress the Sn transition temperature, the operating
temperature of the thermometer can be tuned to any temperature below 3.6 K. The conical receiver is coated on the
inside with infrared-absorbing paint and has a demonstrated absorptivity of 99.94 % at 10.6 μm. The thermal link is
made from a thin-walled polyimide tube and has exhibited very low thermal conductance near 2 × 10⁻⁷ W/K. In tests with
a heater mounted on the receiver, the receiver/thermal-link assembly demonstrated a thermal time constant of about 15 s.
Based on these experimental results, it is estimated that an ACR containing these components can achieve noise levels
below 1 fW, and the design of a radiometer merging the new thermometer, receiver and thermal link will be discussed.
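As a rough consistency check on those numbers (our illustration, not a calculation from the paper), the expected noise floor follows from propagating the thermometer noise through the thermal link: an absorbed power P raises the receiver temperature by P/G, so a temperature noise density S_T maps to a power noise density G·S_T. A minimal sketch in Python, using only the values quoted above:

```python
import math

# Rough consistency check (illustrative): power noise floor implied by the
# thermometer noise and the thermal conductance quoted in the abstract.
G = 2e-7      # thermal conductance of the polyimide link, W/K
S_T = 3e-9    # thermometer temperature noise floor, K/Hz^(1/2)
tau = 15.0    # measured thermal time constant, s

S_P = G * S_T                            # power noise density, W/Hz^(1/2)
print(f"S_P = {S_P:.1e} W/Hz^(1/2)")     # ~6e-16 W/Hz^(1/2), i.e. 0.6 fW/Hz^(1/2)

# Over the ~1/(2*pi*tau) bandwidth set by the receiver time constant:
noise_rms = S_P * math.sqrt(1.0 / (2.0 * math.pi * tau))
print(f"rms noise ~ {noise_rms:.1e} W")  # well below 1 fW
```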
We present initial performance data from a cryogenic Fourier transform spectrometer (Cryo-FTS) designed for low-background
spectral infrared calibrations. The Cryo-FTS operates at a temperature of approximately 15 K and has been
integrated into an infrared transfer radiometer containing a calibrated Si:As blocked impurity band (BIB) detector.
Because of its low operating temperature, the spectrometer exhibits negligible thermal background signal and low drift.
Data from tests of basic spectrometer function, such as modulation efficiency, scan jitter, spectral range, spectral
resolution, and sweep speed, will be presented. We will also discuss calibration techniques and results pertinent to
operation of the Cryo-FTS as part of a calibration instrument, including background, signal offset and gain, and spectral
noise equivalent power. The spectrometer is presently limited to wavelengths below 25 micrometers but can in
principle be extended to longer wavelengths by replacing its KBr beamsplitter with another beamsplitter engineered for use
beyond 25 micrometers.
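As background for the quantities listed above, the standard Fourier-transform relations (general FTS facts, not the Cryo-FTS specifications) connect the resolution to the maximum optical path difference and the 25 micrometer cutoff to a wavenumber limit; the 1 cm path difference below is purely a hypothetical value:

```python
# Standard FTS relations (background only; the OPD value is hypothetical,
# not a Cryo-FTS specification).
def resolution_cm1(max_opd_cm: float) -> float:
    """Limiting spectral resolution (cm^-1) for a maximum optical path
    difference given in cm: delta_nu = 1 / (2 * OPD)."""
    return 1.0 / (2.0 * max_opd_cm)

def wavenumber_cm1(wavelength_um: float) -> float:
    """Convert a wavelength in micrometers to a wavenumber in cm^-1."""
    return 1e4 / wavelength_um

print(resolution_cm1(1.0))    # 0.5 cm^-1 for a hypothetical 1 cm OPD
print(wavenumber_cm1(25.0))   # 400 cm^-1: the long-wave edge set by KBr
```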
The Low-Background Infrared (LBIR) facility at NIST has recently completed construction of an infrared transfer
radiometer with an integrated cryogenic Fourier transform spectrometer (Cryo-FTS). This mobile system can be
deployed to customer sites for broadband and spectral calibrations of space chambers and low-background hardware-in-the-loop (HWIL)
testbeds. The Missile Defense Transfer Radiometer (MDXR) has many of the capabilities of a complete IR calibration
facility and will replace our existing filter-based transfer radiometer (BXR) as the NIST standard detector deployed to
MDA facilities. The MDXR features numerous improvements over the BXR, including a cryogenic Fourier transform
spectrometer, an on-board absolute cryogenic radiometer (ACR), an internal blackbody reference, and an integrated
collimator. The Cryo-FTS can be used to measure high resolution spectra from 4 to 20 micrometers, using a Si:As
blocked-impurity-band (BIB) detector. The on-board ACR can be used for self-calibration of the MDXR BIB as well as
for absolute measurements of infrared sources. A set of filter wheels and a rotating polarizer within the MDXR allow for
filter-based and polarization-sensitive measurements. The optical design of the MDXR makes both radiance and
irradiance measurements possible and enables calibration of both divergent and collimated sources. Details of the
various MDXR components will be presented as well as initial testing data on their performance.
We report on initial measurements of the low-temperature thermal properties of a device that is similar to the
experimental apparatus used for absolute cryogenic radiometry (ACR) within the Low Background Infrared Radiometry
(LBIR) facility at NIST. The device consists of a receiver cavity mechanically and thermally connected
to a temperature-controlled stage through a thin-walled polyimide tube which serves as a weak thermal link.
In order to evaluate the functionality of the device for use in a cryogenic radiometer, we measured the thermal
resistance and thermal time constant of the system within the temperature range of 1.8 K to 4.4 K. The measured
thermal resistance and thermal time constant at 1.883 K were (2400 ± 500) K/mW and (24 ± 6) s, respectively. This value for
the thermal resistance should result in about an order-of-magnitude increase in radiometer sensitivity compared
with the present ACR within LBIR. Although the sensitivity should increase by about an order of magnitude,
the measured time constant is nearly unchanged with respect to previous ACRs within LBIR, because the reduced
dimensions of the receiver cavity lower its heat capacity by a comparable factor. Finally, the thermal conductivity inferred from the measured thermal
resistance and geometrical parameters was computed, with an average value of 0.015 W/(m·K), and compared
with other measurements of polyimide from the literature.
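The reported quantities are tied together by the usual lumped thermal model (a sketch for orientation, assuming τ = RC and R = L/(kA); this is not the paper's own analysis):

```python
# Lumped thermal model linking the reported values (illustrative sketch).
R = 2400e3    # measured thermal resistance at 1.883 K, K/W (= 2400 K/mW)
tau = 24.0    # measured thermal time constant, s
k = 0.015     # inferred polyimide thermal conductivity, W/(m*K)

# tau = R * C gives the effective heat capacity of the receiver assembly:
C = tau / R
print(f"C = {C:.1e} J/K")          # ~1e-5 J/K

# R = L / (k * A) gives the tube's geometry factor L/A from R and k:
print(f"L/A = {k * R:.1e} m^-1")   # ~3.6e4 m^-1
```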
The pre-launch characterization and calibration of remote sensing instruments should be planned and carried out in
conjunction with their design and development to meet the mission requirements. In the case of infrared instruments, the
onboard calibrators such as blackbodies and the sensors such as spectral radiometers should be characterized and
calibrated using SI traceable standards. In the case of earth remote sensing, this allows intercomparison and
intercalibration of different sensors in space to create global time series of climate records of high accuracy, in which some
inevitable data gaps can be easily bridged. In the case of ballistic missile defense, this provides sensor quality assurance
based on SI traceable measurements. The recommended best practice for this pre-launch effort is presented based on
experience gained at the National Institute of Standards and Technology (NIST) working with National Aeronautics and
Space Administration (NASA), National Oceanic and Atmospheric Administration (NOAA), and Department of Defense
(DoD) programs over the past two decades. Examples of infrared standards and calibration facilities at NIST for serving
the remote sensing community will be discussed.
The NIST role in supporting our Nation's climate research is described. The assembly of climate data records over
decadal time scales requires assimilating readings from a large number of optical sensors deployed in space and on the
Earth by various nations. NIST, in partnership with NASA and NOAA, develops and disseminates the calibration tools
and standards to ensure that the measurements from these sensors are accurate, comparable, and tied to international
standards based on the SI system of units. This effort helps to provide confidence that the small decadal changes in
environmental variables attributed to climate change are not an artifact of the measurement system. Additionally, it
ensures that the measurements are physics-based and thus comparable to climate models.
The Low Background Infrared (LBIR) facility at the National Institute of Standards and Technology (NIST) is
responsible for absolute IR radiometric calibrations (SI traceable) in low-background temperature (below 80 K)
environments. IR radiometric test hardware that needs to be operated in cryogenic environments is calibrated in
cryogenic vacuum chambers maintained by the facility to create environments that simulate the low-temperature
background of space. Transfer radiometers have also been developed to calibrate IR radiometric test hardware this is too
large to ship to NIST from their own IR test facilities. The first generation transfer radiometer, the BXR, is a filter-based
radiometer that uses an As-doped Si Blocked Impurity Band detector, and can calibrate IR test chambers to a total
uncertainty of less than 3 % (1 σ ) at powers as low as to 10-14 W/cm2. The BXR has evaluated 9 chambers and the
performance of a subset of these chambers will be discussed to a limited extent to demonstrate the need for calibrating
IR test chambers. The second-generation transfer radiometer, the MDXR, and new primary standards allowing absolute
calibrations as low as 10⁻¹⁵ W/cm² are in the final stages of development. The MDXR will have all the functionality of
the BXR plus a cryogenic Fourier transform spectrometer (FTS) for high-resolution spectral capability.
Performance specifications and test results from development activity on the new primary standards will be discussed.
The state-of-the-art electro-optical sensors being designed for today's space-based environmental applications require a complete characterization and thorough calibration. This is especially true for sensors designed to assess global climate change, which require very small uncertainties. This paper describes a system-level approach that addresses each phase of calibration, from planning to on-orbit operations. This approach encourages early planning and continuity of effort throughout the lifetime of the project (pre- and post-flight) to promote an optimum calibration approach that will minimize uncertainty for the intended application. This paper also discusses considerations for component level characterization, ground calibration and standards, in-flight calibration sources and trending, and in-flight validation assessment.
The full potential of current remote sensor technology is limited by the inability to correct biases once an exo-atmospheric remote sensor becomes operational. Even when the calibration is traced to the International System of Units (SI) and the instrument is performing within the operational envelope wherein it was calibrated, the problem persists; a Space Metrology Program is a potential solution. This paper discusses such a program, suggests a feasibility study to address the issues, and recommends a plan of action.
Any operational instrument has a bias, and reducing the magnitude of the bias can only be accomplished when an adequately accurate standard is accessible to the instrument while the instrument is in its operational environment. Currently, the radiometric flux from the sun, the moon, and the stars is not known with SI-traceable accuracy adequate to provide a standard consistent with state-of-the-art remote sensor technology. The result is data that is less accurate than it could be, often leading to confusing and conflicting conclusions drawn from that data. Planned remote sensors, such as those required to meet future program needs (e.g., the United States National Polar-Orbiting Operational Environmental Satellite System (NPOESS) and the proposed international Global Earth Observation Program), are going to need higher-accuracy radiometric standards to maintain their accuracy once they become operational. To resolve the problem, a set of standard radiometers on the International Space Station is suggested, against which other exo-atmospheric radiometric instruments can be calibrated. A feasibility study for this program is planned.
An adiabatic laser calorimeter has been developed with a sensitivity of the order of 10⁻⁶ cm⁻¹ with one watt of laser power, using a CO2 laser (9 µm to 11 µm) in the infrared region. The heat leak by conduction and by radiation from the sample to an inner isothermal enclosure is small enough to be ignored because we succeeded in developing a temperature-tracking system between the sample and the enclosure. The total uncertainty of absorption-coefficient measurements is estimated to be 5.4 %. The absorption coefficient of potassium chloride sample #2 was (3.17 ± 0.18) × 10⁻³ cm⁻¹.
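For orientation, the relation underlying adiabatic laser calorimetry (a generic sketch; the paper's exact analysis may differ, and all numerical inputs below except the 1 W laser power are hypothetical) is that weak absorption deposits P_abs = αLP in the sample, which then heats at dT/dt = P_abs/C:

```python
# Generic adiabatic-calorimetry sketch; C, dT/dt, and L are hypothetical
# inputs chosen for illustration, not values from the paper.
C = 2.0          # sample heat capacity, J/K (hypothetical)
dT_dt = 1.6e-3   # measured heating rate, K/s (hypothetical)
L = 1.0          # sample thickness, cm (hypothetical)
P_laser = 1.0    # incident CO2 laser power, W (as in the abstract)

# alpha = C * (dT/dt) / (L * P_laser), assuming negligible heat leak
alpha = C * dT_dt / (L * P_laser)
print(f"alpha = {alpha:.2e} cm^-1")   # 3.2e-3 cm^-1 for these inputs
```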
An infrared transfer radiometer has recently been developed at the Low-Background Infrared Calibration (LBIR) facility at the National Institute of Standards and Technology (NIST) for the Ballistic Missile Defense Organization (BMDO) program. The BMDO Transfer Radiometer (BXR) is designed to measure the irradiance of a collimated source of infrared light having an angular divergence of less than 1 mrad. It is capable of measuring irradiance levels as low as 10⁻¹⁵ W/cm² over the spectral range from 2 micrometers to 30 micrometers. The radiometer uses an arsenic-doped silicon blocked impurity band (BIB) detector operated at temperatures below 12 K. Spectral resolution is provided by narrow bandpass interference filters and long-wavelength blocking filters. All the components of the radiometer, which include a mechanical shutter, an internal calibration source and detector, a long baffle section, a spatial filter, two filter wheels, and a two-axis detector stage, are cooled with an active flow of liquid helium to maintain temperatures below 20 K. A cryogenic vacuum chamber has been built to house the radiometer and to provide mechanical tilt alignment to the source. The radiometer is easily transported to a user site along with its support equipment.
Collimated infrared sources covering the 2 micrometer to 30 micrometer range of wavelengths are necessary to simulate infrared radiation from distant objects. This is important because on-orbit servo and tracking systems make extensive use of infrared radiation for remote sensing. Collimators are used to calibrate infrared detectors in terms of absolute power within a given spectral range. The National Institute of Standards and Technology (NIST) operates and maintains the Low Background Infrared Calibration (LBIR) facility, which uses a 2 K electrical substitution radiometer, the Absolute Cryogenic Radiometer (ACR), which is the primary national standard for broadband and infrared spectral measurements. At this facility, users can calibrate blackbody sources with at most 1 % uncertainty. However, users must then rely on optical systems at their own facility to collimate the radiation from the blackbody. The effect of the optics on the output of the beam must then be calculated from models. For this reason, NIST is developing a portable transfer radiometer (BXR) that can be taken onsite to directly measure the spectral output, thus eliminating intermediate steps in the calibration chain. NIST is also developing a source having a 1 cm diameter collimated beam, for a preliminary calibration of the BXR at the LBIR facility from 2 micrometers to 8 micrometers. The source must fit into a volume of about 0.03 m³ (1 cubic foot), have an angular divergence of less than 700 µrad, a power output greater than 10 nW, and demonstrate 1 % repeatability or better. The development and characterization of this source are the main topics of this paper.
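As a quick check of what these specifications imply (our arithmetic, not a result from the paper), 10 nW spread uniformly over a 1 cm diameter beam corresponds to an irradiance near 1.3 × 10⁻⁸ W/cm²:

```python
import math

# Irradiance implied by the source specifications (illustrative; assumes a
# uniform beam profile, which the paper does not state).
power = 10e-9                         # minimum power output, W
area = math.pi * (1.0 / 2.0) ** 2     # 1 cm diameter beam cross-section, cm^2
print(f"{power / area:.1e} W/cm^2")   # ~1.3e-8 W/cm^2
```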
The capability of the Low Background Infrared (LBIR) facility at the National Institute of Standards and Technology to spectrally calibrate infrared detectors was demonstrated with the spectral calibration of arsenic-doped silicon blocked impurity band (BIB) detectors. The BIB detectors were calibrated over the 2 micrometer to 30 micrometer range, using light from a monochromator with a nominal 2 % bandwidth. Photon fluxes used for the calibration ranged from 10¹³ photons/s/cm² to 10¹⁴ photons/s/cm². The large-area detectors (10 mm²) calibrated in this paper were very linear up to 2.5 × 10¹⁴ photons/s/cm², where they showed a 1 % drop in signal from linearity. The calibrations contained less than a ±1 % standard component of random noise uncertainty, and there was about a ±5 % standard component of uncertainty arising from systematic effects that will be discussed in detail. The calibrations were performed in ultra-high vacuum in a 20 K background environment by making direct intercomparisons between the power measured by an absolute cryogenic radiometer and the response measured by a detector irradiated by the same beam. A detailed description of the measurement methodology and system apparatus is given. Detector linearity and uniformity are also discussed. The LBIR facility can now provide calibrated BIB detectors as transfer standards as well as evaluate and calibrate customers' large-area detectors and detector arrays, provided the detectors stay within certain physical limitations.
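For context (a back-of-the-envelope conversion, not from the paper), the quoted photon fluxes correspond to mid-infrared irradiances of roughly 2 × 10⁻⁷ W/cm² to 2 × 10⁻⁶ W/cm², via E = Φhc/λ; the 10 µm wavelength below is an assumed mid-band value:

```python
# Photon flux -> irradiance conversion (illustrative; 10 um is an assumed
# mid-band wavelength, not a value from the paper).
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s

def irradiance_w_cm2(flux_photons_s_cm2: float, wavelength_m: float) -> float:
    """Irradiance (W/cm^2) of a monochromatic photon flux (photons/s/cm^2)."""
    return flux_photons_s_cm2 * h * c / wavelength_m

for flux in (1e13, 1e14):
    print(f"{flux:.0e} photons/s/cm^2 -> "
          f"{irradiance_w_cm2(flux, 10e-6):.1e} W/cm^2")
```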
This work surveys techniques to measure the absorption coefficient of low-absorption materials. A laser calorimeter is being developed with a sensitivity goal of (1 ± 0.2) × 10⁻⁵ cm⁻¹ with one watt of laser power, using a CO2 laser (9 µm to 11 µm), a CO laser (5 µm to 8 µm), a He-Ne laser (3.39 µm), and a pumped OPO tunable laser (2 µm to 4 µm) in the infrared region. Much attention has been given to the requirements for high sensitivity and to sources of systematic error, including stray light. Our laser calorimeter is capable of absolute electrical calibration. Preliminary results for the absorption coefficient of highly transparent potassium chloride (KCl) samples are reported.
This work addresses the issues related to the radiometric, or photometric, accuracy of Fourier-transform infrared (FT-IR) spectrophotometers. In particular, the sample-induced error in the radiometric measurements is investigated. Measurement results for the transmittance of a set of optical filters are presented. The optical density (OD) of these filters ranges from 0.3 to 5 at wavelengths between 2 micrometers and 25 micrometers. Future research in the direction of improving the radiometric accuracy is summarized.
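For reference (a standard definition, not specific to this work), optical density relates to transmittance as T = 10^(−OD), so this filter set spans transmittances from about 0.5 down to 10⁻⁵:

```python
# Standard optical-density relation: transmittance T = 10**(-OD).
for od in (0.3, 1.0, 2.0, 3.0, 4.0, 5.0):
    print(f"OD {od}: T = {10 ** (-od):.1e}")
```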