A major problem in obtaining FAA approval for infrared EVS-based operations under poor-visibility conditions is the lack of correlation between runway visual range (RVR) and IR attenuation or range. The "IR advantage" in fog, although often
substantial, varies greatly as a function of detailed droplet-size distribution. Real-time knowledge of the IR extinction at
a given destination is key to reliable operations with lower decision heights. We propose the concept of a Runway
Infrared Range (RIRR), to be measured by a ground-based IR transmissometer. Although RVR determination now
utilizes single-point scatterometry, the very (Mie) scattering mechanism that often affords a significant IR range
advantage necessitates a return to two-point transmissometry. As an adaptation of RVR technology, RIRR will include
separate determinations of the background-scene range and the runway/approach-light range. The latter determination, based on Allard's Law, will encompass background level, light settings, visible extinction, and camera performance (usually at short-wave IR). The assumptions and validity of this RIRR parallel those of the traditional RVR. Also, through extended monitoring at a hub, the RIRR may be inexpensively surveyed throughout a fog season, allowing the economic benefits of IR-based EVS to be predicted for that site.
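The Allard's Law determination referenced above is transcendental in range and must be solved numerically. As a minimal sketch (not the proposed RIRR algorithm), the Python below solves E = I·exp(−σR)/R² for the greatest range R at which a light of intensity I stays above an assumed camera detection threshold; all function names and numeric values are illustrative.

```python
import math

def allard_illuminance(intensity_cd, sigma_per_m, range_m):
    """Allard's Law: illuminance E (lux) at range R (m) from a light of
    intensity I (candela) through extinction coefficient sigma (1/m):
        E = I * exp(-sigma * R) / R**2
    """
    return intensity_cd * math.exp(-sigma_per_m * range_m) / range_m ** 2

def lights_range(intensity_cd, sigma_per_m, threshold_lux, r_max_m=20_000.0):
    """Greatest range at which E >= threshold. E decreases monotonically
    with R, so a simple bisection between 1 m and r_max_m suffices."""
    lo, hi = 1.0, r_max_m
    if allard_illuminance(intensity_cd, sigma_per_m, lo) < threshold_lux:
        return 0.0  # below threshold even at point-blank range
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if allard_illuminance(intensity_cd, sigma_per_m, mid) >= threshold_lux:
            lo = mid
        else:
            hi = mid
    return lo

# Illustrative numbers only: a 10,000 cd edge light, fog with
# sigma = 0.01 /m, and an assumed camera threshold of 1e-5 lux.
print(f"lights range ~ {lights_range(1e4, 0.01, 1e-5):.0f} m")
```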
The acquisition of approach and runway lights by an imager is critical to landing-credit operations with EVS.
Using a GPS clock, LED sources are pulsed at one-half the EVS video rate of 60 Hz or more. The camera then
uses synchronous (lock-in) detection to store the imaged lights in alternate frames, with digital subtraction of the
background for each respective frame-pair. Range and weather penetration, limited only by detector background shot-noise (or by camera system noise at night), substantially exceed those of the human eye. An alternative is the use
of short-wave infrared cameras with eyesafe laser diode emitters. Also, runway identification may be encoded on
the pulses. With standardized cameras and emitters, an "instrument qualified visual range" may be established.
The concept extends to portable beacons at austere airfields, and to see-and-avoid sensing of other aircraft
including UAVs.
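As a minimal sketch of the synchronous-detection scheme described above, the following assumes the pulsed emitters are ON in even-indexed frames and OFF in odd-indexed frames, with the phase known from the GPS clock; the array shapes and the synthetic demo are illustrative assumptions, not the fielded design.

```python
import numpy as np

def lockin_lights(frames):
    """Synchronous (lock-in) detection of lights pulsed at half the video rate.

    `frames` is an (N, H, W) stack in which the pulsed emitters are ON in
    even-indexed frames and OFF in odd-indexed frames. Subtracting each OFF
    frame from the preceding ON frame cancels the static background, and
    averaging the differences suppresses noise by ~sqrt(number of pairs).
    """
    on = frames[0::2].astype(np.float64)
    off = frames[1::2].astype(np.float64)
    n = min(len(on), len(off))
    diff = on[:n] - off[:n]                       # background cancels per pair
    return np.clip(diff.mean(axis=0), 0.0, None)  # lights-only image

# Synthetic demo: static background plus a pulsed point source and noise.
rng = np.random.default_rng(0)
bg = rng.uniform(50, 100, size=(64, 64))
frames = []
for i in range(32):
    f = bg + rng.normal(0, 2, size=bg.shape)
    if i % 2 == 0:
        f[32, 32] += 30.0                         # emitter ON in even frames
    frames.append(f)
print(lockin_lights(np.stack(frames))[32, 32])    # ~30, background removed
```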
An effective enhanced vision system must operate over a broad spectral range in order to offer the pilot an optimized scene that includes the runway background as well as airport lighting and aircraft operations. The large dynamic range of intensities in these images is best handled with separate imaging sensors. The EVS 2000 is a patented dual-band infrared Enhanced Vision System (EVS) that uses image fusion to provide a single image from uncooled infrared imagers combined with SWIR, NIR, or LLLTV sensors. The system is designed to provide commercial and corporate pilots with improved situational awareness at night and in degraded weather conditions, but it can also be used in a variety of applications where the fusion of dual-band or multiband imagery is required. A prototype of this system was recently fabricated and flown on the Boeing Advanced Technology Demonstrator 737-900 aircraft. This paper will discuss the current EVS 2000 concept, show results from the Boeing Advanced Technology Demonstrator program, and discuss future plans for the fusion system.
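The EVS 2000 fusion algorithm itself is patented and not detailed here. Purely as a generic illustration of dual-band fusion, the sketch below normalizes each band to tame the dynamic-range mismatch and blends co-registered frames with a per-pixel weight; the function names, weights, and synthetic frames are assumptions, not the EVS 2000 method.

```python
import numpy as np

def normalize(img):
    """Stretch an image to [0, 1]; this absorbs the large dynamic-range
    mismatch between bands before blending."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo) if hi > lo else np.zeros_like(img)

def fuse_dual_band(lwir, swir, w_lwir=0.5):
    """Blend co-registered LWIR (thermal terrain/background) and SWIR
    (airport lighting) frames into a single display image."""
    return w_lwir * normalize(lwir) + (1.0 - w_lwir) * normalize(swir)

# Usage with synthetic frames (real use assumes pixel-registered sensors):
rng = np.random.default_rng(1)
lwir = rng.uniform(0, 4000, size=(240, 320))   # 12-bit-like thermal frame
swir = rng.uniform(0, 255, size=(240, 320))    # 8-bit lights channel
fused = fuse_dual_band(lwir, swir, w_lwir=0.6)
print(fused.shape, fused.min() >= 0, fused.max() <= 1)
```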
In anticipation of its ultimate role in transport, business and rotary wing aircraft, we clarify the role of Enhanced Vision Systems (EVS): how the output data will be utilized, appropriate architecture for total avionics integration, pilot and control interfaces, and operational utilization. Ground-map (database) correlation is critical, and we suggest that "synthetic vision" is simply a subset of the monitor/guidance interface issue. The core of integrated EVS is its sensor processor. In order to approximate optimal, Bayesian multi-sensor fusion and ground correlation functionality in real time, we are developing a neural net approach utilizing human visual pathway and self-organizing, associative-engine processing. In addition to EVS/SVS imagery, outputs will include sensor-based navigation and attitude signals as well as hazard detection. A system architecture is described, encompassing an all-weather sensor suite; advanced processing technology; inertial, GPS and other avionics inputs; and pilot and machine interfaces. Issues of total-system accuracy and integrity are addressed, as well as flight operational aspects relating to both civil certification and military applications in IMC.
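The neural approximation described in the abstract is not public. As a reference point for what optimal Bayesian multi-sensor fusion means in the simplest (scalar Gaussian) case, the sketch below combines two independent measurements of the same quantity by inverse-variance weighting; the altitude example is hypothetical.

```python
def fuse_gaussian(mu_a, var_a, mu_b, var_b):
    """Bayes-optimal fusion of two independent Gaussian measurements of the
    same quantity: precision-weighted mean, combined precision."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    var = 1.0 / (w_a + w_b)          # fused variance never exceeds either input
    mu = var * (w_a * mu_a + w_b * mu_b)
    return mu, var

# Hypothetical example: radar altitude vs. an IR-derived altitude estimate.
mu, var = fuse_gaussian(102.0, 4.0, 98.0, 1.0)
print(f"fused altitude ~ {mu:.1f} m, variance {var:.2f}")  # ~98.8 m, 0.80
```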
The 1997 Final Report of the 'White House Commission on Aviation Safety and Security' challenged industrial and government concerns to reduce aviation accident rates by a factor of five within 10 years. In the report, the commission encourages NASA, FAA and others 'to expand their cooperative efforts in aviation safety research and development'. As a result of this publication, NASA has since undertaken a number of initiatives aimed at meeting the stated goal. Among these, the NASA Aviation Safety Program was initiated to encourage and assist in the development of technologies for the improvement of aviation safety. Among the technologies being considered are certain sensor technologies that may enable commercial and general aviation pilots to 'see to land' at night or in poor visibility conditions. Infrared sensors have potential applicability in this field, and this paper describes a system, based on such sensors, that is being deployed on the NASA Langley Research Center B757 ARIES research aircraft. The system includes two infrared sensors operating in different spectral bands, and a visible-band color CCD camera for documentation purposes. The sensors are mounted in an aerodynamic package in a forward position on the underside of the aircraft. Support equipment in the aircraft cabin collects and processes all relevant sensor data. Display of sensor images is achieved in real time on the aircraft's Head Up Display (HUD), or other display devices.
A series of tests was conducted to assess the feasibility and performance of a fixed-field, infrared landing monitor system located on the runway. The resultant images are used to provide enhanced vision for ground personnel, in contrast to the more traditional enhanced vision for the flight crew. This paper describes the architecture and design of a dithered 320 by 240 MWIR InSb infrared camera, along with qualitative performance and test results. Issues associated with SWIR/MWIR bandpass selection, FPA type and atmospheric penetration are discussed, as well as resolution requirements. Images of aircraft landing on an aircraft carrier are included for illustrative purposes.
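The camera's dither mechanism is not specified in detail here. The sketch below illustrates the generic 2x2 microscan idea behind a dithered 320 by 240 FPA, interleaving four half-pixel-shifted subframes into a 640 by 480 grid; the offset pattern and ordering are assumptions, not the tested design.

```python
import numpy as np

def microscan_2x2(subframes):
    """Interleave four half-pixel-dithered subframes from a 320x240 FPA
    into a 640x480 image (generic 2x2 microscan; the actual dither pattern
    of the camera described above may differ).

    `subframes` maps (row_offset, col_offset) in {0, 1} x {0, 1} to an
    (H, W) frame captured at that half-pixel dither position.
    """
    h, w = subframes[(0, 0)].shape
    out = np.zeros((2 * h, 2 * w), dtype=np.float64)
    for (dr, dc), frame in subframes.items():
        out[dr::2, dc::2] = frame    # place each subframe on its subgrid
    return out

# Usage with synthetic 320x240 subframes:
rng = np.random.default_rng(2)
subs = {(r, c): rng.uniform(0, 1, size=(240, 320)) for r in (0, 1) for c in (0, 1)}
print(microscan_2x2(subs).shape)     # (480, 640)
```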
Infrared sensors in the nominal 8-12 and 3-5 micron wavebands can be shown to have complementary performance characteristics when used over a range of meteorological conditions. The infrared/optical multisensor for the autonomous landing guidance system integrates staring longwave, midwave, and visible sensors into an environmentally sealed and purged assembly. The infrared modules include specific enhancements for the detection of runways under adverse weather conditions. The sensors incorporate pixel-for-pixel overlap registration, and the fields of view match a conformal head-up display, with sensor/display boresighting to within a fraction of a pixel. Tower tests will be used to characterize the sensors and gather data to support simulation and image-processing efforts. After integration with other elements of the autonomous landing guidance system, flight tests will be conducted on Air Force and commercial transport aircraft. In addition to display and analog video recording, the multisensor data will be digitally captured during critical flight-test phases.
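The abstract does not say how the fraction-of-a-pixel boresight registration is verified. One common approach, sketched below under that assumption, estimates the residual offset between two co-boresighted bands by phase correlation; the names and the synthetic demo are illustrative.

```python
import numpy as np

def registration_offset(img_a, img_b):
    """Estimate the integer-pixel offset between two co-boresighted frames
    by phase correlation; a fraction-of-a-pixel spec would then be checked
    by interpolating around the correlation peak."""
    fa = np.fft.fft2(img_a - img_a.mean())
    fb = np.fft.fft2(img_b - img_b.mean())
    cross = fa * np.conj(fb)
    cross /= np.abs(cross) + 1e-12           # keep phase only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape
    if dy > h // 2:
        dy -= h                              # wrap to signed offsets
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

# Demo: shift a frame by (3, -2) pixels and recover the offset.
rng = np.random.default_rng(3)
a = rng.uniform(0, 1, size=(128, 128))
b = np.roll(a, shift=(3, -2), axis=(0, 1))
print(registration_offset(b, a))             # expected (3, -2)
```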