Modern command and control systems depend on surveillance subsystems to form an overall tactical picture. The use of sensors with different capabilities can improve the quality of the aggregate picture; however, the quality of the fused data depends strongly on the quality of the data supplied to the fusion processor. Before fusion takes place, sensor data must be transformed to a common reference frame. Since each individual sensor's data may be biased, a prerequisite for successful data fusion is the removal of the bias errors contained in the data from all contributing sensors. In this paper, a technique is developed to perform absolute sensor alignment (the removal of bias errors) using information from moving objects, such as low Earth orbit satellites, that obey Kepler's laws of motion.
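The paper's own alignment algorithm is not reproduced here, but the underlying idea can be illustrated with a minimal sketch: a satellite trajectory propagated under Kepler's laws provides an absolute reference, and the residuals between a sensor's reports and that reference yield an estimate of the sensor's constant bias. All function names, orbital parameters, and noise levels below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the paper's algorithm): estimating a constant sensor
# position bias by comparing noisy sensor reports of a satellite against a
# Keplerian (two-body) reference trajectory. All parameters are illustrative.
import numpy as np

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter [m^3/s^2]

def two_body_accel(r):
    """Two-body (Keplerian) gravitational acceleration on the satellite."""
    return -MU_EARTH * r / np.linalg.norm(r) ** 3

def propagate(r0, v0, dt, steps):
    """Propagate the orbit with a fixed-step RK4 integrator."""
    states = []
    r, v = np.asarray(r0, float), np.asarray(v0, float)
    for _ in range(steps):
        states.append(r.copy())
        k1r, k1v = v, two_body_accel(r)
        k2r, k2v = v + 0.5 * dt * k1v, two_body_accel(r + 0.5 * dt * k1r)
        k3r, k3v = v + 0.5 * dt * k2v, two_body_accel(r + 0.5 * dt * k2r)
        k4r, k4v = v + dt * k3v, two_body_accel(r + dt * k3r)
        r = r + dt / 6.0 * (k1r + 2 * k2r + 2 * k3r + k4r)
        v = v + dt / 6.0 * (k1v + 2 * k2v + 2 * k3v + k4v)
    return np.array(states)

# Reference trajectory: a roughly circular low Earth orbit (illustrative values).
r0 = [6771e3, 0.0, 0.0]       # ~400 km altitude [m]
v0 = [0.0, 7.67e3, 0.0]       # near-circular orbital speed [m/s]
truth = propagate(r0, v0, dt=10.0, steps=360)

# Simulated sensor reports: truth corrupted by an unknown constant bias plus noise.
rng = np.random.default_rng(0)
true_bias = np.array([500.0, -300.0, 200.0])   # metres, unknown to the estimator
reports = truth + true_bias + rng.normal(scale=50.0, size=truth.shape)

# For a constant additive bias, the least-squares estimate is the mean residual
# between the sensor reports and the Keplerian prediction.
bias_estimate = (reports - truth).mean(axis=0)
print("estimated bias [m]:", bias_estimate)
```

In practice the reference trajectory would come from precise orbital elements rather than an assumed initial state, and the bias model would be expressed in the sensor's own measurement frame (e.g., range and angles) rather than Cartesian coordinates; the sketch only shows how a Keplerian prediction can serve as the absolute standard for bias removal.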