In this paper, we explore the use of digital holography with a high-speed camera to sense and correct atmospheric turbulence. Deep turbulence, a particularly challenging regime of atmospheric turbulence, degrades the performance of both imaging and directed-energy systems. Characterizing and correcting atmospheric turbulence requires knowledge of the phase errors it induces on the wavefront, and digital holography provides the capability to measure those phase errors even in the most challenging atmospheric conditions. Previous laboratory experiments at the United States Air Force Academy have demonstrated both sensing and correction of simulated atmospheric turbulence at discrete planes using digital holography. In this work, we design and test a digital holography system capable of imaging in relevant atmospheric conditions, extending the optical design of the existing laboratory-based system with a high-speed camera so that it can more accurately measure real-world turbulence. Our results detail the performance of the digital holography system in a controlled test environment and provide data on the feasibility of integrating digital holography into future fielded systems.
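The abstract does not describe the reconstruction pipeline itself; as a rough illustration of how digital holography can recover wavefront phase errors, the sketch below demonstrates one common approach, off-axis (spatial-carrier) Fourier demodulation. The function name, parameters, and circular-filter choice are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def extract_phase(hologram, carrier_freq, filter_radius):
    """Recover the wrapped wavefront phase from an off-axis digital
    hologram via Fourier-domain demodulation (illustrative sketch).

    hologram      : 2-D array of recorded irradiance
    carrier_freq  : (fy, fx) location of the +1-order sideband in
                    cycles per pixel (set by the reference-beam tilt)
    filter_radius : radius of the circular passband, cycles per pixel
    """
    ny, nx = hologram.shape
    fy = np.fft.fftfreq(ny)[:, None]   # cycles per pixel, vertical
    fx = np.fft.fftfreq(nx)[None, :]   # cycles per pixel, horizontal

    # The object information rides on the carrier at +/- carrier_freq,
    # away from the zero-order (DC and autocorrelation) terms.
    H = np.fft.fft2(hologram)

    # Isolate the +1-order sideband with a circular binary filter.
    mask = ((fy - carrier_freq[0]) ** 2 +
            (fx - carrier_freq[1]) ** 2) < filter_radius ** 2

    # Inverse transform the filtered spectrum, then remove the carrier
    # by remodulating in the spatial domain.
    field = np.fft.ifft2(H * mask)
    y = np.arange(ny)[:, None]
    x = np.arange(nx)[None, :]
    field *= np.exp(-2j * np.pi * (carrier_freq[0] * y +
                                   carrier_freq[1] * x))

    # The angle of the demodulated complex field is the wavefront
    # phase (turbulence-induced error plus static aberrations), mod 2*pi.
    return np.angle(field)
```

In practice the wrapped phase returned here would still need unwrapping and removal of static system aberrations before it could drive a corrective element.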
Neuromorphic cameras, or Event-based Vision Sensors (EVS), operate in a fundamentally different way than conventional frame-based cameras. Their unique operational paradigm produces a sparse stream of high-temporal-resolution output events that encode pixel-level brightness changes with low latency and wide dynamic range. Recently, interest has grown in exploiting these capabilities for scientific studies; however, accurately reconstructing signals from the output event stream is challenging because of the physical limitations of the analog circuits that implement logarithmic change detection. In this paper, we present simultaneous recordings of lightning strikes using both an event camera and a frame-based high-speed camera. To our knowledge, this is the first side-by-side recording with these two sensor types in a real-world scene with challenging dynamics, including very fast and bright illumination changes. Our goal in this work is to accurately map scene illumination to EVS output in order to better inform modeling and reconstruction of events from a real scene. We first combine lab measurements of key performance metrics to inform an existing pixel model. We then use the high-speed frames as signal ground truth to simulate an event stream and refine parameter estimates so that the simulation optimally matches the event-based sensor response for several dozen pixels representing different regions of the scene. These results will be used to predict sensor response and to develop methods that more precisely reconstruct lightning and sprite signals for Falcon ODIN, our upcoming International Space Station neuromorphic sensing mission.
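To make the logarithmic change-detection idea concrete, the following minimal Python sketch simulates an idealized event stream from a high-speed frame sequence, in the spirit of the frames-as-ground-truth simulation described above. It is a first-order model only: the function name and parameters are hypothetical, and it ignores pixel latency, refractory period, bandwidth limits, and noise, which are exactly the analog non-idealities the paper sets out to characterize.

```python
import numpy as np

def simulate_events(frames, timestamps, threshold=0.2, eps=1e-3):
    """Generate an idealized event stream from frames via logarithmic
    change detection (illustrative sketch, not the paper's pixel model).

    frames     : (T, H, W) array of linear pixel intensities
    timestamps : (T,) frame times in seconds
    threshold  : nominal log-intensity contrast threshold
    Returns a list of (t, y, x, polarity) tuples.
    """
    # EVS pixels respond to log intensity; eps guards against log(0).
    log_intensity = np.log(frames.astype(np.float64) + eps)

    # Per-pixel memory of the log level at the last emitted event.
    ref = log_intensity[0].copy()
    events = []

    for k in range(1, len(frames)):
        diff = log_intensity[k] - ref
        # Pixels whose log change crossed the contrast threshold fire.
        fired = np.abs(diff) >= threshold
        for y, x in zip(*np.nonzero(fired)):
            polarity = 1 if diff[y, x] > 0 else -1
            events.append((timestamps[k], int(y), int(x), polarity))
            # Reset the reference level after each event, as the analog
            # change detector does when it fires.
            ref[y, x] = log_intensity[k, y, x]

    return events
```

Fitting a model like this to the real sensor response amounts to adjusting parameters such as `threshold` per pixel until the simulated stream best matches the recorded events, which is the kind of refinement the abstract describes for several dozen pixels across the scene.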