This PDF file contains the front matter associated with SPIE Proceedings Volume 12693, including the Title Page, Copyright information, Table of Contents, and Conference Committee information.
Airborne scattering media, such as fog, obstruct imaging by destroying the one-to-one object-to-image mapping through a multiple scattering process. Existing research in this area is predicated on developing methods to image despite such scattering media, and relatively little attention has been paid to the reverse goal: using scattering media to intentionally obstruct imaging in a controlled manner. In this work we investigate how a plume of engineered aerosol can create an asymmetric vision environment, in which the ease of imaging tasks depends on view direction. Surprisingly, the principle of electromagnetic reciprocity does not appear to preclude such an outcome. Our approach uses engineered asymmetrically scattering aerosol particles under real-time acoustic alignment. In this talk, I will share our initial work to develop an understanding of the optical effects of asymmetric single-particle scattering and how multiple scattering from an aligned array of such particles translates into degradation of real-world imaging capabilities.
Sparse apertures can greatly improve size, weight, power, and cost (SWaP-C) in comparison to larger multi-meter-diameter monolithic apertures for Intelligence, Surveillance and Reconnaissance (ISR) applications. However, their system design is more complex and requires careful optimization of the aperture configuration. The Air Force Research Lab (AFRL) has developed a simulation toolkit that allows rapid prototyping of sparse aperture designs to evaluate their optical performance, assisting in concept design development. The toolkit is based upon a novel formalism that combines imaging and interferometry principles within the quasi-monochromatic approximation to build a complex synthetic pupil function containing all the sub-pupil phase information. The sub-pupil phase information is customizable with user-defined low-order Zernike terms (i.e., piston and tilt) or by importing higher-order Zernike terms from commercial interferometric optical test results. This toolkit has been validated using a two-aperture sparse aperture testbed that is constructed entirely from commercial off-the-shelf (COTS) components and is reconfigurable. In addition, we have evaluated the performance of a six-aperture interferometric imaging array developed by HartSCI. With these recent advances in testing and modeling capabilities, AFRL hopes to investigate opportunities for sparse aperture technologies that integrate into the modern ISR domain.
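As an illustration of the underlying formalism, a complex synthetic pupil with per-sub-aperture piston phase can be built on a grid and propagated to a quasi-monochromatic PSF with a Fourier transform. This is a minimal sketch, not the AFRL toolkit itself; the grid size, aperture geometry, and piston values are illustrative assumptions:

```python
import numpy as np

N = 256                           # grid size (illustrative)
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)

# two circular sub-apertures; centers, radius, and pistons are assumptions
centers = [(-0.4, 0.0), (0.4, 0.0)]
radius = 0.25
pistons = [0.0, 0.5]              # piston phase per sub-pupil, radians

# complex synthetic pupil: unit amplitude, piston phase in each sub-pupil
pupil = np.zeros((N, N), dtype=complex)
for (cx, cy), p in zip(centers, pistons):
    mask = (X - cx) ** 2 + (Y - cy) ** 2 <= radius ** 2
    pupil[mask] = np.exp(1j * p)

# quasi-monochromatic PSF: squared magnitude of the pupil's Fourier transform
psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil))) ** 2
psf /= psf.sum()
```

Tilt terms would enter the same way, as a linear phase ramp across each sub-pupil mask.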
Multi-wavelength digital holography is an imaging modality by which one collects multiple 2D coherent images of an object with different illumination wavelengths and digitally processes them in order to obtain a 3D coherent image. For conventional 3D imaging, the image formation process requires a fixed-phase relationship between the 2D coherent images, and, as such, object motion and/or vibration between 2D image collections can prevent the formation of a 3D image. One previously studied method compensates for object motion through the use of a second illumination frequency, or pilot tone. In this paper, we explore two altered 3D imaging modalities, dual-tone 3D imaging and stairstep 3D imaging, both of which, like pilot-tone 3D imaging, make use of holographic multiplexing. We compare conventional, pilot-tone, dual-tone, and stairstep 3D imaging in simulation. The results indicate that the stairstep approach offers the best overall performance for range imaging. Additionally, we employed a range unwrapping algorithm to unwrap range images for the modalities studied here. The results show that range unwrapping is usually successful if the measured discontinuities in the range dimension are no larger than half the range ambiguity interval.
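The half-ambiguity criterion can be illustrated with a minimal 1-D range-unwrapping sketch (not the algorithm used in the paper): any jump between neighboring samples larger than half the range ambiguity interval is treated as a wrap and corrected by a whole multiple of the interval.

```python
import numpy as np

def unwrap_range(r, ambiguity):
    """Unwrap a 1-D range profile measured modulo `ambiguity`.
    Jumps exceeding half the ambiguity interval are assumed to be wraps."""
    r = np.asarray(r, dtype=float)
    d = np.diff(r)
    wraps = -np.round(d / ambiguity)              # whole intervals to add back
    correction = np.concatenate(([0.0], np.cumsum(wraps))) * ambiguity
    return r + correction
```

This succeeds exactly when the true sample-to-sample range changes stay below half the ambiguity interval, matching the criterion stated above.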
Since the pioneering contributions of Labeyrie, researchers have made tremendous strides in developing techniques for imaging through turbulent media such as the atmosphere. Imaging through turbid (scattering) media is a more challenging problem. Historically, researchers assumed that scattered light is so chaotic that it carries little or no information. Their approach was to retrieve the weak ballistic (unscattered) signal in the presence of the dominant and confounding scattered signal. In more recent years, researchers have demonstrated focusing light and imaging through thin scattering volumes (diffusers) by actually utilizing the scattered light, illustrating that scattered light also carries information. However, these demonstrations require access to the object plane for inserting detectors, beacons, or a fluorescing agent. We introduce a novel method for imaging through an unknown diffuser with a strictly one-sided observation, wherein the observer has no access to the object plane (for illumination or diffuser characterization) nor does the observer need to label the object with a fluorescing agent. The method requires laser illumination, a digital-holographic data collection, and the use of an image-sharpness criterion to jointly estimate the specific diffuser response and the image that would be obtained by removing the diffuser. Estimation is accomplished with off-line processing after the data are acquired. We demonstrate the method in simulation using a thin diffuser. We also suggest a framework under which the method can be generalized for use with thick diffusers. Our approach shows promise for imaging into human tissue, clouds, fog, smoke, suspended particulates, tree canopy, or other scattering media.
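As an illustration of an image-sharpness criterion of the kind used above, the classic Muller-Buffington metric S = sum(I**beta) can serve; that this specific metric matches the one used in the paper is an assumption.

```python
import numpy as np

def sharpness(img, beta=2):
    """Muller-Buffington-style sharpness metric S = sum(I**beta) on a
    normalized image; S is maximized when the image is well compensated
    (energy concentrated), and drops as aberrations spread the light."""
    i = img / img.sum()
    return float(np.sum(i ** beta))
```

A joint estimator would adjust the modeled diffuser response to maximize this metric over the reconstructed image.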
Facial classification has numerous real-world applications in fields such as security and surveillance. However, images collected at long range through the atmosphere exhibit spatially and temporally varying blur and geometric distortion due to turbulence, making facial identification challenging. A multispectral facial classification approach utilizing machine learning is proposed for long-range imaging. A method for simulating turbulence effects is applied to a multispectral face-image database to generate turbulence-degraded images. The performance of the machine learning method on this classification task is assessed to explore the effectiveness of multispectral imaging for improving classification accuracy over long ranges.
Remote sensing measurements are increasingly becoming diffraction-limited by default, taking advantage of the improved resolution and sensitivity afforded by AO systems. A new and diverse set of observations, using imagers, spectrographs, polarimeters, and other instruments, will become possible even in the presence of "deep turbulence." With this vision in mind, and recognizing that the rate-limiting step in AO performance is ultimately governed by that of its wavefront sensor (WFS), the Beam Control Lab at the University of Notre Dame is developing remote sensing technologies that will advance a number of fields relevant to academia and industry. Our recent lab experiments have shown that a Fresnel WFS is an order of magnitude more sensitive than a comparable Shack-Hartmann WFS and is also immune to scintillation. In this presentation, I will describe progress in advancing the technology readiness level of the Fresnel WFS.
The estimation of phase errors due to atmospheric turbulence from digital-holography (DH) data with high throughput and low latency is critical for applications such as wavefront sensing and directed energy. The problem of focusing outgoing directed energy is particularly difficult because the phase errors must be estimated with extremely low latency for use in closed-loop correction of the outgoing light before the atmospheric parameters decorrelate. This low-latency requirement necessitates that the phase distortion be estimated from a single shot of DH data. The Dynamic DH-MBIR (DDH-MBIR) algorithm has been shown to be capable of accurately estimating isoplanatic phase errors using the expectation-maximization (EM) algorithm; however, DDH-MBIR was introduced using data that models only frozen flow of atmospheric turbulence. In this paper, we characterize the performance of the Dynamic DH-MBIR algorithm in more realistic settings. Specifically, Dynamic DH-MBIR produces accurate phase estimates in the case of moderate levels of atmospheric boiling.
We present reconfigurable optical sensors that can directly detect spectral and spatial features of incident light, enabled by the reconfigurability of the devices and the implementation of machine learning algorithms for information encoding and decoding. Photosensors for image classification, spectral mixture analysis, autoencoding, and compressed sensing will be discussed. In all these devices, the computation is performed at the lowest possible level of the sensor system hierarchy – the physical level of photodetection – and does not require any external processing of the measurement data.
In the past decade, sensor array designs have increasingly made use of the three-dimensionality of the sensor layer to improve sensor performance and to gain new measurement capabilities. With a playful mindset, we look into several outlandish concepts for volumetric sensor designs: designs that are physically plausible but which remain conceptual. With these designs, we consider some advanced measurement capabilities one could achieve with such volumetric sensor-layer architectures and outline how such designs might conceivably be physically realized.
The tilted shearing interferometer (tSI) wavefront sensor (WFS) is being developed to expand characterization capabilities for optical propagation experiments at the Air Force Research Laboratory's Environmental Laser Test Facility (ELTF). The instrument utilizes the phase retrieval technique of a digital holographic WFS with laterally sheared beams rather than a local-oscillator reference. This WFS provides gradients similar to a Shack-Hartmann WFS, allowing it to benefit from all of the processing developed for the Small Mobile Atmospheric Sensing Hartmann (SMASH). At the same time, the interferometric nature of the wavefront sensor provides access to additional information, i.e., branch cuts. Initial development of the tilted shearing interferometer in the Air Force Research Laboratory's Beam Control Lab (BCL) is presented.
In this paper, two methods for identifying branch points from Shack–Hartmann wavefront sensor (SHWFS) measurements were studied: the circulation-of-phase-gradients approach and the beam-spread approach. These approaches were tested using a simple optical-vortex model, wave-optics simulations, and experimental data. It was found that the two approaches are complementary in their abilities to detect branch points. Specifically, the beam-spread approach works best when the branch point is located toward the center of the SHWFS's lenslet pupil, while the circulation-of-phase-gradients approach works best when the branch point is located toward the edge of the SHWFS's lenslet pupil. These behaviors were first observed with the simple optical-vortex model and were further corroborated by the wave-optics and experimental results. The developments presented here support researchers looking to study high-scintillation optical-turbulence environments and will inform efforts to develop branch-point-tolerant reconstruction algorithms.
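The circulation test can be sketched as follows: summing wrapped phase differences around a closed loop yields plus or minus 2*pi when the loop encloses a branch point and zero otherwise. The grid size and the synthetic vortex model below are illustrative, not the paper's data:

```python
import numpy as np

def wrap(p):
    """Wrap phase differences into [-pi, pi)."""
    return (p + np.pi) % (2 * np.pi) - np.pi

def circulation(phase, i, j):
    """Sum of wrapped phase differences around the 2x2 loop whose upper-left
    corner is (i, j); a result of +-2*pi indicates an enclosed branch point."""
    loop = [phase[i, j], phase[i, j + 1], phase[i + 1, j + 1],
            phase[i + 1, j], phase[i, j]]
    return sum(wrap(b - a) for a, b in zip(loop[:-1], loop[1:]))

# synthetic optical vortex centered between grid points
n = 8
y, x = np.mgrid[0:n, 0:n] - (n - 1) / 2.0
vortex = np.arctan2(y, x)
c = circulation(vortex, n // 2 - 1, n // 2 - 1)   # loop enclosing the core
```

A loop far from the core (e.g., at the grid corner) returns zero, which is why the test localizes branch points rather than merely detecting them.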
Gas densities of various gaseous species are measured up to 16.5 barg via spontaneous Rayleigh-Brillouin scattering (SRBS) induced by a focused 532-nm laser pulse. The laser pulse energy is monitored and controlled according to the gas density to enhance the signal-to-noise ratio (SNR) of the SRBS signal while remaining below the threshold of laser-induced breakdown. The SRBS signal is spectrally dispersed by a virtually imaged phased array (VIPA) and formed into a fringe pattern. The SRBS spectrum is reconstructed in the frequency domain from each fringe pattern and ensemble-averaged for the gas density measurement.
This paper will propose a new method of measuring path integrated turbulence using Quick Response (QR) codes. The QR turbulence estimation theory will be presented, and results using a normal camera and the Digital Adaptive Optics system under development at the Naval Information Warfare Center Pacific will be explored.
The MMT Adaptive Optics exoPlanet Characterization System (MAPS) is a comprehensive update to the first-generation MMT adaptive optics system (MMTAO), designed to produce a facility-class suite of instruments whose purpose is to image nearby exoplanets. The system's adaptive secondary mirror (ASM), although composed in part of legacy components from the MMTAO ASM, represents a major leap forward in design, structure, and function. The subject of this paper is the design, operation, achievements, and technical issues of the MAPS adaptive secondary mirror. We discuss laboratory preparation for on-sky engineering runs, the results of those runs and the issues we discovered, what we learned about those issues in a follow-up period of laboratory work, and the steps we are taking to mitigate them.
Standard methods of Shack-Hartmann wavefront reconstruction rely on solving a system of linear equations, extracting wavefront estimates from measured wavefront slopes, which are calculated by retrieving centroids from a Shack-Hartmann wavefront sensor (SHWFS). As the dimensions of the micro-lens array in the SHWFS increase, the computational cost of processing wavefronts can become increasingly expensive. For applications that require rapid and accurate computations, such as closed-loop adaptive-optics systems, traditional centroiding and least-squares reconstruction become the main bottleneck limiting performance. In this work, we apply a convolutional neural network (CNN) approach to directly reconstruct wavefronts from raw SHWFS measurements, circumventing both bottlenecks. The CNN model utilizes the ResU-Net framework to perform a zonal wavefront reconstruction, and a method for preprocessing the raw data was investigated with the prospect of enhancing the accuracy of this model specifically for the zonal approach to wavefront reconstruction.
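For context, the conventional pipeline that the CNN replaces, per-lenslet centroiding followed by a least-squares zonal reconstruction, can be sketched as below. The cell geometry, Southwell-style slope averaging, and unit grid spacing are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def centroids(img, n_sub):
    """Centroid offset of each lenslet cell in an n_sub x n_sub grid;
    this per-frame loop is one of the bottlenecks a CNN would replace."""
    h = img.shape[0] // n_sub
    out = np.zeros((n_sub, n_sub, 2))
    ys, xs = np.mgrid[0:h, 0:h]
    for i in range(n_sub):
        for j in range(n_sub):
            cell = img[i * h:(i + 1) * h, j * h:(j + 1) * h]
            s = cell.sum()
            out[i, j] = (np.sum(ys * cell) / s - (h - 1) / 2,
                         np.sum(xs * cell) / s - (h - 1) / 2)
    return out

def zonal_reconstruct(sy, sx):
    """Least-squares zonal reconstruction on an n x n grid (unit spacing):
    phase differences between neighbors must match the averaged slopes."""
    n = sx.shape[0]
    rows, cols, data, b = [], [], [], []
    eq = 0
    for i in range(n):
        for j in range(n - 1):            # x-direction differences
            rows += [eq, eq]; cols += [i * n + j + 1, i * n + j]; data += [1, -1]
            b.append((sx[i, j] + sx[i, j + 1]) / 2); eq += 1
    for i in range(n - 1):                # y-direction differences
        for j in range(n):
            rows += [eq, eq]; cols += [(i + 1) * n + j, i * n + j]; data += [1, -1]
            b.append((sy[i, j] + sy[i + 1, j]) / 2); eq += 1
    A = np.zeros((eq, n * n))
    A[rows, cols] = data
    phi, *_ = np.linalg.lstsq(A, np.array(b), rcond=None)
    return phi.reshape(n, n) - phi.mean()   # piston removed
```

The least-squares solve scales poorly with lenslet count, which is the cost a direct CNN mapping from raw SHWFS frames avoids.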
Temporally varying aberrations can significantly increase the uncertainty of imaging-based measurement techniques. One example is three-dimensional microscopic flow measurement in oscillating droplets through the water-air interface. In this contribution, 3D microscopy based on a double-helix point spread function is combined with real-time aberration correction in order to compensate for the refraction at the fluctuating interface and to reduce the measurement uncertainty. The results have the potential to improve the water management of fuel cells and to reduce the consumption of fossil fuels.
Atmospheric Characterization I: Joint Session with Conferences 12691 and 12693
Sonic anemometers are devices that use ultrasound to provide instantaneous wind velocity and sonic temperature measurements. One of these devices, in conjunction with other meteorological equipment, characterizes the local atmosphere at a fixed point. Combining multiple sonic anemometers can provide an estimate of the refractive-index structure constant, Cn2(z), along a beam path. This work details this process for the characterization of an optical propagation path for use in evaluating the performance of turbulence measurement instruments. Experimental results are presented from a one-kilometer horizontal path.
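One common route from sonic-anemometer data to Cn2 goes through the temperature structure constant CT2 and the standard optical-wavelength relation Cn2 = (79e-6 * P / T^2)^2 * CT2, with humidity effects neglected. The sketch below uses illustrative pressure, temperature, and CT2 values, not data from this experiment:

```python
def cn2_from_ct2(ct2, pressure_hpa, temp_k):
    """Refractive-index structure constant Cn2 (m^-2/3) at optical
    wavelengths from the temperature structure constant CT2 (K^2 m^-2/3),
    via the standard relation Cn2 = (79e-6 * P / T**2)**2 * CT2
    with P in hPa and T in K; humidity contribution neglected."""
    return (79e-6 * pressure_hpa / temp_k ** 2) ** 2 * ct2

# illustrative surface-layer values
cn2 = cn2_from_ct2(ct2=0.05, pressure_hpa=1013.0, temp_k=300.0)
```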
Estimating the profile of the refractive-index structure constant, Cn2(z), is of great importance for characterizing the turbulence through which adaptive optical systems operate. Stereo Scintillation Detection and Ranging (SCIDAR) is one of the well-developed techniques for producing such a profile using light from binary stars. The Air Force Research Laboratory's Starfire Optical Range (SOR) is beginning work to add a stereo SCIDAR capability to the site. This work presents the development and testing of a stereo SCIDAR system in the Atmospheric Simulation and Adaptive Optics Laboratory Testbed (ASALT) at SOR. The stereo SCIDAR system was constructed on the ASALT lab's Multiconjugate Adaptive Optics (MCAO) bench, which features an enhanced atmospheric turbulence simulator (ATS) that can use up to 10 phase screens to test the capabilities of the stereo SCIDAR system in profiling distributed turbulence under a wide range of conditions.
In an earlier work, we demonstrated a method to profile turbulence using time-lapse imagery of a distant target from five spatially separated cameras. Extended features on the target were tracked, and by measuring the variances of the differences in wavefront tilts sensed between cameras for all pairs of target features, turbulence information along the imaging path could be extracted. The method is relatively low cost and does not require sophisticated instrumentation. Turbulence can be sensed remotely from a single site without deployment of sources or sensors at the target location. Additionally, the method is phase based and hence has an advantage over irradiance-based techniques, which suffer from saturation issues. The same concept has been applied to understand how turbulence changes with altitude in the surface layer. Short-exposure images of a 30 m tall water tower were analyzed to obtain turbulence profiles along the imaging path. The experiment was performed over two clear days from mid-morning to early afternoon. The turbulence profiles show the expected drop in turbulence with altitude; however, the rate at which turbulence decreased with altitude was different close to the ground than at higher altitudes.
Knowledge of the atmospheric conditions along an optical path is crucial to many experiments. A technique using differential scintillations was adapted for the Small Mobile Atmospheric Sensing Hartmann (SMASH) system to estimate profiles of the refractive-index structure constant, Cn2(z), and the wind speed. Estimates of those parameters from data taken along a 1 km horizontal path over level ground at a height of about five feet at Kirtland AFB are presented. Five sonic anemometers, placed along the path, serve as an independent estimate of the turbulence conditions with which to evaluate SMASH's performance.
Atmospheric Characterization II: Joint Session with Conferences 12691 and 12693
Surface-layer optical turbulence values in the form of CT2 or Cn2 are often calculated from surface-layer temperature, moisture, and wind characteristics and compared to measurements from sonic anemometers, differential temperature sensors, and imaging systems. A key derived component needed in the surface-layer turbulence calculations is the sensible heat. Typically, the sensible heat is calculated using the bulk aerodynamic method, which assumes a certain surface roughness and a "friction velocity" that approximates the turbulent drag on temperature and moisture mixing from the change in the average surface-layer vertical wind velocity. These assumptions and approximations generally apply only in free-convection conditions. A more robust method for obtaining the sensible heat, applicable when free-convection conditions are not occurring, is the energy balance, or Bowen ratio, method. The Bowen ratio, the ratio of sensible heat flux to latent heat flux, allows a more direct assessment of the optical-turbulence-driving surface-layer sensible heat flux than do more traditional assessments. This study compares surface-layer CT2 and Cn2 values using sensible heat values from the bulk aerodynamic and energy balance methods to measurements from instruments such as sonic anemometers, differential temperature sensors, and time-lapse imagery. This research further compares improvements to the calculations gained by using sonic-anemometer eddy-covariance values to obtain the friction velocity, and by including humidity effects via covariance methods or simply by using virtual temperature from the sonic anemometers.
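The energy-balance route to sensible heat reduces to simple arithmetic: with the surface energy balance Rn - G = H + LE and the Bowen ratio B = H / LE, the sensible heat is H = B * (Rn - G) / (1 + B). The flux values in the sketch below are illustrative, not data from this study:

```python
def sensible_heat_bowen(net_radiation, ground_flux, bowen_ratio):
    """Sensible heat flux H (W m^-2) from the surface energy balance
    Rn - G = H + LE together with the Bowen ratio B = H / LE,
    which gives H = B * (Rn - G) / (1 + B)."""
    return bowen_ratio * (net_radiation - ground_flux) / (1.0 + bowen_ratio)

# illustrative midday values: Rn = 500, G = 50 W m^-2, B = 0.8
H = sensible_heat_bowen(500.0, 50.0, 0.8)   # about 200 W m^-2
```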
The sonic anemometer makes rapid measurements of air temperature and wind velocity, which are then used to quantify atmospheric turbulence. Turbulence strength is estimated from the parameters of a curve fit to a structure function computed from the measured data. This procedure was carried out for both experimental and simulated data, and the differences between the results were examined. Averaging effects due to the measurement interval caused changes in both measured and simulated results, mostly represented by an offset in the simple theoretical structure function. An additional offset was observed in the simulated results due to frequencies cut off by the simulation method. This study also examined the effect of the finite sample length on the computed power spectrum and structure function. This effect appears to be unimportant for the Kolmogorov power spectrum usually presumed here, but it is shown that non-Kolmogorov power spectra do not necessarily produce accurate results even in simulation.
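The structure-function computation and curve fit described above can be sketched as follows. The pure-Kolmogorov fit without the offset term discussed in the paper is a simplifying assumption, as is the unit mapping from time lag to separation:

```python
import numpy as np

def structure_function(series, max_lag):
    """Temporal structure function D(k) = <[x(t+k) - x(t)]**2> for integer
    lags k = 1..max_lag; Taylor's frozen-turbulence hypothesis maps a time
    lag tau to a spatial separation r = V * tau for mean wind speed V."""
    s = np.asarray(series, dtype=float)
    return np.array([np.mean((s[k:] - s[:-k]) ** 2)
                     for k in range(1, max_lag + 1)])

def fit_c2(d, r):
    """Least-squares fit of the Kolmogorov form D(r) = C2 * r**(2/3).
    (The paper's fit also includes an offset term to absorb averaging
    effects; that term is omitted in this sketch.)"""
    basis = r ** (2.0 / 3.0)
    return float(basis @ d / (basis @ basis))
```

For temperature series, the fitted C2 is CT2; the offset discussed in the abstract would appear as a constant added to the Kolmogorov term.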
The Turbulence and Aerosol Research Dynamic Interrogation System (TARDIS) is an optical sensing system based on dynamically changing the range between the collecting sensor and a Rayleigh beacon during a static period of relatively unchanging turbulence-induced wavefront perturbations. In the past, obtaining measurement-based estimates of the turbulence strength profile from TARDIS was based on collating segmented refractive-index structure parameter (Cn2) values traced to specific layers of the atmosphere. These values were developed from Fried-parameter segments, which were deduced from differential tilt variance measurements from neighboring subapertures on the Shack-Hartmann wavefront sensor. In this work, we exploit the crossings between the sensing paths from the different beacon locations (during a static period) to the wavefront sensor subapertures to derive turbulence profiles along the path. The differential tilt variance between a pair of subapertures due to a pair of beacons at two different ranges in a crossed-sensing-path configuration has a unique turbulence weighting function associated with it, which depends on the geometry of the beacons and the subapertures. By using these unique path weighting functions along with the corresponding measured differential tilt variances for all configurations where the sensing paths cross, Cn2 profiles along the path can be constructed. The derivation of the weighting functions will be discussed, and derived profiles will be compared to measurements from other profiling instruments, such as MZA's DELTA-Sky, and to numerical weather prediction models.
Beam propagation systems are often used in a wide range of atmospheric environments. Therefore, it is important to be able to characterize those environments in order to appropriately assess performance and inform design decisions. In this paper, a variety of methods for measuring the atmospheric coherence length, r0, were analyzed, including a Shack–Hartmann-based differential image motion monitor (DIMM), gradient-tilt variance, slope discrepancy variance, and phase variance methods, as well as the modulation transfer function (MTF). These methods were tested in varying turbulence strength environments with known atmospheric coherence lengths, first using a single modified von Kármán phase screen, then using full wave-optics simulations with 20 phase screens. The Shack–Hartmann-based approaches were shown to greatly increase in error for d/r0 > 1 due to discrepancies between the gradient tilt and the centroid tilt measured from the SHWFS's image-plane irradiance patterns. An atmospheric data collection system was built, and experimental results were taken for a beam propagating 2.4 km through a littoral environment over a 24-hour period.
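For reference, a minimal sketch of the DIMM branch of these methods, assuming the commonly used Sarazin–Roddier constants for the longitudinal differential image-motion variance (the paper's exact formulation may differ):

```python
import numpy as np

def r0_from_dimm(var_l, wavelength, D, d):
    """Estimate the Fried parameter r0 [m] from the longitudinal differential
    image-motion variance var_l [rad^2] of a two-aperture DIMM.

    Uses the standard Sarazin & Roddier relation:
        var_l = 2 * lambda^2 * r0**(-5/3) * (0.179*D**(-1/3) - 0.0968*d**(-1/3))
    where D is the subaperture diameter and d the subaperture separation [m].
    """
    K = 2.0 * wavelength**2 * (0.179 * D**(-1/3) - 0.0968 * d**(-1/3))
    return (K / var_l) ** (3.0 / 5.0)

# Round-trip check: pick an r0, synthesize the variance, recover r0.
lam, D, d = 500e-9, 0.1, 0.3        # wavelength [m], aperture [m], separation [m]
r0_true = 0.05
var_l = 2.0 * lam**2 * r0_true**(-5/3) * (0.179 * D**(-1/3) - 0.0968 * d**(-1/3))
r0_est = r0_from_dimm(var_l, lam, D, d)
```

The tilt-variance and slope-discrepancy methods follow the same pattern: each relates a measurable variance to r0 through a known geometric constant, which is then inverted.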
This paper develops a phase reconstruction algorithm for the Shack–Hartmann wavefront sensor that is tolerant to sharp phase gradients, such as those imposed by shock waves. This tolerance will enable robust wavefront sensing in transonic, supersonic, and hypersonic environments using a Shack–Hartmann wavefront sensor.
Previously, Alyones et al. (Appl. Opt. 54 (2015)) showed that a pair of aerosol plumes, one purely scattering and the other purely absorbing, in the presence of a bright interferent has the interesting property that the imaging contrast ratios through the pair of plumes differ in the two opposing directions. In this article, we generalize their earlier findings to plumes that have both absorption and scattering properties. In addition, by using the radiative transfer equation and its path integral solution, our method incorporates path length dependence, angular broadening due to multiple scattering, and arbitrary interferent locations away from the zenith. We demonstrate that including these effects can significantly alter the asymmetry predicted by the model of Alyones et al., and we provide estimates of the optimal image contrast enhancements for different transport parameters.
Resolving point sources with low angular separation is vital in astronomy, for example in the search for exoplanets. Traditional imaging is time-consuming, so we propose quantum-accelerated imaging (QAI) to reduce measurement time using an information-theoretic approach. QAI maximizes the Fisher information per detected photon by adaptively learning optimal measurements from data. Using linear-projection instruments and single-photon detectors, we estimate the position, brightness, and number of unknown stars 10-100 times faster than direct imaging. Beyond astronomy, QAI is scalable and applicable to high-speed imaging, fluorescence microscopy, and optical readout of qubits.
We explore the use of plenoptic data for performing passive non-line-of-sight imaging using light scattered from interior hallways at visible, long-wave infrared, and terahertz frequencies. The use of longer wavelength radiation in the LWIR and THz bands can increase the retrievable NLOS image information in comparison to visible radiation. However, significant scattering effects at LWIR wavelengths and diffraction effects at the millimeter wavelengths of THz radiation present unique optical design challenges. In this paper, by assuming a general imaging system for light field capture, we provide a theoretical framework to describe measured NLOS information including scattering and diffraction effects. Our analysis combines a ray-based light field description of the plenoptic space with a Wigner distribution function formalism to provide an intuitive physical understanding of the limits of NLOS imaging. Further, based on the analysis, we provide a simple strategy to design optical measurement systems in the LWIR and THz wavelength ranges.
Event-based camera (EBC) technology provides high-dynamic-range operation and shows promise for efficient capture of spatio-temporal information, producing a sparse data stream and enabling consideration of nontraditional data processing solutions (e.g., new algorithms, neuromorphic processors, etc.). Given the fundamental difference in camera architecture, the EBC response and noise behavior differ considerably from those of standard CCD/CMOS framing sensors. These differences necessitate the development of new characterization techniques and sensor models to evaluate hardware performance and elucidate the trade-space between the two camera architectures. Laboratory characterization techniques reported previously include noise level as a function of static scene light level (background activity) and contrast responses referred to as S-curves. Here we present further progress on the development of basic characterization methods and test capabilities for commercial-off-the-shelf (COTS) visible EBCs, with a focus on measurement of pixel deadtime (refractory period), including results for the 4th-generation sensor from Prophesee and Sony. The refractory period is empirically determined from analysis of the interspike intervals (ISIs), and the results are visualized using log-histograms of the minimum per-pixel ISI values for a subset of pixels activated by a controlled dynamic scene. Our tests of the Prophesee gen4 EVKv2 yield refractory period estimates ranging from 6.1 ms to 6.8 µs going from the slowest (20) to the fastest (100) settings of the relevant bias parameter, bias_refr. We also introduce and demonstrate the concept of pixel bandwidth measurement from data captured while viewing a static scene, based on recording data at a range of refractory period settings and then analyzing noise-event statistics.
Finally, we present initial results for estimating and correcting EBC clock drift using a GPS PPS signal to generate special timing events in the event-list data streams generated by the DAVIS346 and DVXplorer EBCs from iniVation.
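The minimum-ISI analysis described above can be sketched on synthetic data; the event rate, the assumed deadtime, and the microsecond timestamp units here are illustrative assumptions, not measured values:

```python
import numpy as np

rng = np.random.default_rng(1)

refractory_us = 100.0              # assumed pixel deadtime for the synthetic data
n_pixels, n_events = 500, 200

# Synthetic per-pixel event times: exponential waiting times floored at the
# refractory period (a real pixel cannot fire again sooner than its deadtime).
waits = rng.exponential(scale=1000.0, size=(n_pixels, n_events)) + refractory_us
timestamps = np.cumsum(waits, axis=1)

isi = np.diff(timestamps, axis=1)   # interspike intervals per pixel [us]
min_isi = isi.min(axis=1)           # per-pixel minimum ISI

# Log-histogram of minimum ISIs; the left edge estimates the deadtime.
hist, edges = np.histogram(np.log10(min_isi), bins=50)
deadtime_est = min_isi.min()
```

With enough events per pixel, the per-pixel minimum ISI piles up just above the true refractory period, which is what the log-histogram visualization exposes.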
In this work, we describe a test setup and experiments for evaluating event-based sensor (EBS) imaging systems in detecting small, moving objects against complex backgrounds. First, the design decisions and components of the setup are described, followed by how the setup is used for statistical performance analysis. We then test different system configurations, investigating several algorithms, including motion estimation and noise filtering, against a reference system. We measure their effect on performance through receiver operating characteristic (ROC) analysis, testing across many scene variations, and present the results as a set of plots coupled with discussion and analysis.
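A minimal sketch of the ROC analysis used to compare configurations, run here on synthetic detector scores (the Gaussian score distributions are illustrative assumptions, not measured data):

```python
import numpy as np

def roc_curve(scores, labels):
    """Compute ROC points (false-positive rate, true-positive rate) by
    sweeping a threshold over detector scores. labels: 1 = target, 0 = clutter."""
    order = np.argsort(-scores)            # sort scores descending
    labels = labels[order]
    tps = np.cumsum(labels)                # true positives above each threshold
    fps = np.cumsum(1 - labels)            # false positives above each threshold
    return fps / max((1 - labels).sum(), 1), tps / max(labels.sum(), 1)

rng = np.random.default_rng(2)
# Synthetic detector outputs: targets score higher than clutter on average.
scores = np.concatenate([rng.normal(2.0, 1.0, 500),    # target scores
                         rng.normal(0.0, 1.0, 500)])   # clutter scores
labels = np.concatenate([np.ones(500), np.zeros(500)]).astype(int)

fpr, tpr = roc_curve(scores, labels)
# Trapezoidal area under the ROC curve as a single performance number.
auc = float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2))
```

Each system configuration or filtering algorithm yields its own (fpr, tpr) curve; plotting them together is the comparison the abstract describes.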
Single-pixel cameras (SPCs) and event-based sensors (EBS) are two technologies representing different yet complementary sensing paradigms. Whereas single-pixel cameras make global measurements, from which local information is inferred, EBS cameras measure only local intensity variations and cannot naturally measure the global light level. In standard SPC systems, local information must be inferred from many global measurements, and the total resolution of the camera is limited by the resolution of the random patterns displayed on the DMD. Event-based sensors, however, measure the outlines of moving objects at a higher spatial resolution than the DMD patterns. By allowing the reconstruction to incorporate this extra edge information, super-resolution with respect to the DMD patterns can be achieved. The same idea holds in the time domain: the EBS camera can measure scene changes that occur faster than the DMD pattern display rate. In a traditional SPC, fast scene motion would lead to motion blur. The InSPIRED camera, however, can likely reconstruct crisp, high-framerate video by combining the low-resolution intensity information given by the SPC with the high-resolution edge information given by the EBS camera. By marrying the sparse sampling of an SPC with the high-temporal-resolution feature map produced by an EBS, we will be able to define a new image sampling technique that yields the low-SWaP-C benefits of each while interrogating scene dynamics in a manner that the fields of compressive sensing and event-based sensing cannot accomplish in isolation.
This work employs optical spatial filtering in transmission to increase the contrast of objects imaged with an event-based sensor (EBS), improving object detectability. EBSs are asynchronous imaging sensors with integrated change detection capabilities, able to detect spatio-temporal changes in scene brightness as they occur. This change detection capability is implemented in electronics, which, while providing advantages such as low power and low latency, limits the sensor's ability to detect low-contrast objects. To address this shortcoming, we augment the EBS with coherent optical high-pass spatial filtering (HPF) to improve its low-contrast sensitivity and detection performance. Experimental measurements demonstrate that objects containing features with contrasts as low as 3.53% are discernible, enabling object detection with triple the sensitivity of the standalone EBS.
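A rough digital sketch of why high-pass spatial filtering helps: removing low spatial frequencies suppresses the uniform background, raising the relative contrast of object edges. A real coherent filter acts on the optical field in the Fourier plane of a 4f system; filtering the intensity image numerically, as done here, is only an illustration of the effect:

```python
import numpy as np

N = 64
img = np.full((N, N), 1.0)
img[28:36, 28:36] = 1.02          # low-contrast object (~2% contrast)

# Block DC and low spatial frequencies in the (shifted) Fourier plane.
F = np.fft.fftshift(np.fft.fft2(img))
yy, xx = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
F[np.sqrt(xx**2 + yy**2) < 3] = 0
filtered = np.abs(np.fft.ifft2(np.fft.ifftshift(F)))

def michelson(a):
    """Michelson contrast (max - min) / (max + min)."""
    return (a.max() - a.min()) / (a.max() + a.min() + 1e-12)

c_before = michelson(img)          # background dominates: contrast ~1%
c_after = michelson(filtered)      # background suppressed: contrast jumps
```

The filtered image retains the object's edges while the bright, uninformative background (which would otherwise set the EBS contrast threshold) is removed.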
Event-based sensors represent an alternative paradigm in machine vision. As the commercial hardware ecosystem undergoes rapid maturation, the machine learning community is racing to unlock new opportunities that exploit these devices' unique combination of microsecond-scale sampling, low data rates, and extreme dynamic range. This talk covers recent advances in unconventional machine vision algorithms for these unconventional sensors and highlights the need for the open-source community to develop new benchmark tasks that accurately quantify the value of unconventional devices relative to traditional focal plane arrays.
In this work, we demonstrate a hybrid event-based and frame-based imaging system for the task of target detection and tracking. This system leverages the change detection ability of event-based sensors (EBS), with relatively lower read-out bandwidth, for the target detection task within a wide field of view, especially against static backgrounds. The frame-based sensor is used to track the detected target, as cued by the EBS sub-system, at higher spatial/angular resolution within a narrow field of view. In field trials, the hybrid system is shown to detect and track a small 30 cm drone moving at 12 m/s at stand-off distances of up to 100 m.
Neuromorphic cameras are capable of processing large amounts of information by asynchronously recording changes in photon levels at every pixel. Because of their recent insertion into the commercial market, research characterizing these cameras is just emerging. Determining sensor capabilities outside a laboratory environment allows for understanding future applications of this technology. An experiment was designed to determine whether the camera could detect laser scatter within the atmosphere and extract information about the laser. Experimentation in real-world environments showed that the camera can distinguish laser scatter against environmental backdrops at varying distances, determine the repetition frequency of the laser, and provide preliminary angle-determination data.
This work presents an imager architecture capable of forming frames with varying exposure time and focus from event-based imager data, in post-processing. Conventional photography dictates that once a frame has been captured, the information associated with the frame is fixed by the aperture/depth-of-field, the exposure time, and the ISO. The advent of technologies such as integral photography and plenoptic cameras made it possible to change the depth-of-field and plane-of-focus in post-processing. Recently, the concept of the digital coded exposure for event-based imagery was introduced, which allows frames to be formed from event data according to a per-pixel, digitally specified exposure-vs-time function, so that exposure-time characteristics can be chosen on a per-pixel basis in post-processing. In this work, we present an imager architecture that leverages both plenoptic imaging principles and digital coded exposure principles, yielding an imager that allows regeneration of frames in post-processing such that two sides of the standard exposure triangle can be modified. The proposed architecture could potentially also work with alternative imaging modalities such as a photon-counting imager.
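The digital coded exposure principle can be sketched as follows. The event-tuple format, the random synthetic events, and the example exposure code are illustrative assumptions, not the paper's specification:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic events, assumed format (x, y, t_us, polarity in {-1, +1}).
H, W = 4, 4
n_events = 1000
events = np.stack([
    rng.integers(0, W, n_events),          # x
    rng.integers(0, H, n_events),          # y
    rng.uniform(0, 10_000, n_events),      # timestamp [us]
    rng.choice([-1, 1], n_events),         # polarity
], axis=1)

def coded_exposure_frame(events, shape, open_fn):
    """Accumulate signed events into a frame, gated by a per-pixel,
    time-dependent exposure code open_fn(x, y, t) -> bool."""
    frame = np.zeros(shape)
    for x, y, t, p in events:
        if open_fn(int(x), int(y), t):
            frame[int(y), int(x)] += p
    return frame

# Example exposure code: the left half of the sensor "exposes" early,
# the right half late -- chosen per pixel, entirely in post-processing.
open_fn = lambda x, y, t: (t < 5_000) if x < W // 2 else (t >= 5_000)
frame = coded_exposure_frame(events, (H, W), open_fn)
```

Because the gating happens after capture, the same event stream can be re-rendered with any exposure code, which is the post-processing flexibility the abstract describes.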
Neuromorphic cameras, or Event-based Vision Sensors (EVS), operate in a fundamentally different way than conventional frame-based cameras. Their unique operational paradigm results in a sparse stream of high-temporal-resolution output events which encode pixel-level brightness changes with low latency and wide dynamic range. Recently, interest has grown in exploiting these capabilities for scientific studies; however, accurately reconstructing signals from the output event stream presents a challenge due to physical limitations of the analog circuits that implement logarithmic change detection. In this paper, we present simultaneous recordings of lightning strikes using both an event camera and a frame-based high-speed camera. To our knowledge, this is the first side-by-side recording using these two sensor types in a real-world scene with challenging dynamics that include very fast and bright illumination changes. Our goal in this work is to accurately map the illumination to EVS output in order to better inform modeling and reconstruction of events from a real scene. We first combine lab measurements of key performance metrics to inform an existing pixel model. We then use the high-speed frames as signal ground truth to simulate an event stream and refine parameter estimates to optimally match the event-based sensor response for several dozen pixels representing different regions of the scene. These results will be used to predict sensor response and develop methods to more precisely reconstruct lightning and sprite signals for Falcon ODIN, our upcoming International Space Station neuromorphic sensing mission.
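The kind of pixel model such simulations start from can be sketched minimally as follows. This is a common textbook idealization (events fire at threshold crossings of log intensity), not the paper's calibrated model, and the contrast threshold and signal are illustrative assumptions:

```python
import numpy as np

def events_from_signal(t, intensity, C=0.15):
    """Ideal EVS pixel: emit an event whenever log-intensity moves by more
    than a contrast threshold C since the last event; sign = polarity."""
    log_i = np.log(intensity)
    ref = log_i[0]                 # reference level at the last event
    out = []
    for ti, li in zip(t, log_i):
        while li - ref >= C:       # brightness increased by a full step: ON event
            ref += C
            out.append((ti, +1))
        while ref - li >= C:       # brightness decreased by a full step: OFF event
            ref -= C
            out.append((ti, -1))
    return out

# A lightning-like step: a 10x brightness jump produces a burst of ON events,
# one per threshold crossing of the log-intensity step.
t = np.linspace(0, 1, 1000)
intensity = np.where(t < 0.5, 1.0, 10.0)
ev = events_from_signal(t, intensity)
n_on = sum(1 for _, p in ev if p > 0)
```

A 10x step spans log(10) ≈ 2.30 in log intensity, so with C = 0.15 the ideal pixel emits 15 ON events; real pixels deviate from this through bandwidth limits, refractory deadtime, and noise, which is what the paper's calibration against high-speed frames addresses.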
Shock waves are a commonly observed phenomenon in transonic and supersonic flow. These nearly discontinuous flow features form as a result of flow disturbances propagating faster than the local speed of sound. Across a shock wave, flow properties such as pressure, temperature, and density can change dramatically. In this paper, the effects of the nearly discontinuous change in density across the shock wave on Shack–Hartmann wavefront sensor (SHWFS) measurements are studied experimentally. Experiments were conducted in the Mach 2 wind tunnel located in the Aero-Effects Laboratory at Kirtland AFB. To generate the oblique shock wave, a wedge model was placed in the tunnel. Two-dimensional, time-resolved wavefront measurements were collected simultaneously with a SHWFS and a digital holography wavefront sensor (DHWFS). In this manner, results from the two wavefront sensing techniques could be compared and contrasted. It is shown that the shock wave caused significant higher-order distortion within the SHWFS lenslets. Significant lenslet beam spreading and bifurcation were observed in the raw SHWFS intensity images. When compared with the DHWFS measurements, the SHWFS measurements underpredicted the phase distortion caused by the shock by up to approximately π.
Experimental Shack–Hartmann wavefront sensor (SHWFS) measurements were collected and examined to further understand image-plane irradiance pattern behavior in the presence of potentially sharp thermodynamic gradients. An analysis of path-integrated phase and image-plane irradiance pattern spreading identified sharp gradients in regions of a weakly compressible shear layer where they had not previously been observed. This paper describes the analysis process and shows results suggesting that Shack–Hartmann wavefront sensors do not adequately resolve sharp thermodynamic gradients in aberrating flows.
The aero-optical distortions caused by supersonic mixing layers over an optical window are relevant to the performance of hypersonic vehicles. Such mixing layers are typically temperature-mismatched due to the need to cool the optical window. To investigate the effect of the mismatched temperature across the mixing layer created by blowing cool air over a flat window, optical measurements of an M = 2 freestream flow with a two-dimensional cooling jet at M ≈ 0.56 were taken using Shack–Hartmann WFS and schlieren photography techniques. The total temperature of the freestream flow was varied from 295 K to 750 K, while the total temperature of the cooling jet was kept constant at 295 K. Parameters of the mixing flow were examined using optical velocity methods. A new scaling method for aero-optical distortions in a temperature-mismatched, species-matched supersonic mixing layer is proposed, providing an improved linear fit compared to the previous model.
At supersonic speeds, shock waves create steep gradients in the density of the flow field. These large gradients, which cause higher-order beam distortions within the lenslets, have been shown to adversely affect the accuracy of the Shack–Hartmann wavefront sensor (SHWFS) during wavefront reconstruction. In the presented work, the wavefront of a collimated beam propagating through a local shock region over a partially protruding cylinder body was measured using a SHWFS and an off-axis digital holography wavefront sensor (DHWFS). These measurements were taken simultaneously, allowing for direct comparison. Further study was done on computational and post-processing methods of handling the higher-order aberrations caused by the shock, as well as on their effects on the resulting wavefront. By varying the incoming transonic Mach number, the shock strength and spatial extent could be adjusted, providing multiple scenarios for comparison. The experimental data presented in this work provide valuable insight into shock-induced effects on the resulting wavefront. These results support the development of new methods for mitigating the adverse effects of shocks on well-established measurement methods such as the SHWFS and off-axis DHWFS in similar applications.
Current Shack–Hartmann wavefront sensor (SHWFS) reconstruction algorithms for aero-optical research underperform in the presence of flow features with strong density gradients, such as shockwaves in supersonic flow. The large density variations in shockwaves violate the key underlying assumption of the SHWFS: that local changes in the wavefront within the lenslet subaperture manifest primarily as tip/tilt. In these cases, the image-plane irradiance pattern of individual lenslets can exhibit high-order aberrations, resulting in non-tip/tilt behaviors. Standard least-squares wavefront reconstruction methods fail to accurately recover the wavefront in the presence of a shock due to the least-squares estimator's tendency to give too much "influence" to outliers present in the measured SHWFS data, resulting in an underprediction of the optical-path difference (OPD) across the shock. A new algorithm is described to overcome the limitations of the standard least-squares reconstruction method. Two weighting functions are investigated with the aim of using additional intensity information to quantify the degree to which each subaperture is aberrated. The least-squares estimator is replaced with a robust estimator to perform outlier handling, and regularizing terms are then used to further constrain the spatial organization of the solution. The problem of wavefront reconstruction is cast as a global functional optimization problem, where minimization is achieved iteratively. The algorithm is then evaluated on a sample of Mach 6 SHWFS dot patterns where oblique shocks produce flow discontinuities. The results show that the algorithm is capable of accurately targeting discontinuous flow features as outliers, and subsequently altering those outliers, increasing the OPD as well as the sharpness of the shockwave structure.
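A toy 1-D illustration of the general strategy (robust reweighting of slope measurements so one shock-corrupted slope cannot bias the whole reconstruction), not the paper's algorithm; the redundant lag-1/lag-2 slope operator, the Huber threshold, and the synthetic step phase are all illustrative choices:

```python
import numpy as np

n = 20
true_phase = np.concatenate([np.zeros(n // 2), np.ones(n // 2)])  # 1 rad step

# Redundant slope operator: lag-1 and lag-2 phase differences.
D1 = np.zeros((n - 1, n))
D2 = np.zeros((n - 2, n))
for i in range(n - 1):
    D1[i, i], D1[i, i + 1] = -1.0, 1.0
for i in range(n - 2):
    D2[i, i], D2[i, i + 2] = -1.0, 1.0
A = np.vstack([D1, D2])
b = A @ true_phase
b[9] += 5.0   # corrupt the lag-1 slope at the "shock" (a bifurcated spot)

def solve(A, b, weights):
    """Weighted least squares with a zero-mean (piston) constraint row."""
    Aw = np.vstack([A * weights[:, None], np.ones(n) / n])
    bw = np.append(b * weights, 0.0)
    return np.linalg.lstsq(Aw, bw, rcond=None)[0]

# Plain least squares lets the outlier bias the reconstruction.
phase_ls = solve(A, b, np.ones(len(b)))

# Huber IRLS (threshold 0.5 rad): large residuals get weight delta/|r|,
# so the corrupted slope is progressively down-weighted.
phase_rob = phase_ls
for _ in range(30):
    r = np.abs(A @ phase_rob - b)
    w = np.where(r <= 0.5, 1.0, 0.5 / np.maximum(r, 1e-12))
    phase_rob = solve(A, b, w)

centered = true_phase - true_phase.mean()
err_ls = np.abs(phase_ls - centered).max()
err_rob = np.abs(phase_rob - centered).max()
```

The redundancy (two slope lags measuring overlapping phase differences) is what lets the robust estimator out-vote the outlier; in a real SHWFS the 2-D slope geometry provides the analogous redundancy.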
Super-resolution in optical imaging refers to approaches that can boost spatial resolution beyond the diffraction limit. One approach to achieving super-resolution is the supergrowth phenomenon. In supergrowing fields, the local growth rate can exceed that dictated by the fastest Fourier component. Our initial investigations suggest that supergrowth is more sensitive than superoscillation for attaining super-resolution. We present experimental synthesis and characterization of supergrowing fields in the laboratory, a critical step towards achieving super-resolution. This experimental demonstration will assist in determining the optimal supergrowing fields that are realizable in the lab and can be utilized for super-resolution imaging.
Imaging techniques with subdiffraction-limited spatial resolution are highly desired for a deeper understanding of subcellular systems. Optical imaging achieves resolutions below 200 nm, but the penetration depth of visible light is limited to merely 2 mm. Ultrasound imaging achieves roughly two orders of magnitude lower resolution but can penetrate two orders of magnitude deeper into a medium. This work combines the strengths of optical and acoustic imaging techniques through AuNP-based metasurfaces that exploit the photoacoustic effect exhibited by gold. Our novel imaging technique can simultaneously achieve high resolution and deep penetration without damaging the medium.
Brillouin spectroscopy and microscopy are widely used for remote sensing and imaging; however, detection sensitivity is often limited by the laser power available for such measurements. Quantum light offers an alternative approach to such spectroscopy and imaging by reducing the noise below the classical shot-noise limit. We demonstrate the first results of Brillouin spectroscopy and imaging using quantum-enhanced detection and outline potential applications and future strategies for improved detection sensitivity.
Fibrotic diseases account for one-third of deaths worldwide, making it essential to investigate the accompanying tissue microstructural changes that are critical to disease progression. This research focuses on the fibrotic extracellular matrices present in histological tissue sections, which can characterize disease progression. We demonstrate how bioinspired structural color can be utilized as a label-free technology to determine disease progression on a single nanostructured surface. This nanophotonic imaging platform characterizes the organization of fibrous biological tissues with distinct stain-free color responses. The colorimetric response of histological tissue sections interfaced with these nanostructured slides was quantitatively assessed.
We develop single-pixel imaging accelerated via swept aggregate patterns (SPI-ASAP), which combines a digital micromirror device with laser scanning hardware to achieve pattern projection rates of up to 14.1 MHz and tunable frame sizes of up to 101×103 pixels. Leveraging the structural properties of S-cyclic matrices, we also develop a lightweight compressed sensing (CS) reconstruction algorithm, fully compatible with parallel computing, for real-time video streaming at 100 frames per second (fps). SPI-ASAP allows reconfigurable imaging in both transmission and reflection modes, dynamic imaging under strong ambient light, and offline ultrahigh-speed imaging at speeds of up to 12,000 fps.
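As a toy illustration of the single-pixel measurement model (not the paper's S-cyclic or CS pipeline), the sketch below simulates single-pixel measurements with a fully sampled Sylvester Hadamard pattern basis, for which reconstruction reduces to a transpose; the scene and sizes are illustrative assumptions:

```python
import numpy as np

def hadamard(n):
    """Sylvester-construction Hadamard matrix (n must be a power of two)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

n = 64                            # e.g. an 8x8 image, flattened
H = hadamard(n)                   # each row is one +/-1 illumination pattern
scene = np.arange(n, dtype=float) # stand-in object
y = H @ scene                     # bucket-detector measurements, one per pattern
recon = (H.T @ y) / n             # exact inverse, since H.T @ H = n * I
```

With a fully sampled orthogonal basis the recovery is exact; the appeal of CS approaches like the paper's is that far fewer than `n` measurements suffice for compressible scenes.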
Strobe photography is an imaging technique with a long history of capturing ultrafast dynamics, but it requires repeatability in the event being recorded. For some dynamic processes, such as light interactions with biological tissues, repeatability is not guaranteed due to the high degree of tissue heterogeneity. Here, we present a high-speed strobe imaging system based on a spinning rotary mirror capable of imaging at 1 million fps with high pixel resolution (300 × 400 pixels). 2D and 3D views of the event can be captured using a single or multiple imaging paths through the scene, enabling 3D reconstruction using projection methods.
Non-uniform motion blur, including effects commonly encountered in blur associated with atmospheric turbulence, can be estimated as a superposition of locally linear uniform blur kernels. A linear uniform blur kernel is modeled using two parameters: length and angle. In recent work, we demonstrated the use of a regression-based convolutional neural network (CNN) for robust blind estimation of the length and angle parameters of linear uniform blur kernels. In this work, we extend the regression-based CNN approach to analyze patches in images and estimate the parameters of a locally linear motion blur kernel, allowing us to model the blur field. We analyze the effectiveness of this patch-based approach as a function of patch size for two problems: synthetic images generated as a superposition of locally linear blurs, and synthetic images generated with a Zernike polynomial-based wavefront distortion applied at the pupil plane.
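The two-parameter linear uniform blur kernel described above can be sketched as follows; the rasterization scheme and kernel size are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def linear_blur_kernel(length, angle_deg, size=15):
    """Build a normalized linear uniform motion-blur kernel: a line of
    the given length (pixels) at the given angle, rasterized onto a
    size x size grid and normalized to unit sum."""
    k = np.zeros((size, size))
    c = size // 2
    theta = np.deg2rad(angle_deg)
    # Sample points densely along the line segment centred on the kernel
    n = max(int(np.ceil(length)) * 4, 2)
    t = np.linspace(-length / 2, length / 2, n)
    rows = np.clip(np.round(c - t * np.sin(theta)).astype(int), 0, size - 1)
    cols = np.clip(np.round(c + t * np.cos(theta)).astype(int), 0, size - 1)
    k[rows, cols] = 1.0
    return k / k.sum()

kernel = linear_blur_kernel(length=7, angle_deg=30)
```

A blur field is then just such a kernel estimated per image patch, with length and angle varying across the field of view.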
Atmospheric path characterization using a recently developed computational and visualization software platform is discussed. The software enables high-fidelity and high-performance modeling and simulation of optical wave propagation in realistic atmospheric conditions based on parallelized wave-optics algorithms integrated with numerical weather prediction data. It provides nowcast and forecast visualization capabilities that can be used for accurate and fast computation of laser beam and image characteristics along the propagation path.
This paper extends a recent fast method for simulating optical propagation through random media such as atmospheric turbulence. The previously published method simulates arbitrary sources propagating through two phase screens to an observation plane using a semi-analytic technique. The advancement in this paper covers specific cases of sources with a known, closed-form solution for the non-turbulent field as a function of propagation distance. With these closed-form expressions, additional analytic evaluation leaves even less numerical work for the computer, reducing computation beyond what is possible when the source is treated as arbitrary. Further, this semi-analytic method has been extended to propagate the wave through a simple optical system to an image plane. The specific cases of off-axis planar and spherical waves can be used to simulate a collection of point spread functions (PSFs) that are partially correlated across the field of view. These PSFs can then be used with an extended scene's reflectance array to synthesize an incoherent, anisoplanatic image.
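A minimal sketch of the final step, forming an incoherent image from a reflectance array and a PSF via FFT convolution. This is the isoplanatic (single-PSF) case only; the paper's spatially varying PSFs are not reproduced here, and the random scene and Gaussian PSF are stand-in assumptions:

```python
import numpy as np

def incoherent_image(scene, psf):
    """Incoherent image formation: circular convolution of a reflectance
    array with a (same-shaped, centred) PSF via the FFT."""
    S = np.fft.fft2(scene)
    H = np.fft.fft2(np.fft.ifftshift(psf))  # move PSF centre to the origin
    return np.real(np.fft.ifft2(S * H))

rng = np.random.default_rng(0)
scene = rng.random((64, 64))                 # stand-in reflectance array
x = np.arange(64) - 32
X, Y = np.meshgrid(x, x)
psf = np.exp(-(X**2 + Y**2) / (2 * 2.0**2))  # Gaussian stand-in PSF
psf /= psf.sum()                             # unit energy
img = incoherent_image(scene, psf)
```

An anisoplanatic extension would tile the field of view and apply a different, partially correlated PSF per tile.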
The decomposition of turbulence-induced phase aberrations into Zernike polynomials is performed using simulation and numerical techniques, extending well-known analytic results when scintillation is weak. A spherical-wave geometry is assumed. Strong scintillation (Rytov variance > 0.3) has an impact on the distribution of aberration strength, and this impact depends on range and wavelength. A saturation effect is observed. Anisotropy affects the distribution of aberrations between nearby Zernike orders. Non-Kolmogorov exponents lower than 11/3 in magnitude tend to reduce the lower-order aberrations and slightly enhance the higher aberrations, as expected. The interplay of strong turbulence, anisotropy, and non-Kolmogorov exponents is also explored. Significant deviations from the existing weak-turbulence theory are found in some cases.
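Decomposing a phase screen into Zernike polynomials can be sketched as a least-squares projection onto the modes evaluated on the pupil; the six Noll-ordered modes and the synthetic screen below are illustrative assumptions, not the paper's simulation:

```python
import numpy as np

def zernike_modes(rho, theta):
    """First six Noll-ordered Zernike polynomials on the unit disk."""
    return np.stack([
        np.ones_like(rho),                          # Z1: piston
        2 * rho * np.cos(theta),                    # Z2: tip
        2 * rho * np.sin(theta),                    # Z3: tilt
        np.sqrt(3) * (2 * rho**2 - 1),              # Z4: defocus
        np.sqrt(6) * rho**2 * np.sin(2 * theta),    # Z5: oblique astigmatism
        np.sqrt(6) * rho**2 * np.cos(2 * theta),    # Z6: vertical astigmatism
    ])

N = 128
y, x = np.mgrid[-1:1:N*1j, -1:1:N*1j]
rho, theta = np.hypot(x, y), np.arctan2(y, x)
mask = rho <= 1.0                                   # circular pupil

Z = zernike_modes(rho[mask], theta[mask])           # (6, n_pixels)
phase = 0.7 * Z[3] + 0.2 * Z[1]                     # synthetic screen: defocus + tip
coeffs, *_ = np.linalg.lstsq(Z.T, phase, rcond=None)
```

In a scintillation study, `phase` would instead come from a wave-optics simulation, and the distribution of `coeffs` across realizations gives the aberration-strength statistics.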
Modern atmospheric modeling tools and beam control systems do not account for beam degradation along the propagation path due to real-time weather-induced scintillation. Typically, the effect of atmospheric phase turbulence on Gaussian beam propagation is limited to a global scintillation index (SI) and the resulting fringe visibility over the propagation path; however, this model is insufficient when the propagation path traverses multiple, time-varying weather conditions with varying scintillation properties. The proposed model iterates on existing atmospheric modeling tools to include scintillation physics via Mie scattering. These scattering calculations may then be implemented as sub-steps inside the Split-Step Beam Propagation Method to account for time- and path-dependent scattering.
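The global scintillation index referred to above is the normalized variance of received intensity, SI = ⟨I²⟩/⟨I⟩² − 1. A minimal sketch, assuming a log-normally distributed irradiance record as a stand-in for weak-fluctuation data:

```python
import numpy as np

def scintillation_index(intensity):
    """Scintillation index SI = <I^2>/<I>^2 - 1 (normalized intensity variance)."""
    I = np.asarray(intensity, dtype=float)
    return I.var() / I.mean() ** 2

# Log-normal irradiance record, a common weak-fluctuation model
rng = np.random.default_rng(1)
I = rng.lognormal(mean=0.0, sigma=0.3, size=200_000)
si = scintillation_index(I)   # theory for this model: exp(sigma^2) - 1 ~ 0.094
```

In the path-dependent model the paper proposes, such an index would vary per split-step segment rather than being a single global number.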
The advancement of adaptive optics (AO) has a tradition of using benchtop optical simulators to progress control technologies from concept towards fielded systems. This paper presents a reflective atmospheric turbulence simulator (RATS) for the Air Force Research Laboratory's Beam Control Laboratory (BCL), with which the next steps in AO will be tested. The reflective nature of the system allows operation over a broad range of wavelengths. RATS consists of six moveable phase screens etched with Kolmogorov turbulence phase patterns. The configuration of the system can be varied to simulate a wide range of atmospheric turbulence conditions, setting the Fried coherence length, Greenwood frequency, Rytov variance, and isoplanatic angle needed to meet a given scenario. Shack-Hartmann measurements of the turbulence generated by RATS are compared to the system design.
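Of the parameters listed, the Fried coherence length can be sketched from a turbulence-strength profile using the standard plane-wave expression r0 = [0.423 k² ∫ Cn²(z) dz]^(−3/5); the uniform path profile and wavelength below are illustrative assumptions:

```python
import numpy as np

def fried_r0(cn2, dz, wavelength):
    """Plane-wave Fried coherence length (metres):
    r0 = [0.423 * k^2 * integral(Cn^2 dz)]^(-3/5)."""
    k = 2 * np.pi / wavelength
    integral = np.sum(np.asarray(cn2) * dz)   # discretized path integral
    return (0.423 * k**2 * integral) ** (-3 / 5)

# Uniform turbulence stand-in: Cn^2 = 1e-15 m^(-2/3) over a 1 km path, 1550 nm
r0 = fried_r0(cn2=[1e-15] * 10, dz=100.0, wavelength=1.55e-6)
```

A simulator like RATS effectively solves the inverse problem: choosing screen strengths and positions so the composite path reproduces a target r0 (and the other listed parameters).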
This research presents an alternative method to represent aberrated wavefronts based on circular Bessel functions. These wavefronts are obtained by means of a Shack-Hartmann wavefront sensor prototype, which was previously statistically validated according to the official Mexican standard. We show experimental results obtained from two wavefronts aberrated by two ophthalmic trial lenses: one has a spherical aberration of -1.0 diopter and the other has a defocus aberration of +1.0 diopter. Both wavefronts are expressed in terms of circular Bessel functions and compared with their corresponding representation in Zernike polynomials.
Modern telecommunications infrastructure relies heavily on widely deployed optical fibre networks, which serve as its cornerstone. Mechanical forces from ambient vibration sources, such as human activities and seismic movements, induce strains in the fibres, resulting in phase shifts in the light that travels through them. These phase shifts can be measured across the entire fibre, providing information about the originating vibration events and making the networks an ideal candidate for distributed seismic sensing.
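The strain-to-phase relation described above can be sketched as Δφ = (2π n_eff / λ) · ε · L; the effective index, gauge length, and neglect of the photoelastic correction are simplifying assumptions for illustration:

```python
import numpy as np

def strain_phase_shift(strain, fiber_length, wavelength=1.55e-6, n_eff=1.468):
    """Optical phase shift (radians) from axial strain over a fibre section:
    delta_phi = (2 * pi * n_eff / lambda) * strain * L.
    The photoelastic correction (~0.78 factor) is neglected for simplicity."""
    return 2 * np.pi * n_eff * strain * fiber_length / wavelength

# 1 nano-strain over a 10 m gauge length at 1550 nm
dphi = strain_phase_shift(strain=1e-9, fiber_length=10.0)   # ~0.06 rad
```

Even nano-strain events thus produce phase shifts well within interferometric resolution, which is what makes fibre networks viable distributed seismic sensors.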