Traditionally, optical engineers have relied on a good starting point - a design that comes close to the desired goals - which they then tweak until it meets specifications. However, such a starting point is rare. Except in a few simple cases, the lens design problem has no closed-form solution: one must think, run trials, learn from experience, and iterate.
The Automatic Design Tools in SYNOPSYS™ were created to shift as much of the design burden as possible onto the computer, freeing engineers from the most tedious of these traditional tasks. They aim to ease the search for good starting points and to facilitate exploration of the design space, uncovering alternative design forms that can deliver better performance than conventional design protocols normally achieve.
In this video, we will illustrate the use of the following Design Tools:
• Design Search (DSEARCH): A search tool for fixed focus systems.
• Automatic Element Scanning Tools: These tools scan the lens system to find the best place to insert or delete an element, or to introduce an unusual surface type such as an asphere.
We start with an introduction to SYNOPSYS™ and its automatic design search tools. We will use a seven-element system as an example to illustrate the DSEARCH tool in SYNOPSYS and to show how the system's tolerance requirements can be relaxed when the search includes a tolerance-desensitization goal.
Then we will show how to use the automatic element insertion/deletion tools to improve the lens system.
We studied the performance of an OCT imaging modality on the task of detecting an abnormality in biological tissue. Optical propagation in biological samples is dominated by scattering due to fluctuations in refractive index. We used the first-order multiple scattering approximation to describe the scattered field from the tissue. The biological tissue was described by its permittivity field and the corresponding scattering potential. The normal state of the tissue (the background) was modeled as a spatial Poisson field of randomly distributed scattering centers, and the abnormality (the target) as a region with a higher concentration of scattering centers embedded in the background. The target detectability was then calculated using a quadratic observer. We considered the effect of fluctuations from the broadband source, the shot-noise fluctuations of the imaging system, and the scattering noise due to refractive index fluctuations in the biological tissue. We also studied the detectability of an embedded abnormality in biological tissue with respect to the size of the abnormality.
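The background/target model above can be illustrated with a toy Monte Carlo experiment: draw per-trial scatterer counts in the probed region under a background Poisson rate and a higher target rate, then estimate detection performance as the empirical AUC of the count statistic (via the Mann-Whitney relation). All numerical values here are illustrative assumptions, not parameters from the paper, and the count statistic stands in for the full quadratic observer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative parameters (not from the paper): mean scatterer
# counts in the probed region for background vs. target hypotheses.
lam_bg, lam_tgt = 40.0, 55.0
n_trials = 2000

# Simulate per-trial scatterer counts under each hypothesis.
counts_bg = rng.poisson(lam_bg, n_trials)
counts_tgt = rng.poisson(lam_tgt, n_trials)

# Empirical AUC of the count statistic via the Mann-Whitney relation:
# P(target count > background count) + 0.5 * P(tie).
auc = (np.sum(counts_tgt[:, None] > counts_bg[None, :]) +
       0.5 * np.sum(counts_tgt[:, None] == counts_bg[None, :])) / (n_trials ** 2)
print(f"empirical AUC = {auc:.3f}")
```

As the target concentration approaches the background rate, the AUC falls toward 0.5 (chance), mirroring the size/contrast dependence the abstract describes.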
Optical coherence tomography (OCT) is an interferometric technique that uses the low-coherence property of light to image axially at high resolution in biological tissue samples. Transverse imaging is obtained with two-dimensional scanning, and transverse resolution is limited by the size of the scanning beam at the imaging point. The most common metrics for the axial resolution of an OCT system are the full-width-at-half-maximum (FWHM), the absolute square integral (ASI), and the root-mean-square (RMS) width of the axial PSF of the system, where the PSF of an OCT system is defined as the envelope of the interference fringes when the sample has been replaced by a simple mirror. Such metrics do not take into account the types of biological tissue samples being imaged. In this paper we define resolution in terms of the instrument and the biological sample combined by defining a resolution task and computing the associated detectability index and area under the receiver operating characteristic curve (AUC). The detectability index was computed using the Hotelling observer, or best linear observer. Results of simulations demonstrate that resolution is best quantified as a probability of resolving two layers, and the impact on resolution of variations in the index of refraction between the layers is clearly demonstrated.
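The Hotelling (best linear) observer mentioned above has a standard closed form: given the mean data difference Δs between the two hypotheses and the data covariance K, the template is w = K⁻¹Δs and the detectability index satisfies d² = ΔsᵀK⁻¹Δs; for equal-covariance Gaussian data, AUC = Φ(d/√2). The sketch below uses made-up toy numbers for Δs and K (none come from the paper) purely to show the computation.

```python
import math
import numpy as np

# Assumed toy data: a Gaussian-bump mean difference between the two
# hypotheses and a simple correlated-noise covariance. Illustrative only.
n = 16
delta_s = np.exp(-0.5 * ((np.arange(n) - n / 2) / 2.0) ** 2)  # mean difference
K = 0.05 * np.eye(n) + 0.01 * np.ones((n, n))                 # noise covariance

# Hotelling observer: template w = K^{-1} Δs, detectability d² = Δsᵀ K⁻¹ Δs.
w = np.linalg.solve(K, delta_s)
d2 = float(delta_s @ w)

# For equal-covariance Gaussian data, AUC = Φ(d/√2) = 0.5*(1 + erf(d/2)).
auc = 0.5 * (1.0 + math.erf(math.sqrt(d2) / 2.0))
print(f"d^2 = {d2:.3f}, AUC = {auc:.3f}")
```

This is the sense in which the abstract reports resolution as a probability: the two-layer task is posed as a detection problem and scored by AUC rather than by a width metric alone.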
We report certain diffraction effects that are pertinent to the operation of double-layer optical recording media. The diffraction of light from the out-of-focus layer and the resulting distribution on the in-focus layer are studied using computer simulations. The findings are then verified by direct measurements. We also describe a technique for analyzing (by computer simulation) the focus-error signal (FES) in systems that use the astigmatic method in conjunction with the double-layer disk. The results of our computer simulations of the FES are compared with those measured in an actual disk drive; good agreement between computation and measurement is obtained.
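The astigmatic method referenced above derives the FES from a four-quadrant detector behind an astigmatic lens: a common normalized form compares the summed intensities on the two diagonals, FES = ((A + C) − (B + D)) / (A + B + C + D). The quadrant values below are made-up illustrative numbers, not simulation output from the paper.

```python
# Minimal sketch of the astigmatic focus-error signal from a quadrant
# detector; A..D are the four quadrant intensities, with A/C and B/D
# being the two diagonal pairs.
def astigmatic_fes(a: float, b: float, c: float, d: float) -> float:
    """Normalized FES: difference of diagonal sums over total power."""
    total = a + b + c + d
    return ((a + c) - (b + d)) / total

# In focus: the astigmatic spot is circular, quadrants balance -> FES = 0.
print(astigmatic_fes(1.0, 1.0, 1.0, 1.0))   # 0.0
# Defocus elongates the spot along one diagonal -> nonzero FES.
print(astigmatic_fes(1.4, 0.6, 1.4, 0.6))   # 0.4
```

In a double-layer disk, stray light diffracted from the out-of-focus layer perturbs these quadrant intensities, which is why the paper simulates the FES curve rather than assuming the ideal S-curve.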
Conference Committee Involvement (3)
Current Developments in Lens Design and Optical Engineering XXV
20 August 2024 | San Diego, California, United States
Current Developments in Lens Design and Optical Engineering XXIV
22 August 2023 | San Diego, California, United States
Current Developments in Lens Design and Optical Engineering XXIII
23 August 2022 | San Diego, California, United States