Circuit designs are becoming denser and more complex in advanced semiconductor process technologies. Foundry process windows are shrinking, which increases sensitivity to wafer surface defects. These defects should be detected early so that their root causes can be resolved, which ultimately helps improve yield. Wafer defects are still often inspected manually, even though defect counts can reach into the millions; analyzing and reviewing the results takes a long time, and the identified root causes may be inaccurate or buried in noise. In this paper, UMC advanced research teams, in collaboration with the Cadence DFM team, used the Pegasus Computational Pattern Analytics (CPA) software to develop an enhanced inspection flow. This flow includes defect data preprocessing, classification, filtering, and reduction of huge data volumes to produce clear, easy-to-review results. By finding more accurate root causes, we can reduce process development time and ultimately improve wafer yield.
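Implementation details of the flow are not given in the abstract; purely as an illustration of the kind of data reduction involved, the sketch below preprocesses a raw defect list, filters nuisance records, and collapses repeating defects by pattern signature into a short, reviewable ranking. The column names, thresholds, and CSV layout are assumptions, not the Pegasus CPA interface.

```python
# Hypothetical illustration only: a generic defect-data reduction step.
# Column names, thresholds, and the CSV layout are assumptions and do not
# represent the Pegasus CPA interface.
import pandas as pd

NUISANCE_CLASSES = frozenset({"false", "nuisance"})

def reduce_defect_list(path: str, min_size_nm: float = 5.0) -> pd.DataFrame:
    defects = pd.read_csv(path)  # raw inspection export, one row per defect
    # Preprocessing: drop incomplete records and obvious nuisance classes.
    defects = defects.dropna(subset=["x_um", "y_um", "size_nm", "class"])
    defects = defects[~defects["class"].str.lower().isin(NUISANCE_CLASSES)]
    defects = defects[defects["size_nm"] >= min_size_nm]
    # Reduction: group repeating defects by layout-pattern signature so that
    # millions of raw hits collapse into a short, reviewable ranking.
    return (defects.groupby("pattern_hash")
                   .agg(count=("pattern_hash", "size"),
                        mean_size_nm=("size_nm", "mean"))
                   .sort_values("count", ascending=False))

# Example: top 20 repeating pattern signatures for manual review.
# print(reduce_defect_list("inspection_lot.csv").head(20))
```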
In the state-of-the-art integrated circuit industry, for transistor gate lengths of 45 nm and beyond, the sharp distinction between design and fabrication phases is becoming inadequate for fast product development. Lithographical information, along with design rules, has to be passed from foundries to designers, as these effects must be taken into consideration during the design stage to ensure a Lithographically Friendly Design. This in turn demands new communication channels between designers and foundries to provide the needed litho information. For fabless design houses, this requirement faces several problems, such as incompatible EDA platforms at both ends and confidential information that cannot be revealed by the foundry back to the design house.
In this paper we propose a framework that demonstrates a systematic approach to matching any lithographical OPC solution from different EDA vendors into Calibre™. The goal is to export how the design will look on wafer from the foundry to the designers without revealing how, and without requiring installation of the same EDA tools.
In the developed framework, we demonstrate the flow used to match all steps involved in developing OPC, starting from lithography modeling and going through the OPC recipe. This is done through automated scripts that characterize the existing foundry OPC solution and identify compatible counterparts in the Calibre™ domain to generate an encrypted package that can be used on the designers' side.
Finally, the framework is verified using a developed test case.
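The abstract does not specify how the match between the foundry flow and the regenerated Calibre-side package is judged; one plausible check, sketched below under assumed file formats and names, is to simulate the same test-case gauges through both flows and compare the resulting CDs gauge by gauge.

```python
# Illustrative sketch under assumed file formats: verify the matched recipe by
# comparing per-gauge CDs simulated with the foundry's original OPC flow and
# with the regenerated Calibre-side package.
import csv

def compare_gauge_cds(foundry_csv: str, calibre_csv: str, tol_nm: float = 0.5):
    """Return gauges whose simulated CD differs between flows by more than tol_nm."""
    def load(path):
        with open(path, newline="") as f:
            return {row["gauge"]: float(row["cd_nm"]) for row in csv.DictReader(f)}
    foundry, calibre = load(foundry_csv), load(calibre_csv)
    common = foundry.keys() & calibre.keys()
    return {g: calibre[g] - foundry[g] for g in common
            if abs(calibre[g] - foundry[g]) > tol_nm}

# An empty result means the matched package reproduces the foundry behaviour
# on the test case within the chosen tolerance.
# print(compare_gauge_cds("foundry_gauges.csv", "calibre_gauges.csv"))
```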
The model calibration process, in a resolution enhancement technique (RET) flow, is one of the most
critical steps towards building an accurate OPC recipe. RET simulation platforms use models for predicting
latent images in the wafer due to exposure of different design layouts. Accurate models can precisely
capture the proximity effects for the lithographic process and help RET engineers build the proper recipes
to obtain high yield. To calibrate OPC models, test geometries are created and exposed through the
lithography environment that we want to model, and metrology data are collected for these geometries.
This data is then used to tune or calibrate the model parameters. Metrology tools usually provide critical dimension (CD) data rather than edge placement error (EPE, the displacement between the polygon edge and the resist edge) data; however, model calibration requires EPE data for simulation. To work around this problem, only symmetrical geometries are used, since under this constraint the EPE can be easily extracted from CD measurements.
In real designs, however, it is more likely to encounter asymmetrical structures as well as complex 2D structures that cannot easily be made symmetrical, especially for technology nodes of 65 nm and beyond. The absence of 2D and asymmetric test structures in the calibration process forces the models to interpolate or extrapolate the EPEs for these structures in a real design.
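As a minimal worked example of the symmetric-geometry workaround described above (names and units are illustrative), the per-edge EPE of a symmetric line follows directly from the measured and drawn CDs, since both edges move by the same amount:

```python
# Minimal worked example: per-edge EPE of a symmetric line from its CD.
def epe_from_cd(cd_measured_nm: float, cd_drawn_nm: float) -> float:
    """Both edges of a symmetric line move equally, so each edge carries
    half of the total CD deviation."""
    return (cd_measured_nm - cd_drawn_nm) / 2.0

# A 90 nm drawn line printing at 96 nm places each resist edge 3 nm outside
# the drawn polygon edge.
print(epe_from_cd(96.0, 90.0))  # -> 3.0
```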
In this paper we present an approach to extract EPE information from both SEM images and contours generated by the metrology tools for structures on test wafers, and to use it directly in the calibration of a 55 nm poly process. These new EPE structures mimic the complexity of real 2D designs. Each of these structures can be individually weighted according to the data variance. Model accuracy is then compared to the conventional method of calibration using symmetrical data only. The paper also illustrates the ability of the new flow to extract more accurate measurements from wafer data, measurements that are more immune to errors than those of the conventional method.
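The abstract does not state the exact calibration objective; a common way to weight gauges according to the data variance, shown in the hedged sketch below, is an inverse-variance weighted RMS EPE cost, so that noisy measurements (for example, complex 2D sites) influence the fit less than clean ones.

```python
# Hedged sketch: inverse-variance weighting of calibration gauges in a
# weighted RMS EPE cost. The cost form is an assumption, not necessarily the
# paper's exact objective.
import math

def weighted_rms_epe(epe_errors_nm, sigmas_nm):
    """Weighted RMS of (simulated - measured) EPE with weights 1 / sigma^2."""
    weights = [1.0 / (s * s) for s in sigmas_nm]
    total = sum(w * e * e for w, e in zip(weights, epe_errors_nm))
    return math.sqrt(total / sum(weights))

# Noisy 2D gauges (large sigma) pull on the fit less than clean 1D gauges.
print(weighted_rms_epe([1.2, -0.8, 2.5], [0.5, 0.5, 2.0]))
```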
In previous OPC model calibration work, most effort was focused on how to calibrate a model for the best process conditions. With process tolerances decreasing in coming lithography generations, it is increasingly important to be able to predict pattern behavior through the process window. Because a low k1 factor leads to a smaller process window, process window models are required for both optical proximity correction (OPC) and Lithography Rule Check (LRC) applications to ensure silicon success.
In this paper, we calibrate multiple process window models. The resulting models are verified and judged against additional measurement data to demonstrate their quality.
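As a rough illustration of how process-window models might be judged against additional measurement data (the data layout and names are assumptions, not the paper's actual setup), the sketch below computes the RMS CD error per focus/dose condition on hold-out gauges:

```python
# Illustrative sketch: per-condition RMS CD error of a process-window model
# on hold-out measurement data. Record layout and values are assumptions.
import math
from collections import defaultdict

def rms_error_by_condition(records):
    """records: iterable of (focus_um, dose_mj, cd_sim_nm, cd_meas_nm)."""
    errs = defaultdict(list)
    for focus, dose, cd_sim, cd_meas in records:
        errs[(focus, dose)].append(cd_sim - cd_meas)
    return {cond: math.sqrt(sum(e * e for e in v) / len(v))
            for cond, v in errs.items()}

# Example with two process-window conditions.
gauges = [(0.00, 30.0, 65.4, 65.0), (0.00, 30.0, 64.1, 64.8),
          (0.06, 30.0, 61.9, 63.0), (0.06, 30.0, 62.5, 63.4)]
print(rms_error_by_condition(gauges))
```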
Overlay variations between different layers in integrated circuit fabrication can result in poor circuit performance; even worse, they can cause circuit malfunction and consequently affect process yield. Coupled with other lithographic process variations, this effect can be greatly magnified. Consequently, the search for interconnect hot spots should take overlay variations into account. The accuracy gained by including the overlay variation effect comes at the expense of a more complex simulation setup: many issues must be considered, including runtime, the process combinations to be covered, and the feasibility of providing a hint function for correction.
In this paper we present a systematic approach for classifying the durability of interconnects through the lithographic process, taking into account focus, dose, and overlay variations. The approach also provides information about the cause of low durability, which can be useful for building a more robust design.
This classification can be accessed at the layout design level. With this information in hand, designers can test the layout while building up their circuit, and modifications to the layout for higher interconnect durability can be made easily. These modifications would be extremely expensive if they had to be made after design house tape-out.
We verify this method by comparing real wafer failures caused by poor interconnect design against the interconnect durability classifications produced by our method.
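The abstract gives no implementation details of the classification; the sketch below illustrates one plausible scheme consistent with it: enumerate focus, dose, and overlay corners, evaluate a margin metric at each corner, and grade each interconnect site by its worst-case margin. The corner values, thresholds, and the toy margin model are all assumptions.

```python
# Hedged sketch of one plausible classification scheme consistent with the
# abstract: enumerate focus/dose/overlay corners, evaluate a margin metric at
# each corner, and grade a site by its worst case. Corner values, thresholds,
# and the toy margin model are assumptions.
from itertools import product

FOCUS_UM = (-0.06, 0.0, 0.06)
DOSE_REL = (0.97, 1.00, 1.03)
OVERLAY_NM = ((-3, 0), (0, 0), (3, 0), (0, 3), (0, -3))  # (dx, dy) shifts

def classify_site(margin_fn, thresholds=(2.0, 5.0)):
    """Grade an interconnect site by its minimum margin over all corners."""
    worst = min(margin_fn(f, d, ov)
                for f, d, ov in product(FOCUS_UM, DOSE_REL, OVERLAY_NM))
    if worst < thresholds[0]:
        return "low durability", worst
    if worst < thresholds[1]:
        return "marginal", worst
    return "robust", worst

def toy_margin(focus_um, dose_rel, overlay_nm):
    # Toy model: x-overlay eats directly into the margin; defocus costs extra.
    return 6.0 - abs(overlay_nm[0]) - 20.0 * abs(focus_um)

print(classify_site(toy_margin))  # -> ('low durability', ~1.8)
```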