Greg Blackman looks at the optical setups of lithography systems and the software packages involved in their design
In July, IBM, in an alliance with GlobalFoundries and Samsung, announced the first 7nm node test chips. At this scale, according to IBM, more than 20 billion transistors could be placed on a silicon chip the size of a fingernail. This next scale of integrated circuit patterning will boost computing power in the silicon chips used in everything from mobile phones to cars.
The optical design of lithography machines has come a long way since the early mask alignment systems of the late 1950s. At that time, transistor feature sizes were 30µm or 50µm, according to Dr Reinhard Voelkel, CEO of Swiss company Suss MicroOptics; most microprocessors today use 22nm and 14nm technology.
‘Over the last 20 years, illumination has played a major role in improving photolithography,’ commented Voelkel. ‘The lenses for today’s high-end steppers are perfect down to the atomic scale, so you can’t improve much on what was achieved 10 years ago. But illumination made a big change by reducing diffraction effects and aberrations.’
Voelkel will give a keynote presentation on micro-optics for photolithography at the SPIE Optical Systems Design conference, taking place in Jena, Germany from 7 to 10 September.
Optically, the lens systems for deep UV lithography machines – the current mass production technology operating at 193nm – cannot do better than 38nm or 40nm, Voelkel said, so, in order to get to 14nm, semiconductor manufacturers use tricks like multiple patterning and advanced chemistry.
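The 38nm to 40nm figure is consistent with the Rayleigh resolution criterion, CD = k1 · λ/NA. A quick sanity check, assuming a water-immersion numerical aperture of 1.35 (typical for 193nm scanners, though not stated in the article) and a process factor k1 between its theoretical floor of 0.25 and a more realistic 0.28:

```python
# Rayleigh criterion for the minimum printable feature (critical dimension):
#   CD = k1 * wavelength / NA
# NA and k1 values below are illustrative assumptions, not from the article:
# NA = 1.35 (water-immersion 193nm scanner), k1 from 0.25 (hard physical
# limit) up to ~0.28 (an aggressively optimised real-world process).
wavelength_nm = 193.0
na = 1.35

for k1 in (0.25, 0.28):
    cd = k1 * wavelength_nm / na
    print(f"k1={k1}: CD ≈ {cd:.1f} nm")
```

This lands at roughly 36nm to 40nm, which is why reaching 14nm features requires multiple patterning and resist chemistry rather than further optical improvement.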
Deep UV lithography is based on a reticle, a photomask, that has a copy of the pattern to be printed on the silicon. This reticle is projected onto the wafer by the lens system, typically reduced by 4:1 or 5:1 depending on the system. Without any guiding optical elements, light travelling through the lens will accumulate aberrations, and diffraction effects will further reduce the resolution.
In order to reduce aberrations, the illumination system defines set illumination angles so that the light propagates in a well-defined manner, explained Voelkel. Defining the illumination light in a lithography system is referred to as customised illumination or pupil shaping, and is part of a holistic lithography approach.
‘Here, we need optical elements that can distribute the light in the right set of angles without any loss,’ Voelkel said. ‘Diffractive and refractive micro-optics are a key component for doing that, but could only be used when industry introduced the excimer laser as a light source for photolithography systems around 20 years ago.’
He added that, starting with annular (off-axis) illumination to reduce diffraction effects, the illumination settings used for projection lithography further evolved to quadrupole, multipole, and then free-form illumination.
‘Another game changer was the replacement of light-mixing rods by micro-lens-based fly’s eye homogenisers, also referred to as Köhler integrators, in the illumination systems of projection steppers,’ Voelkel commented.
The intensity distribution in the pupil plane of a projection lens system is correlated with the angular spectrum of the illumination light. The finer the plane wave spectrum is, the finer the grid in the pupil plane, and the more precisely aberrations and diffraction effects can be suppressed – a method referred to as pupil filling. ‘Using light-mixing rods the number of nodes correlates with the number of reflections in the rod and is quite limited,’ explained Voelkel. ‘For a micro-lens-based optical integrator, each micro-lens generates a copy of the light source and a much higher degree of pupil filling is obtained.’
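The pupil-filling contrast between the two homogeniser types can be sketched with some back-of-envelope counting. The numbers below are illustrative assumptions, not specifications of any real stepper:

```python
# Rough comparison of pupil-plane sampling ("pupil filling") for the two
# homogeniser types described in the text. Counts are illustrative.

def rod_nodes(reflections_per_axis: int) -> int:
    # A light-mixing rod unfolds into a grid of mirror images of the source:
    # N reflections per transverse axis give (2N + 1) virtual sources per axis.
    return (2 * reflections_per_axis + 1) ** 2

def flyseye_nodes(lenslets_per_axis: int) -> int:
    # A Koehler (fly's-eye) integrator forms one source image per micro-lens.
    return lenslets_per_axis ** 2

print(rod_nodes(3))       # 49 pupil nodes for 3 reflections per axis
print(flyseye_nodes(50))  # 2500 pupil nodes for a 50 x 50 micro-lens array
```

Even with these toy numbers, the micro-lens array gives a grid of source images one to two orders of magnitude finer than the rod, which is the 'higher degree of pupil filling' Voelkel refers to.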
Aberration reduction in lithography lens systems has been refined over many years, but ray tracing software underpins most optical design work for lithography machines. ‘The job is to optimise the lenses to give as high a quality image as possible with an extremely good MTF [modulation transfer function], while controlling cost and manufacturability,’ commented Michael Gauvin, VP sales and marketing at Lambda Research. Lambda Research’s OSLO (Optics Software for Layout and Optimization) program has more than 900 designs and patents in its database, several of which concern lithography-type systems.
Optical designers will create a merit function in the program that drives the optimisation of the system, inputting targets such as best spot size or minimum wavefront error and selecting variables like radius of curvature. ‘For lithography applications, the designer might be looking for a multi-configuration capability, which requires non-sequential ray tracing for micro-optical designs, and to maximise performance with such elements in the optical path,’ explained Gauvin. ‘You have to create a flexible merit function which allows you to achieve this goal, and it’s almost impossible to do that without doing massive ray tracing for multiple configurations.’
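As a minimal sketch of the merit-function idea (not OSLO's actual operands or optimiser), the snippet below minimises a weighted sum of squared deviations from design targets: a thin lens driven towards an assumed 100mm focal length, with a small manufacturability penalty on surface curvature, and radius of curvature as the variable:

```python
# Toy merit function for lens optimisation: a weighted sum of squared
# deviations from targets, minimised over a construction parameter.
# All targets and weights are assumed for illustration.

def focal_length(r1: float, r2: float, n: float = 1.5168) -> float:
    # Thin-lens lensmaker's equation; n assumed to be BK7-like glass.
    return 1.0 / ((n - 1.0) * (1.0 / r1 - 1.0 / r2))

def merit(r1: float) -> float:
    target_f = 100.0   # mm, design target (assumed)
    w_f = 1.0          # weight on focal-length error
    w_mfg = 0.01       # small penalty against steep, hard-to-make surfaces
    f = focal_length(r1, -r1)              # symmetric biconvex lens
    return w_f * (f - target_f) ** 2 + w_mfg * (1.0 / r1) ** 2

# Crude one-variable minimisation by scanning a grid (production codes use
# damped least squares over many variables and hundreds of operands).
best_r1 = min((r * 0.01 for r in range(5000, 20000)), key=merit)
print(f"optimised R1 ≈ {best_r1:.2f} mm, "
      f"f = {focal_length(best_r1, -best_r1):.2f} mm")
```

A real lithography merit function would carry thousands of operands (spot sizes and wavefront errors across field points, wavelengths and configurations), but the structure, targets times weights driven to a minimum, is the same.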
Gauvin added that ray tracing operations in lithography systems using micro-lens arrays could be on the order of one million to 10 million rays to verify image quality. ‘You need to trace a lot of rays to see the uniformity across the field.’
‘For lithography it’s all about the purity of the wavefront, so we calculate the root mean square (RMS) wavefront deviation and Zernike coefficients and other properties,’ remarked Dr John Rogers, senior scientist of imaging optics at software supplier Synopsys. He added that simulating the point spread function is also important for lithography applications for determining how wide the feature will be after the photoresist is developed.
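A bare-bones illustration of the RMS wavefront calculation Rogers mentions: sample a wavefront over a unit pupil, remove the piston (mean) term, and take the root mean square of what remains. Here the wavefront is an assumed 0.05 waves of pure Zernike defocus, whose Noll normalisation makes the expected RMS equal to the coefficient itself:

```python
import math

# Sketch of an RMS wavefront calculation on a discrete unit-pupil grid.
# Illustrative only: real design codes fit full Zernike polynomial sets
# with careful pupil sampling and apodisation.

def zernike_defocus(rho: float) -> float:
    # Noll-normalised Zernike defocus: Z4 = sqrt(3) * (2*rho^2 - 1),
    # which has unit RMS over the unit disc.
    return math.sqrt(3.0) * (2.0 * rho * rho - 1.0)

# Build a test wavefront: 0.05 waves of pure defocus (assumed amplitude).
N = 201
samples = []
for i in range(N):
    for j in range(N):
        x = -1.0 + 2.0 * i / (N - 1)
        y = -1.0 + 2.0 * j / (N - 1)
        rho = math.hypot(x, y)
        if rho <= 1.0:                       # keep points inside the pupil
            samples.append(0.05 * zernike_defocus(rho))

mean = sum(samples) / len(samples)           # piston term, subtracted below
rms = math.sqrt(sum((w - mean) ** 2 for w in samples) / len(samples))
print(f"RMS wavefront error ≈ {rms:.4f} waves")  # ~0.05 for this input
```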
There is also beam propagation. ‘There are some coherence effects that are either desired or not desired depending on the application, so there’s some physical optics beam propagation that the software carries out in order to verify that you’re getting what you want,’ Rogers added.
The pupil shaping technique, where only certain regions of the pupil of the projection system are illuminated in order to improve resolution, makes use of partial coherence. Zemax’s software OpticStudio contains a partially coherent image analysis feature, which takes into account the effect of source coherence.
Whether a light source is fully coherent or fully incoherent will give different imaging properties. In general, however, the illumination will fall somewhere in the middle – it will be partially coherent. ‘[For partially coherent sources] there will be different interference effects across the image, and modelling those accurately can be very important,’ Chris Normanshire, engineering services manager, Europe at Zemax, commented. ‘We have a number of software tools for optimising systems like those used for lithography, but there’s a specific tool in OpticStudio where you can project images onto your image plane and determine how the source coherence will affect the resolution, for example.’
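The interference effects Normanshire describes reduce, in the simplest case, to the general interference law for partially coherent light: for two overlapping equal-intensity fields, the fringe visibility equals the degree of coherence γ, with γ = 0 fully incoherent and γ = 1 fully coherent. A toy sketch (not Zemax's partially coherent image analysis):

```python
import math

# General interference law for two partially coherent fields:
#   I = I1 + I2 + 2 * gamma * sqrt(I1 * I2) * cos(phase)
# where gamma is the degree of coherence between them.

def fringe(intensity1: float, intensity2: float,
           gamma: float, phase: float) -> float:
    return (intensity1 + intensity2
            + 2.0 * gamma * math.sqrt(intensity1 * intensity2)
            * math.cos(phase))

for gamma in (0.0, 0.5, 1.0):
    i_max = fringe(1.0, 1.0, gamma, 0.0)       # constructive interference
    i_min = fringe(1.0, 1.0, gamma, math.pi)   # destructive interference
    visibility = (i_max - i_min) / (i_max + i_min)
    print(f"gamma={gamma}: fringe visibility = {visibility:.2f}")
```

For equal intensities the visibility comes out exactly equal to γ, which is why the degree of source coherence directly sets the contrast, and hence the effective resolution, of the printed image.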
‘Lithography is more than just the projection optics that do the illumination,’ noted Rogers. After the wafer is made, it has to be tested, as do the reticles. There are a lot of ancillary tools for lithography, such as focus sensors that keep the image focused on the wafer, overlay sensors that position the wafer to a fraction of a line width, and other alignment tools.
‘All of those ancillary systems have to operate at precisions competing with whatever line width is being printed,’ explained Rogers. ‘If it’s EUV [extreme ultraviolet], they have to be able to see the features that the EUV machine is printing, but they typically can’t use EUV.’
The same is true of 193nm lithography. ‘These tools are typically one generation back in technology, but they have to compete with the main printing technology,’ he continued. ‘For that reason, all of those alignment systems are extremely challenging to design optically. It’s not just about the projection optics when it comes to lithography systems.’ Synopsys has done a lot of modelling in the area of inspection and alignment tools in recent years.
All three software spokespeople, Gauvin, Rogers and Normanshire, spoke about optimising for the as-built system, rather than the theoretically perfect as-designed model. This involves inputting a certain degree of tolerance into the model to give a more realistic view of what the system will be like when it’s built.
‘Typical tolerances in lithography these days are very tight values that are impossible for ordinary optical systems to achieve,’ noted Rogers. Synopsys provides some software tools that are intended to help reduce the tolerance sensitivity of optical systems. ‘We use them routinely in our design work, and we assume that those designing lithography systems are also using them,’ he said.
‘It doesn’t matter at all what the as-designed performance is, what matters is the as-built performance,’ Rogers continued. ‘Within the optimisation software there will be a figure of merit for the optical system to determine how good it is, which is usually something like the RMS wavefront variance. The figure of merit is the quantity that the optimisation algorithm is working on to try to reduce to a small value.’
Historically, the merit function has been constructed to consist of the RMS wavefront variance for the system in its theoretical perfect state. ‘The old technique was to drive that to a minimum and afterwards ask how tight the tolerances need to be in order to get acceptable performance,’ Rogers said. ‘You have to design the system to be better than you need because the tolerances are going to make it somewhat worse.’
Synopsys has been working not just on optimising the as-designed performance, but on including in the merit function some calculation of what the performance will be when tolerances are considered. The goal is for the optimisation merit function to represent the as-built performance rather than the as-designed performance.
‘This makes a considerable difference in the design,’ Rogers stated. ‘If you plot as-designed performance against as-built performance, you would hope there would be some degree of correlation, so that systems with good as-designed performance would also have good as-built performance. We found that with traditional merit functions there was almost no correlation, which is quite frightening really, because it means the optimiser is trying to optimise something that has very little correlation to what you care about.’
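Rogers' point can be reproduced with a toy model. Assume a nominal merit function that is asymmetric around its minimum, as happens near a constraint such as an edge-thickness or vignetting limit (an assumption here, not a claim about Synopsys's method). The minimum of the tolerance-averaged, as-built merit then sits at a different design point than the nominal optimum:

```python
import math
import random

# Toy as-designed vs as-built optimisation. The nominal merit has a steep
# "cliff" on one side of its minimum, so the best *expected* performance
# under manufacturing error lies further from the cliff than the nominal
# optimum. Function shapes and tolerances are assumed for illustration.

def nominal_merit(x: float) -> float:
    # Quadratic target plus a steep one-sided penalty (the "cliff").
    return (x - 1.0) ** 2 + math.exp(10.0 * (x - 1.1))

def as_built_merit(x: float, sigma: float = 0.1, trials: int = 20000) -> float:
    # Expected merit when the parameter is built with Gaussian error sigma.
    rng = random.Random(42)  # fixed seed: same perturbations for every x
    return sum(nominal_merit(x + rng.gauss(0.0, sigma))
               for _ in range(trials)) / trials

grid = [0.5 + 0.01 * k for k in range(100)]
best_nominal = min(grid, key=nominal_merit)
best_as_built = min(grid, key=as_built_merit)
print(f"nominal optimum  x = {best_nominal:.2f}")
print(f"as-built optimum x = {best_as_built:.2f}")  # backs away from the cliff
```

Optimising the nominal merit parks the design close to the cliff, where tolerances are punishing; optimising the expected as-built merit retreats to a design that is slightly worse on paper but markedly better once built, which is the correlation Rogers describes.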
He went on to say that by modifying the merit function to include some measure of the as-built performance, then there is a strong correlation between what the optimiser is trying to achieve and what the user cares about. ‘That makes all the difference, and you get markedly different designs out of the software.
‘This functionality in the software won’t change the general look of a lithography optical system design, but it will change the details of how individual elements inside look, and how they are arranged,’ Rogers concluded.