Stephen Mounsey looks at some of the difficulties involved in taking the spectrometer to the sample
A large part of the electronics industry is devoted to making devices smaller, lighter, and cheaper; applications which may once have required a room full of equipment can now be undertaken by a device that fits into a hand or a pocket, and that can be carried around all day. Portability of electronic devices requires them to be sturdy, battery powered, and reasonably compact. In the public eye, these developments are centred on consumer electronics, such as phones and music players, but photonics has benefitted from the same miniaturisation revolution in the form of spectrometers. Portable spectrometers are now being used across a veritable smorgasbord of innovative applications. Rather than collecting samples in the field and bringing them back to a laboratory, users of portable spectrometry are able to take the spectrometer to the sample, allowing many more samples to be analysed. The manufacturers of the devices claim in many cases that the performance of their portable devices approaches that of their lab-based offerings.
So where did this start? While several companies are producing modern miniature spectrometers, Ocean Optics claims to have been the first to develop the devices. In 1989, the company’s founder Mike Morris was working for the University of South Florida with a view to developing a way of measuring the pH of the oceans at a depth of 100m. A $500,000 research grant led to a reversible colour-changing pH sensor, but pH measurements at depth necessitated the development of a portable device featuring its own light source, gratings and CCD detector. The spectrometer sat in the boat, while fibre optics connected it to the pH sensor in the water. Upon completion of the project, Morris realised the utility of the device in field applications, and founded Ocean Optics based on the idea. According to Marco Snikkers, commercial director at Ocean Optics, the company has sold 150,000 spectrometers since its incorporation, and its competitors have had similar successes in their own niches.
Over the subsequent 20 years, the capability of portable spectrometers has greatly improved, according to Greg Neece, president of Dutch spectrometry specialist, Avantes. ‘The technology has been adapted over the years, and it has become more and more refined. The early instruments were similar in form to what we’re producing these days, but the capability has just blossomed. We now have low cost, small, miniaturised spectrometers with capabilities approaching those of full-size devices,’ he says.
Neece says that Avantes is careful to work closely with each customer to ensure that the spectrometers supplied are tailored for each given problem. He notes that the company has a few off-the-shelf spectrometer products for common applications, such as LED measurement and calibration, but that most new sales require a degree of consultation in order to make sure the spectrometers do the job. ‘There are not a lot of field changes that the customer can do to an instrument once it’s constructed,’ he explains, ‘so we spend a lot of time with our customer understanding what their application will be, in order to properly define the optical piece. It’s all fixed-optic, and it’s all solid-state; once a device is calibrated at the factory, it’s not going to need recalibrating after that.’
One common application for portable spectroscopy is in food and beverage production, and Neece recounts the example of a customer wishing to quickly and easily measure the sugar content of potatoes. In order to meet the demands of this application, Avantes needed to know where the customer expected to see meaningful absorptions in the spectrum. Whether the absorption peak is of a broad or narrow bandwidth is also important: ‘If it’s a nice big, fat, broad peak, such as a sugar molecule might be, then we don’t need as high a resolution and we can increase our throughput,’ says Neece. On the other hand, if the absorption is expected to be narrow, the spectral resolution may need to be increased (often by using a narrower slit), and the throughput will decrease accordingly. ‘These are the trade-offs we play with when we start to work through an application,’ he adds.
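The trade-off Neece describes can be sketched numerically: the entrance slit's width, projected through the grating's dispersion, sets the optical resolution, while throughput scales roughly with slit width. A minimal illustration, using invented dispersion and slit values rather than figures from any Avantes instrument:

```python
# Sketch of the slit-width trade-off: a narrower entrance slit improves
# spectral resolution but admits less light. All values are illustrative,
# not specifications of a real spectrometer.

def spectral_resolution_nm(slit_um, dispersion_nm_per_mm):
    """Approximate optical resolution: slit width projected through the
    grating's linear dispersion (ignores pixel and aberration limits)."""
    return (slit_um / 1000.0) * dispersion_nm_per_mm

def relative_throughput(slit_um, reference_slit_um=200.0):
    """Light throughput scales roughly linearly with slit width."""
    return slit_um / reference_slit_um

for slit in (200, 50, 10):  # slit widths in micrometres
    res = spectral_resolution_nm(slit, dispersion_nm_per_mm=30.0)
    thru = relative_throughput(slit)
    print(f"{slit:4d} um slit: ~{res:.2f} nm resolution, "
          f"{thru:.0%} of reference throughput")
```

A broad absorption (the ‘big, fat’ sugar peak) tolerates the coarse resolution of a wide slit, so the wide-slit, high-throughput configuration wins; a narrow feature forces the narrow slit and the accompanying loss of signal.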
The workings of spectrometers haven’t changed much since their introduction, and the major manufacturers’ devices all work on a similar principle: incident light is diffracted by way of a grating, and the diffracted light falls onto an electronic detector. However, according to Neece, the technology of the detectors themselves has advanced substantially over the years. Older devices, or simpler modern devices, use a linear silicon CCD sensor, whereas most modern devices use a 2D CCD array, which may be silicon-based or based on advanced direct band-gap semiconductors, such as indium gallium arsenide (InGaAs).
‘Customers can choose the detector they want to use based on the application’s demands,’ explains Neece. ‘In the early days there was basically one detector – it was a silicon CCD. Now we have 10 or more, and we’re always looking at new ways to keep our detectors up-to-date with the market: back-thinned CCDs, new adaptations of InGaAs detectors… and there are detectors that are similar to CCDs, but with features such as individually addressable pixels, so that users can control the part of the chip they wish to use to a higher degree,’ he explains.
Silicon-based detectors are primarily for visible light, but Neece explains that their wavelength range can be altered to an extent by the application of coatings to the arrays, either in their entirety or only a portion of them. ‘It works quite well,’ he says. Coated silicon detectors are useful up to wavelengths of 1,100nm or so, so long as the application has a strong signal, but beyond this range, other technologies are needed, often at a significantly higher cost. ‘There’s a fundamental price point jump when you leave the silicon CCD,’ warns Neece. ‘This often means going from a very low cost detector to one that costs several thousand dollars.’ The costly direct-bandgap semiconductor detectors are led by InGaAs, but also include lead sulphide and others. ‘This is where we really want to try to help the customer; they’ve got a cost goal they’re trying to meet, and sometimes we can get there [using a silicon-based detector] by tricks and knowledge, but other times you just can’t do the job with a low cost detector, and you’ve got to bite the bullet.’
Snikkers from Ocean Optics reiterates this point: ‘Ideally we would like to have one detector that can cover the whole spectrum, but currently that’s not available. Silicon is much cheaper than InGaAs, and it has a higher yield during manufacturing, so detectors are much cheaper,’ he says. Additionally, silicon detectors can be bought in bulk from many places, as they are used in many applications. ‘InGaAs detectors, on the other hand, are already quite spectroscopy-specific,’ notes Snikkers, citing this as a reason for the higher component price. ‘If a user needs to go into the IR to do a specific application that has spectral features starting from 1μm and up, then there is an additional price for that.’
A growing application space for portable spectrometry is near-infrared (NIR) spectroscopy, and InGaAs detectors are required for the wavelengths involved (800nm-2.5μm). NIR is similar to Raman spectroscopy in that, rather than resolving individual electronic transitions within sample molecules (as in UV/visible spectroscopy) or fundamental vibrational modes (as in Fourier transform IR (FTIR) spectroscopy), NIR resolves overtones and combination bands of the molecule’s vibrations. The sensitivity is lower than that of other techniques, but NIR can give a good ‘fingerprint’ of the species present in a bulk sample.
Andrew King is division manager at Pacer International, a UK-based distributor of B&WTek’s range of spectrometers – miniature devices included. King explains that the portability of NIR spectrometers depends in part upon advances in detectors: ‘InGaAs technology came into play about five years ago,’ he says. ‘The spectral range of InGaAs arrays has increased in that time from an initial 900nm-1.7μm, first to 900nm-2.2μm, and then to the current range of 900nm-2.5μm. [Near IR spectrometers] used to use single-point InGaAs detectors or photodiode arrays, but then they moved to multiple element InGaAs arrays, similar in structure to a CCD array.’ The resolution of the detectors also increased: ‘In the early days of an InGaAs array you’d get maybe 256 pixels, and about 18 months later it went to 512, and now it’s at 1,024 pixels on a single chip… with higher resolutions on their way,’ says King.
But why do more pixels in the detector make for a more effective spectrometer? King explains: ‘[Spectrometers] use a diffraction grating to disperse the light onto an array of detectors,’ he says, noting that light is diffracted differently according to its wavelength, leading to a separated spectrum at the detector. ‘That spread of light then activates each pixel depending on the intensity of the light that falls onto each pixel. So if we can increase the number of pixels in the detector, then we can improve the spectral resolution of the device.’
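King's point can be made concrete with simple arithmetic: the grating spreads the spectrum across the array, so the wavelength span divided by the pixel count gives the sampling interval per pixel. A sketch, using the 900nm-2.5μm InGaAs range and the array generations King mentions (and simplifying by assuming the dispersion is perfectly linear, which real gratings only approximate):

```python
# Why more pixels improve spectral resolution: the dispersed spectrum is
# shared out across the detector array, so each doubling of the pixel
# count halves the wavelength interval sampled by each pixel.

def nm_per_pixel(start_nm, end_nm, pixels):
    """Wavelength sampling interval for a spectrum dispersed evenly
    across a linear detector array (assumes linear dispersion)."""
    return (end_nm - start_nm) / pixels

# A 900nm-2.5um spectrum on successive generations of InGaAs arrays:
for pixels in (256, 512, 1024):
    print(f"{pixels:5d} pixels -> {nm_per_pixel(900, 2500, pixels):.2f} nm per pixel")
```

The pixel pitch sets only one limit, of course; as Neece's slit example shows, the optics must deliver a spectrum narrow enough for the extra pixels to matter.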
A mountaineer uses a Jaz spectrometer to measure the air’s oxygen content. Image courtesy of Ocean Optics.
King believes that such rapid development may be a factor contributing towards the technique’s relatively slow uptake: ‘The technology evolved a lot quicker than the applications could keep up with it. With near-infrared, few people have spent time looking at the application and understanding what information they can get from the spectral data. With Raman and conventional IR spectroscopy, you’re looking at the direct relationships between the energy of the photons and the molecular structure. Near-infrared on the other hand looks at overtones; it’s not a direct relationship, and it’s very complicated to actually understand what you’re seeing in the spectrum. Some people might be using NIR as a monitoring technique; as long as they have a starting point, and they know that something is changing, they’ll see that the NIR spectrum changes,’ he says. King notes that many applications have been written about in specialist publications detailing ways in which NIR has been used in the food and agricultural markets, where the technique can provide a good idea of the moisture and fat content of a sample. At a research level, he notes, there are applications under development for the characterisation of carbon nanotubes using NIR.
Towards universal versatility
Again, Ocean Optics is working along similar lines: ‘One of our customers is using NIR to measure the quality of food… they used to do it in big sorting machines, but now they’re doing a system where the grocery store itself can monitor the quality of the food,’ explains Marco Snikkers, noting that the machines monitor both freshness and quality of produce. ‘Orangina, for example, is only made from A-grade oranges, and the manufacturer pays a premium for those oranges, which forces the growers to do that quality check before selling,’ he says. Food producers have had sorting machines for some time, he notes, but the advent of affordable NIR spectroscopy means that these quality checks now extend further down the supply chain in order to monitor quality at all points.
NIR spectroscopy is useful because it produces highly characteristic spectra from a bulk sample, but the high cost of the InGaAs detector means that it is not suitable for every application.
Wilton, Maine-based portable spectrometer producer MicrOptix specialises in hand-held visible spectrometers, both as stand-alone devices and OEM components. Norm LaVigne, director of sales and marketing at the company, explains that MicrOptix’ i-LAB handheld visible-light spectrometer has been used for similar food and beverage applications to the NIR techniques detailed above: ‘If you’ve got a batch of a food product, and its different ingredients absorb at different wavelengths, we can actually look at that whole visible spectrum and have the device compare that spectrum to a “golden sample”, reporting back on how closely it correlates,’ he says. This analysis is carried out on-board the device.
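One common way to score how closely a measured spectrum matches a reference is a correlation coefficient across the sampled wavelengths. The sketch below uses Pearson correlation with invented intensity values; it illustrates the general idea, not the i-LAB's actual (undisclosed) algorithm:

```python
# Sketch of a "golden sample" comparison: correlate a newly measured
# spectrum against a stored reference and report the match. The spectra
# are made-up six-channel intensity values for illustration only.
from math import sqrt

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length spectra."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    sd_a = sqrt(sum((x - mean_a) ** 2 for x in a))
    sd_b = sqrt(sum((y - mean_b) ** 2 for y in b))
    return cov / (sd_a * sd_b)

golden = [0.10, 0.35, 0.80, 0.95, 0.60, 0.25]  # reference ("golden") batch
batch  = [0.12, 0.33, 0.78, 0.97, 0.58, 0.27]  # new batch under test

print(f"Correlation with golden sample: {pearson(golden, batch):.3f}")
```

A score near 1.0 flags the batch as matching the reference; a production system would set a pass/fail threshold appropriate to the product's tolerances.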
According to Ocean’s Snikkers, the algorithms used to process spectroscopy data in this way and get meaningful results used to be the work of research groups at universities, but customers now produce their own. Ocean Optics spent three years developing its flagship Jaz handheld spectrometer, and Snikkers notes that on-board processing is one of the most important features of that device: ‘It has an enormous amount of intelligence inside it,’ he says. ‘It has a complete microprocessor inside, and either we ourselves or our customers can load algorithms into the device, and adjust the functionality of the user interface to adapt to the users’ requirements. For example, we’ve just introduced our light meter to measure LEDs in an office space or a living room and measure the colour temperature, and so the Jaz becomes a colour temperature measuring piece of equipment. Other users put their chemometric models into the device and go on to measure water quality in lakes. Both customers are using the same spectrometer, but by applying their own applications, they change the functionality and the feature set of the equipment. We think that by adding that intelligence, we allow customisation of the spectrometer for a specific application – either by us or our OEM customers.’
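The chemometric models Snikkers mentions are typically fitted offline (by PLS regression or similar) and then loaded onto the device as a set of coefficients applied to each new spectrum. A minimal, hypothetical sketch of that deployment step, with invented coefficients standing in for a real fitted model:

```python
# Sketch of an on-board chemometric prediction: a linear model fitted
# offline is reduced to coefficients plus an intercept, and applying it
# to a spectrum is a single dot product. The four-channel "model" and
# the spectrum below are invented for illustration.

def predict(spectrum, coefficients, intercept):
    """Apply a pre-fitted linear chemometric model to one spectrum."""
    return sum(x * c for x, c in zip(spectrum, coefficients)) + intercept

# Hypothetical model mapping four spectral channels to a quality index:
coeffs = [0.8, -0.3, 1.1, 0.05]
intercept = 2.0
spectrum = [0.2, 0.5, 0.1, 0.9]

print(f"Predicted quality index: {predict(spectrum, coeffs, intercept):.3f}")
```

Because the per-spectrum work is this light, it runs comfortably on an embedded microprocessor, which is what lets customers swap in their own models (for water quality, colour temperature, and so on) without changing the hardware.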
While each piece of equipment has limits to its range of operation, due to the sensitivity of the detector, for example, the trend to watch for seems to be a move towards greater versatility, achieved through adapting a device’s software to meet each application as required. Having mirrored consumer electronics in the sense of smaller and lighter devices, spectrometers seem now to be following the route of cross-device integration; perhaps one day there will be a universal analysis device, built into the back of your mobile phone.