Tom Eddershaw investigates the use of lidar in driverless cars and driver assistance systems
A series of maps created using lidar was released for highly automated vehicle testing on 20 July. A day later, a consortium of German car makers including BMW, Daimler, and Audi agreed to pay €2.5 billion for Here, the company behind the maps, according to the Wall Street Journal.
Lidar, or light detection and ranging, can be used to create the maps that autonomous vehicles require to navigate, and can act as a set of eyes for the autonomous system by providing instant 3D data of the surroundings. This means cars can brake automatically, avoid unexpected obstacles, or ensure that they stay in lane on the road.
Autonomous cars are being taken seriously by large car manufacturers; Daimler hopes to test self-driving trucks on German motorways this year as a continuation of work it has been doing in Nevada, USA, while Google’s driverless car project is probably the best known of the research being done in this area.
While they are still some way from being a common sight in cities around the world, the ever-increasing number of Advanced Driver Assistance Systems (ADAS) is keeping photonics companies involved in lidar applications more than busy.
‘ADAS is the concept,’ explained Wade Appelman, vice president of sales and marketing at SensL, a company providing silicon photomultipliers for lidar applications. ‘There are all types of sub-functions that fall under ADAS. One is automatic braking – where if the car in front of you brakes and you don’t brake fast enough, the car will brake for you; that uses a form of lidar. But this goes all the way through to what Google is doing, where the whole car is controlled. It’s everything from lane monitoring and automatic braking to fully automated self-driving vehicles.’
Driver assistance systems have created a market for lidar, and the development of parking assists and automatic braking has helped further the technology. But lidar wasn’t always an option for these ADAS applications, as Appelman explained: ‘Lidar had been viewed as too expensive and therefore not widely deployable. But because of the cost reduction of the sensors used in lidar, such as silicon photomultipliers (SiPMs), the use of lidar in ADAS is becoming more viable; it’s becoming more talked about, it’s becoming viewed as more of a solution, even in the last six months or a year.’
Lidar works by sending out a pulse of light, waiting for it to be reflected by an obstacle, and recording the time taken for it to return. That time is then used to calculate the distance from the source to the object, and the measurements can be combined to create incredibly detailed three-dimensional maps.
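The time-of-flight principle is simple enough to sketch in a few lines of Python. This is an illustrative calculation only, not taken from any particular system described in the article; the 200ns pulse time is an assumed example value:

```python
# Time-of-flight ranging: the light travels to the object and back,
# so the one-way distance is (speed of light x round-trip time) / 2.
C = 299_792_458.0  # speed of light in a vacuum, m/s


def tof_distance_m(round_trip_s: float) -> float:
    """Distance to the reflecting object, given the round-trip time."""
    return C * round_trip_s / 2.0


# A return pulse arriving 200 ns after emission corresponds to roughly 30 m.
print(f"{tof_distance_m(200e-9):.2f} m")  # prints "29.98 m"
```

A scanning lidar repeats this measurement across many angles to build up the point cloud from which 3D maps are assembled.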
For these systems to be mounted on vehicles, they have to be rugged, cheap, and able to work accurately in full sunlight. Appelman said that system manufacturers have conventionally used avalanche photodiodes (APDs) to collect the returned light, but that there is rising demand for the SiPMs that SensL manufactures. He noted that customers find APDs cost effective, but that they perform poorly in bright light – a requirement for any lidar system that is to operate outdoors.
‘You need to be able to see a single photon return even in really bright light, and the only way to do that really well is with a SiPM. We have now had many customers go out and test these in a wide range of lidar applications, and say “SiPMs really do work and they actually have advantages over APDs”,’ Appelman said.
In addition to working better technically, SiPMs also have some business and system-level advantages. ‘They are more cost effective, easier to use because of their lower voltage, which is great on smaller devices, and they are very uniform, which makes the pulse very easy to review,’ Appelman added.
Appelman noted that SiPMs weren’t an option until recently, as they were too expensive and unproven. He said: ‘But what’s happened is SiPMs have picked up momentum in other applications.’ They were developed for the medical imaging market and initially used in nuclear medicine and PET scanners; a typical PET scanner might use 30,000 such sensors.
‘From a SensL perspective, probably 50 per cent of our customer base had been in medical imaging, 25 per cent had been in hazard or threat detection or high energy physics, which meant lidar and biophotonics were a smaller part of our overall market,’ Appelman commented. ‘But today lidar probably equates to 20 per cent of our business. If I was projecting, I could envisage lidar or 3D ranging, in the next five years basically being our largest market. We are not there yet, but I could see easily that 3D ranging and sensing space, of which ADAS is one example application, will turn into our largest segment.’
This statement will be reinforced when SensL releases a red-sensitive sensor in September this year. Appelman explained: ‘You will see us pushing more as our new products come out; you’ll see us beefing up our presence in the use of lidar. The red-sensitive sensor is going to peak at 635nm, where we will have 2.7 million amps per watt responsivity, and at 900nm we will have 960,000 amps per watt, all of this at an internal gain of 10 million. A typical APD can only achieve 25-50 amps per watt at a gain of 100.’
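The quoted responsivity figures can be put into perspective with a rough back-of-the-envelope comparison. The 1nW return power below is an assumed illustrative value, and the calculation deliberately ignores noise, bias conditions, and saturation effects, so it is only a sketch of the relative signal levels:

```python
# Signal current = responsivity (A/W) x incident optical power (W).
# Responsivity figures are the ones quoted in the article; the return
# power is an assumed example value, not a real measurement.
RETURN_POWER_W = 1e-9        # assume a 1 nW optical return pulse

sipm_responsivity = 2.7e6    # A/W at 635 nm, quoted for the SiPM
apd_responsivity = 50.0      # A/W, upper figure quoted for a typical APD

sipm_current = sipm_responsivity * RETURN_POWER_W  # 2.7 mA
apd_current = apd_responsivity * RETURN_POWER_W    # 50 nA

print(f"SiPM: {sipm_current:.1e} A, APD: {apd_current:.1e} A")
print(f"ratio: {sipm_current / apd_current:.0f}x")  # prints "ratio: 54000x"
```

On these figures alone the SiPM delivers a signal current four to five orders of magnitude larger for the same optical return, which is why single-photon returns remain detectable even against bright ambient light.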
Hamamatsu is another optical sensor manufacturer that considers driverless cars a prospect for the future. In a statement, the company commented: ‘We definitely see self-driving vehicles as future markets, as car makers will have to use multiple light-based sensor systems in order to meet the safety requirements for autonomous driving. One of them will surely be lidar for distance measurement. Today we see lidar used for pedestrian protection and automatic braking and cruise control (also at night, as with infrared light), which is an important new topic for the 5 star NCAP rating.’
Hamamatsu manufactures opto-semiconductor hybrid devices which can be used in lidar applications. Hybrid devices integrate a Hamamatsu photosensor such as a silicon avalanche photodiode (APD) or an InGaAs APD with a front-end IC for reading out the photosensor signals.
These sensors offer reduced susceptibility to external noise and improved frequency characteristics; compared with monolithic devices, the dedicated fabrication process provides better performance and a shorter device evaluation period.
The devices were created using the company’s analogue CMOS technology. The CMOS IC can be customised to perform time-to-distance conversions, allowing it to act as a distance measuring device for applications such as lidar data production.
Choosing the right wavelength
As lidar systems are used in uncontrolled environments, safety is a concern and operating wavelengths must be carefully considered. This means laser glass manufacturers such as Schott must provide components that work very reliably in these spectral bands. Dr Simi George, a development scientist and laser materials expert at Schott, explained: ‘When you have a laser that is being operated for civilian applications, you don’t want that laser beam hitting anyone’s eye and damaging them.’
Transmission to the retina begins to drop off at around 1,400nm, so, as George stated, ‘anything past that wavelength is attractive’. To use higher power while staying eye safe, Schott makes use of wavelengths in the communications band, 1.5-2µm.
There are other considerations when choosing a wavelength. George said: ‘You have to pick a wavelength that doesn’t get absorbed into the air itself. If you go too long into the IR wavelengths the atmosphere absorbs the laser radiation... if you send a laser beam from point X and it gets absorbed by the time it reaches point Y, it’s not going to come back. So the selection of wavelength is really important.’
To ensure the light returns, the system requires highly coherent light, which makes the laser gain material producing the beam important. George noted that over the past two years manufacturers have been moving towards glass because of the higher coherence of the laser light the material can provide. She added: ‘There are different physical phenomena going on. In a crystalline environment, the energy level structure of the ion that is emitting the light changes. The chemical environment in which the rare earth ion sits has an influence on the laser emission due to the broadening mechanisms at play. Laser emission from glass has been demonstrated to have better beam qualities in these cases.’
Glass is also considered reasonably cheap, according to George, but it has some disadvantages. ‘One of the things that separates glass from crystals is mechanical hardness. Because of the ordered nature of the crystal structure, a crystal is very hard, whereas glass is an amorphous solid, so it doesn’t have an ordered structure. Because of that it’s much easier to break than a crystal. That’s one of the things that we have been grappling with in order to move forward in high volume applications like this. We have to make sure we do other protective things, like put hard coatings on the glass, so that they work properly.’
Researchers have tried to merge glass and crystals in a material called glass ceramic, which sits between the two. However, George said: ‘Unfortunately the material quality that you need to get laser emission out of the glass ceramic is just not high enough. You have crystals in the glass structure, so it scatters a lot more, and you have high transmission losses. Getting laser light out of glass ceramics has been difficult; and getting really high quality glass ceramics without any impurities has been difficult. Any high quality glass ceramic suitable for lasers and available on the current market is really expensive.’
While George sees a lot of promise in driverless cars, the market is following an exponential growth pattern and currently ‘we are on the flat part’, she said. It takes time to validate components for this sort of market. George added: ‘It takes huge investment from private companies in order to do product development like this – we just have to wait.’