With the launch of the Oculus Rift and Microsoft's HoloLens in March, user adoption of virtual and augmented reality devices is set to increase. Jessica Rowbury looks at how optical modelling is aiding the development of VR and AR systems
Virtual and augmented reality headsets have created something of a buzz over the last few years, with tech giants like Microsoft, Facebook-owned Oculus, HTC, Samsung and Sony investing heavily in an attempt to lead this emerging market.
Although the technology has been slow to take off in the consumer sector due to pricing and user-experience issues, developers are creating improved systems that seem set to transform gaming, entertainment, and even everyday life. And with the release of two much-anticipated devices – Microsoft’s HoloLens and the Oculus Rift – at the end of March, 2016 is expected to be the year when virtual and augmented reality devices hit the mainstream.
Both virtual reality (VR) and augmented reality (AR) displays can be modelled within optical software, allowing designers to identify whether light in their system is behaving as expected, and if not, alter its design. Engineers can therefore create a device that produces the intended user experience, without having to go through the lengthy and expensive process of building prototypes.
In general, ray-tracing – a technique used to simulate the way light travels through an optical system and how it interacts with different materials – is used to design VR or AR devices. ‘Often, with a visual system such as VR or AR, the system is traced from the eye toward the display, so the rays are traced backward,’ explained Dr John Rogers, senior scientist, imaging optics at modelling software provider Synopsys. ‘Since you know where the eye pupil is and the direction you’re looking at, it is easier to trace rays in the reverse direction – you simply start the rays and find out where they land on the detector.’
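As a rough illustration of the backward approach Rogers describes, the paraxial sketch below starts rays at the eye pupil for several gaze angles, passes them through a thin lens, and finds where they land on the display. All distances and the focal length are hypothetical example values, not taken from any real headset design.

```python
import math

def reverse_trace(gaze_deg, pupil_to_lens=20.0, lens_to_display=40.0, f=35.0):
    """Trace one paraxial ray backward from the eye pupil to the display.

    Distances are in millimetres; the values are illustrative only.
    """
    u = math.tan(math.radians(gaze_deg))  # ray slope leaving the pupil
    y = u * pupil_to_lens                 # ray height at the thin lens
    u = u - y / f                         # thin-lens refraction
    return y + u * lens_to_display        # height where the ray lands on the display

for angle in (0.0, 5.0, 10.0):
    print(f"gaze {angle:4.1f} deg -> lands {reverse_trace(angle):6.2f} mm off-axis")
```

Because the ray starts at a known pupil position and gaze direction, each trace simply reports where that line of sight intersects the display – exactly the "start the rays and find out where they land" logic quoted above.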
There are notable differences between VR and AR devices, which in turn present differences when designing these systems with optical software. VR systems, such as Oculus Rift, create an immersive computer-generated scene. ‘In this optical device, the user has a complete scene generated through the optics and the user can move around it by eye movement,’ said Michael Gauvin, vice president of sales and marketing at Lambda Research, which also supplies an optical modelling platform.
Augmented reality devices, such as the Google Glass or Microsoft’s HoloLens, layer computer-generated enhancements on top of real-life scenes; the user continues with everyday tasks, while interacting with additional digital information.
It is somewhat easier to design a device, such as a headset, that completely encloses the user in a virtual scene than one that provides a simultaneous view of both the outside world and computer-generated data. AR systems contain a higher number of light paths than VR systems, and this can affect the quality of the scene.
‘[With] Google Glass, you are looking through the glasses at the outside world; that’s one path. The other path is the one where you’re taking an image and displaying it somewhere on the glasses that you’re wearing,’ explained Gauvin. ‘So, the eye can see your normal path, but at the same time if you shift focus and look at a certain area on your glasses, you can see a separate displayed image.
‘You want to keep these paths separate,’ Gauvin continued. ‘The non-sequential issues require that each path is free of stray light so that you don’t get ghosted images and multiple scatter path problems, which cause degradation of the scene quality.’
By using software simulation, engineers can break down each path as it goes through the system, visualise that path, and see where and how much that path is contributing light to the target (the eye). ‘The designer has to be able to quantify each stray light problem... by assessing each stray light path in their existing systems, and then moving, blocking, coating or painting both optical and non-optical components to remove problematic paths,’ said Gauvin.
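In spirit, that bookkeeping amounts to tabulating the flux each path delivers to the eye and flagging any stray path above a visibility threshold. The toy version below uses entirely hypothetical path names, flux values and a 1% threshold; a real analysis would take these numbers from a non-sequential ray trace.

```python
# Hypothetical flux each ray path delivers to the eye, in watts.
paths = {
    "display -> combiner -> eye (signal)":  1.0e-4,
    "sun -> lens edge scatter -> eye":      3.0e-6,
    "display -> housing reflection -> eye": 8.0e-7,
}

signal = paths["display -> combiner -> eye (signal)"]
THRESHOLD = 0.01  # stray light above 1% of the signal risks visible ghosts

for name, flux in paths.items():
    if "signal" in name:
        continue
    ratio = flux / signal
    verdict = "fix: move/block/coat/paint" if ratio > THRESHOLD else "acceptable"
    print(f"{name:40s} {ratio:7.2%}  {verdict}")
```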
Another challenge that comes with including the outside world in AR is ambient lighting. Unlike fully enclosed VR devices, AR systems are exposed to stray light sources such as the sun or indoor lighting, which can affect how they work. And, for the device to be wearable in everyday life, the display needs to be bright enough to work outdoors.
Augmented reality has to be synchronised perfectly with the real world in order to deceive the human eye and make virtual objects appear as if they really existed in the scene. Developers must ensure that the physical and virtual images ‘align’, and that the eye is able to focus on both scenes. ‘The designer has to be careful that the see-through image of the outside world is not distorted,’ said Rogers at Synopsys. ‘If there is a mapping error with the image being projected from the display into the eye... then that distortion can be removed by changing or remapping the image at the display. But the outside world is not like that – if that image is distorted, then it is permanent in a sense; you cannot fix it in the software,’ Rogers added.
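The display-side remapping Rogers mentions is, in essence, a pre-distortion of the rendered image. A minimal one-coefficient radial sketch is given below; the coefficient k1 is hypothetical, and in practice the full mapping would be derived from a ray trace of the actual optic.

```python
def predistort(u, v, k1=-0.1):
    """Scale normalised display coordinates (u, v) radially so that the
    optic's opposite distortion cancels and the pixel is seen where
    intended. One-term radial model with an illustrative coefficient."""
    r2 = u * u + v * v
    s = 1.0 + k1 * r2
    return u * s, v * s

print(predistort(0.0, 0.0))  # the centre pixel is unchanged
print(predistort(0.8, 0.6))  # edge pixels are pulled inward for k1 < 0
```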
Small and stylish
Although VR and AR devices have only existed in the consumer sector for the last few years, the technology descends from heads-up displays (HUDs) that have existed for decades. These systems were initially developed so that pilots could view information while looking forward, rather than down at the control panel.
HUD AR systems have also been developed for use in vehicles, to provide drivers with data without having to change their viewpoint, maximising the time their eyes are on the road. These systems display virtual information about speed, traffic, and routes, either directly onto, or in front of, the windshield. Zemax’s modelling software, OpticStudio, has been used to design these types of systems (see panel ‘Heads-up displays in cars’).
But HUDs used by pilots, particularly the early models, were huge, bulky helmets that bore little resemblance to today’s Google Glass and Oculus Rift. As with most products aimed at consumers, manufacturers are constantly trying to make them smaller and more stylish.
‘This is the challenge for designers creating the next Google Glass, or Oculus Rift device, where you only have to wear a set of goggles, as opposed to a 35-pound heads-up display that pilots wore in the past,’ said Gauvin. ‘So we have dropped from 35 pounds to something that is a little over one pound that sits on your face.
‘The systems, particularly goggles, need to be lightweight – people have to wear these goggles on their heads and if the weight is significant it can cause headaches and neck fatigue, so keeping the optics as simple as possible is a priority,’ Gauvin added.
‘Everything... in the design [of VR and AR systems] comes from companies trying to make the device lighter, less expensive.’
As well as decreasing the size of systems, developers are working to improve their features, a notable example being field of view. Most AR headsets on the market today only contain a small window through which the person views the augmented scene – if the eyes drift outside this window, the scene is no longer in view. In an ideal world, AR developers would have a field of view of around 180 degrees, the same as a human’s.
‘Almost everyone wants to have a wide field of view,’ commented Synopsys’ Rogers. ‘If you trace backwards and you treat the eye like a searchlight that swings around and sends light in all directions, rather than as a detector, it produces a large cone of rays that leaves the eye. [In order to have a wide field of view] all of those rays need to be collected somehow and brought back to the detector, which usually leads to very large optics. So, it is a challenge [to keep] the system small and lightweight,’ he said.
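The geometry behind Rogers’ point can be sketched directly: with the eye treated as a searchlight, the first optic needs a semi-diameter of at least eye relief × tan(half field) plus the pupil radius to catch the whole cone of rays. The eye relief and pupil size below are illustrative assumptions, not figures from any real headset.

```python
import math

def required_semi_aperture(fov_deg, eye_relief_mm=15.0, pupil_radius_mm=2.0):
    """Smallest semi-diameter of the first optic that still catches the
    whole cone of rays leaving the eye. Eye relief and pupil radius are
    illustrative example values."""
    half = math.radians(fov_deg / 2.0)
    return eye_relief_mm * math.tan(half) + pupil_radius_mm

for fov in (40, 90, 140):
    print(f"{fov:3d} deg field -> optic semi-diameter >= "
          f"{required_semi_aperture(fov):6.1f} mm")
# The requirement grows without bound as the field approaches 180 degrees,
# which is why a human-like field of view forces very large (or folded) optics.
```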
Freeform optics are helping VR developers reduce the size and weight of their devices; designers are able to fold the optical path and use fewer optical components, which reduces the size and weight of the optical assemblies used. ‘In VR headsets, all of the optics – the lenses and prisms – are very, very close together, and you don’t have a lot of distance to work with, so freeform optics allow you to compact the whole system,’ Gauvin noted.
‘Using optical design programmes – such as [Lambda Research’s] Oslo – you can design something that goes off-axis and creates two different paths for this particular scenario, and then allows you to combine both paths with a very small focal length in a small package,’ he added.
Other manufacturers are using more modern components, such as holographic optics or miniature reflectors, to miniaturise their systems. ‘The leaders in the VR area are trying some very interesting things to keep their systems small – Synopsys’ Code V software can model all of these designs,’ Rogers said. ‘The software shows what happens when you try new, aggressive ideas or ways to decrease the size of the system.’
Developers can add functionality into a hologram, which is harder to do with bulk optics, explained Rogers: ‘With holographic optics, you can essentially program the hologram to do almost what you want.’
An example of this type of optic designed for use in AR systems is a holographic waveguide developed by the adaptive optics team at the UK National Physical Laboratory (NPL). The patented technology is currently being sold to developers of AR systems by UK company TruLife Optics.
The optic offers several features for AR devices: images can be displayed in high definition, full colour, in perfect focus and potentially in 3D through the centre of a field of vision. Importantly, the image is transparent, allowing for the overlay of information on whatever subject is being viewed. The optic is lightweight, less than 2mm thick, and can be easily mass-produced for consumer and industrial applications.
The product consists of a glass waveguide, approximately 10cm long, 3cm wide and 2.8mm in thickness, which contains two postage stamp-sized holograms. The light is transmitted into the first hologram and then turned 90 degrees through the length of the waveguide, via total internal reflection, before hitting the second hologram and being turned a further 90 degrees so it is projected into the human eye.
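The total internal reflection that carries light along the guide only works above the critical angle of the glass–air boundary. The check below assumes a typical glass index of about 1.5; the actual glass used in the waveguide is not specified in the source.

```python
import math

def critical_angle_deg(n_glass=1.5, n_air=1.0):
    """Critical angle for total internal reflection at a glass-air surface."""
    return math.degrees(math.asin(n_air / n_glass))

# A hologram that folds the beam 90 degrees sends it down the guide at
# 45 degrees to the surfaces, safely above the critical angle for n = 1.5.
theta_c = critical_angle_deg()
print(f"critical angle: {theta_c:.1f} deg; 45 deg ray trapped: {45 > theta_c}")
```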
Microsoft’s HoloLens contains a see-through holographic lens that uses an optical projection system to create multi-dimensional full-colour holograms within an AR scene. The company has said that the HoloLens will take AR to the next level, by creating high-definition virtual 3D models of objects that appear either as part of real surroundings, or combine to make up entirely new ones.
Designing with these types of optics can pose difficulties, according to Rogers: ‘Holograms send light into various different orders. The order that you want might be doing the right thing, but a small fraction of light goes into an unwanted order, and you have to determine whether this results in a ghost image, stray light, or something else.
‘Because you have to track where the unwanted light goes in some of these modern image-forming devices, they are more challenging to design.’
With big players in the VR and AR space shipping headsets to thousands of consumers across the globe, commentators are predicting that 2016 will be a big year for virtual and augmented reality. And the growth is expected to continue, according to Gauvin, driven by gaming and entertainment applications.
‘The gaming industry keeps growing and growing with new software and users continually requesting better graphics resolution and full immersion into these programs... I am sure this trend will continue into the future,’ Gauvin concluded.
Heads-up displays in cars
Automotive manufacturers are investigating using heads-up displays in cars to project a virtual dashboard on the windscreen. The idea is to maximise the time the driver’s eyes are on the road.
The optical design of these types of virtual displays can be modelled in OpticStudio from Zemax. One important function of the software is to enable optical engineers to design smaller systems, which is ultimately achieved by adding lenses and folding the optical path with mirrors.
Distances from the driver to the virtual image and from the driver to the windscreen are used in the calculations. The software is also used to design out any double image effects present in the virtual display. For example, BMW corrects for the double image using a plastic wedge within the windscreen to align the two images. The angle of the wedge is changed until the deviation of the two images is zero.
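The wedge adjustment can be sketched with a crude small-angle model: a wedge of angle α rotates the second-surface reflection by roughly 2nα, so sweeping α until the two images coincide mimics the tuning described above. The baseline ghost separation and refractive index below are illustrative assumptions, not BMW's actual figures.

```python
def ghost_separation_mrad(wedge_mrad, base_ghost_mrad=3.0, n=1.52):
    """Residual angular separation of the two reflected images in a crude
    small-angle model: the wedge tilts the second-surface reflection by
    about 2*n*alpha. Baseline separation and index are illustrative."""
    return base_ghost_mrad - 2.0 * n * wedge_mrad

# Sweep the wedge angle until the deviation between the two images is zero.
wedge = 0.0
while abs(ghost_separation_mrad(wedge)) > 0.01:
    wedge += 0.001
print(f"a wedge of ~{wedge:.3f} mrad brings the two images into line")
```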