Heightened reality: VR experts give optics wish list at Photonics West

Greg Blackman reports from a Photonics West panel discussion on virtual reality, where headset optical design will be instrumental in delivering a better user experience

What will wearing the next generation of virtual reality (VR) headsets be like? Devices like Oculus Rift and HTC Vive have brought VR to the masses, but advances in hardware, not least the optical design, are going to determine to a large extent the kind of graphics content that can run on these headsets and ultimately the user experience.

There’s still work to be done on the optical science side of VR, Scott McEldowney, lead optics researcher at Oculus, commented during a panel discussion at SPIE Photonics West, the photonics trade fair held at the end of January in San Francisco.

‘Getting photons to the right places and our retinas at the right time is going to be key,’ he said.

McEldowney said it’s not unrealistic to think that, in the next three to five years, VR headsets could incorporate 4K resolution displays. The question to ask then, he said, is how best to use all those pixels.

An HD mobile phone screen has a pixel density of around 500 pixels per inch. Amazon’s Leo Baldwin, who moderated the session, noted in his introduction that such a screen might look fantastic in your hand 20 inches away, but held up against your face it will still show visible pixels. That is partly a problem of display technology, but also of the optical system.

‘It’s about the pixel density you can see in your eye,’ McEldowney explained. ‘We have to think about pixels per degree.’

Current VR systems have around a 90-degree field of view, which gives about 15 pixels per degree. The human visual system, according to McEldowney, can resolve 120 pixels per degree.

‘If we just want to solve this with pixels, we’re going to have to get 10 times better than we are today, and that’s not going to happen any time soon,’ he said.

Pixels per degree is a function of both pixel density and field of view. With more pixels available, is it better to increase the resolution or to widen the field of view? That’s a key question, McEldowney said, adding that it will probably be a combination of both.
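As a rough illustration of that trade-off, the arithmetic can be sketched as below; the pixel counts and field-of-view values are illustrative assumptions, not the specifications of any particular headset.

```python
# Back-of-the-envelope angular resolution for a single-eye VR display.
# All numbers are illustrative assumptions, not real headset specifications.

def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Average angular resolution: pixels spread across the field of view."""
    return horizontal_pixels / horizontal_fov_deg

# Roughly the situation quoted in the article: a 90-degree field of view
# giving about 15 pixels per degree.
print(pixels_per_degree(1350, 90))    # 15.0 ppd

# A hypothetical 4K-class panel per eye: spend the extra pixels on resolution...
print(pixels_per_degree(3840, 90))    # ~42.7 ppd
# ...or on a wider field of view.
print(pixels_per_degree(3840, 140))   # ~27.4 ppd

# Either way, both options remain well short of the 120 pixels per degree
# McEldowney cites for the human visual system.
```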

However, he said that, from an optical standpoint, getting beyond a 90-degree field of view in such a small form factor is not a trivial thing. ‘I’m pretty convinced that there are emerging design forms [of optics] that will allow us to get 100-degree fields of view,’ he commented.

During the panel discussion, McEldowney listed some of the criteria VR system designers would ideally want from the optics. The first would be a wider field of view: the human visual system can see 220 degrees, while VR headsets today are at around 90 degrees. Secondly, to reach higher resolutions the optics must maintain a high modulation transfer function (MTF) at those resolutions, which, on top of the wide field of view, is going to be ‘extremely challenging’, McEldowney remarked.

Current VR systems are only 8-bit, but the human eye can see 20 bits, a gap put in perspective in the sketch below. Stray light artefacts will also have to be considered. In addition, these systems currently use fixed lenses, whereas the human eye is a dynamic focusing element – ‘clearly you would want to have something like that,’ McEldowney said.

The final point McEldowney made concerned ergonomics and being able to wear these devices for hours at a time.
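To put the bit-depth figures mentioned above in perspective, a back-of-the-envelope sketch follows; equating bits directly with distinguishable intensity levels is a simplification.

```python
# Distinct intensity levels that a given bit depth can encode.
# Simplified: perceived dynamic range also depends on contrast, adaptation
# and display gamma, but the scale of the gap is clear.
for bits in (8, 10, 20):
    print(f"{bits}-bit: {2**bits:,} levels")
# 8-bit:  256 levels        (typical of current VR displays, per the article)
# 10-bit: 1,024 levels
# 20-bit: 1,048,576 levels  (the figure McEldowney attributes to the eye)
```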

McEldowney added that VR companies will ultimately have to build prototypes to find out what optical design gives the best user experience.

The optics used in VR are not simple. A headset might contain just one lens, but McEldowney described the optics as a ‘unique compromise between a lot of different factors’. Even so, one of the ways to get the cost of the system down is to use inexpensive optics. Frank Black, director of B2B sales at HTC Vive, commented during the panel discussion that four years ago a typical VR system with a Zeiss lens cost $40,000. Current systems are priced at around $700, and this has been achieved by reducing the cost of the lenses and employing shader technology to optimise the graphics pipeline, Black said.

Display technology

Whether it’s VR or AR, the displays will have to get better, according to McEldowney. ‘As an optical designer, the display I want really isn’t available,’ he said.

McEldowney would like to see a display that’s 30-40mm in diagonal, with pixel densities closer to those of microdisplays than of mobile phone screens. Microdisplays themselves are too small, he said, adding that he would ideally like a larger, consumer-cost version of a microdisplay.

Along with optical design and new displays, clever techniques like foveated rendering – rendering the region of the screen the eye is focused on at full detail and the periphery at reduced detail, rather than every pixel at full quality – will come into play to make VR more realistic. Foveated rendering, however, needs eye tracking technology.
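The principle behind foveated rendering can be sketched in a few lines; the region boundaries and resolution scales below are made-up parameters for illustration, not those of any shipping renderer.

```python
# Illustrative foveated-rendering budget: render at full resolution close to
# the gaze point reported by an eye tracker, and progressively coarser further
# out. Thresholds and scale factors are arbitrary illustrative choices.

def resolution_scale(angle_from_gaze_deg: float) -> float:
    """Fraction of full resolution to use at a given angular offset from gaze."""
    if angle_from_gaze_deg < 5:       # foveal region: full detail
        return 1.0
    elif angle_from_gaze_deg < 20:    # near periphery: reduced detail
        return 0.5
    else:                             # far periphery: coarse detail
        return 0.25

for angle in (2, 10, 40):
    print(f"{angle:>2} deg from gaze -> render at {resolution_scale(angle):.0%} of full resolution")
```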

Black at HTC Vive said foveated rendering is a good example of using existing compute power to improve the user experience.

Eye tracking is also going to be key, McEldowney added, for implementing dynamic focus lenses similar to the focusing found in the eye. Dynamic focus would improve comfort and experience, but what sort of technology will deliver it is still an open question – will it be holographic, a multi-planar system, or a variable focus system? McEldowney asked.
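One way to picture how eye tracking could feed a dynamic focus element, whatever the underlying optics turn out to be, is to estimate where the eyes are converging and translate that distance into a focus demand. The following is a conceptual sketch; the interpupillary distance and vergence angles are assumed values.

```python
import math

# Conceptual sketch: infer fixation distance from the vergence angle reported
# by an eye tracker, then express the focus demand in dioptres. The 63 mm
# interpupillary distance is an assumed average, not a measured value.

IPD_METRES = 0.063

def fixation_distance_m(vergence_angle_deg: float) -> float:
    """Distance at which the two gaze rays cross, given the vergence angle."""
    half_angle = math.radians(vergence_angle_deg) / 2
    return (IPD_METRES / 2) / math.tan(half_angle)

def focus_demand_dioptres(distance_m: float) -> float:
    """Optical power a variable-focus element would need to target."""
    return 1.0 / distance_m

for angle in (7.2, 3.6, 1.2):  # converging on near, mid and far virtual objects
    d = fixation_distance_m(angle)
    print(f"vergence {angle} deg -> fixation ~{d:.2f} m -> {focus_demand_dioptres(d):.2f} D")
```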

VR had a strong presence at Photonics West, often as a marketing tool to give trade fair visitors a virtual tour of a company’s products, but there were also enabling technologies on display. One of the finalists of the Startup Challenge, where early-stage entrepreneurs pitch their business to a team of judges, was TriLite Technologies, which had developed a laser light module for AR and VR applications.

To a certain extent, the content creation for VR will depend on the hardware available, but McEldowney observed that to get a good user experience, it’s not just the optics that need to be taken into account, but the display, tracking system, the graphics pipeline, and calibration.

‘When you think about the optics for the lenses, you really can’t just think about them as a lens issue, you have to think about it as a system issue,’ McEldowney said.

‘The only way to know if you are making the experience better is to build a system and actually experience it,’ he remarked, adding that turning emerging technologies into prototypes and building systems as quickly as possible is going to be key to the adoption of many of these new technologies.

Related articles:

Enlightening reality: Jessica Rowbury looks at how optical modelling is aiding the development of VR and AR systems
