New optical component enables AR/VR glasses without ‘bug eye’ effect


A metaform is a new optical component that can combine with freeform optics to enable compact AR/VR headsets and eyewear. (Image: University of Rochester illustration/Michael Osadciw)


Researchers have combined freeform optics and a metasurface to deliver high-quality AR/VR glasses that are compact and easy to wear

The demand for high-resolution optical systems with a compact form factor, such as augmented/virtual reality (AR/VR) displays, sensors and mobile cameras, requires the creation of new optical component architectures [1].

Researchers from the University of Rochester’s Institute of Optics have therefore developed a novel technology that could help deliver compact and easy-to-wear glasses for AR/VR. Such attributes are sought after by consumers; the researchers’ technology enables optics that deliver high-quality imagery while avoiding glasses with a ‘bug eye’ look.

Reporting their work in Science Advances, the researchers’ paper describes imprinting freeform optics with a nanophotonic optical element called a metasurface. The metasurface is a veritable forest of tiny, silver, nanoscale structures on a thin metallic film that conforms, in this advance, to the freeform shape of the optics – realising a new optical component the researchers call a ‘metaform’. 

The metaform is able to defy conventional laws of reflection, gathering the visible light rays entering an AR/VR eyepiece from all directions, and redirecting them directly into the human eye. 
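
This ‘defiance’ of the conventional law of reflection can be illustrated with the generalised law of reflection for metasurfaces, in which an engineered phase gradient along the surface steers the reflected beam away from the mirror angle. The following sketch uses an illustrative phase gradient and wavelength, not the published metaform design:

```python
import numpy as np

# Generalised law of reflection for a phase-gradient metasurface:
#   sin(theta_r) = sin(theta_i) + (lambda / (2*pi*n)) * dPhi/dx
wavelength = 550e-9          # green light, metres
n = 1.0                      # reflection in air
theta_i = np.deg2rad(30.0)   # angle of incidence

# Engineered phase gradient along the surface (rad/m); illustrative value:
# a full 2*pi phase ramp every 5 micrometres
dphi_dx = 2 * np.pi / 5e-6

sin_theta_r = np.sin(theta_i) + wavelength / (2 * np.pi * n) * dphi_dx
theta_r = np.rad2deg(np.arcsin(sin_theta_r))

print(f"ordinary reflection: 30.0 deg, anomalous reflection: {theta_r:.1f} deg")
```

With the gradient term set to zero the formula reduces to the ordinary mirror law; a non-zero gradient redirects light that would otherwise miss the eye.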

Nick Vamivakas, one of the paper’s authors and a professor of quantum optics and quantum physics, likened the nanoscale structures to small-scale radio antennas. ‘When we actuate the device and illuminate it with the right wavelength, all of these antennas start oscillating, radiating a new light that delivers the image we want downstream.’ 

Jannick Rolland, another author of the paper, a professor of optical engineering and the director of the university’s Center for Freeform Optics, added: ‘Metasurfaces are also called “flat optics” so writing metasurfaces on freeform optics is creating an entirely new type of optical component. This kind of optical component can be applied to any mirrors or lenses, so we are already finding applications in other types of components, such as sensors and mobile cameras.’

Why weren’t freeform optics enough?

The first demonstration of the technology required many years to complete, according to the researchers. The goal, of course, is to direct the visible light entering the AR/VR glasses to the eye, with the new device using a free-space optical combiner to help achieve this. However, when the combiner is part of freeform optics that curve around the head to conform to an eyeglass format, not all of the light is directed to the eye. Freeform optics alone, therefore, could not solve this specific challenge.

This is why the researchers had to leverage a metasurface to build a new optical component. 

‘Integrating these technologies, freeform and metasurfaces, understanding how both of them interact with light, and leveraging that to get a good image was a major challenge,’ said lead author Daniel Nikolov, an optical engineer in Rolland’s group.

Fabrication challenges

Another obstacle was bridging from macroscale to nanoscale. The actual focusing device measures about 2.5mm across, but even that is 10,000 times larger than the smallest of the nanostructures imprinted on the freeform optic.

‘From a design standpoint, that meant changing the shape of the freeform lens and distributing the nanostructures on the lens in a way that the two of them work in synergy, so you get an optical device with a good optical performance,’ said Nikolov. 

This required circumventing the inability of optical design software to directly specify metasurfaces; the researchers instead had to combine several different programs to arrive at an integrated metaform device.

Nikolov said fabrication was daunting. It required electron-beam lithography, in which beams of electrons were used to cut away sections of the thin-film metasurface where the silver nanostructures needed to be deposited. Writing with electron beams on curved freeform surfaces is atypical, and required the development of new fabrication processes.

The researchers used a JEOL electron-beam lithography machine at the University of Michigan’s Lurie Nanofabrication Facility. To write the metasurfaces on a curved freeform optic they first created a 3D map of the freeform surface using a laser-probe measuring system. The 3D map was then programmed into the JEOL to specify at what height each of the nanostructures needed to be fabricated. 
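
The idea of feeding a measured surface map to the lithography tool can be sketched as follows. This is a hypothetical illustration: the sag function below is an invented freeform polynomial, not the published design, and the grid spacing is arbitrary.

```python
import numpy as np

# Hypothetical illustration: turn a measured freeform surface into a height
# map so an e-beam tool can adjust its focus at each write site.
def freeform_sag(x, y):
    """Surface height (m) at lateral position (x, y), in metres.
    Invented polynomial: a base curvature plus a freeform (non-rotationally
    symmetric) term."""
    return 1e-4 * (x**2 + y**2) + 5e-3 * x**2 * y

# Grid of write fields spanning the ~2.5 mm device aperture
coords = np.linspace(-1.25e-3, 1.25e-3, 51)
X, Y = np.meshgrid(coords, coords)
height_map = freeform_sag(X, Y)

# Each entry gives the surface height at one write field, analogous to the
# 3D map programmed into the lithography machine.
print(height_map.shape)
```

In the actual process the map came from a laser-probe measurement of the real surface rather than an analytic formula.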

‘We were pushing the capabilities of the machine,’ Nikolov said. 

Successful fabrication was achieved after multiple iterations of the process. 

‘This is a dream come true,’ said Rolland. ‘This required integrated teamwork where every contribution was critical to the success of this project.’

Reference

[1] D. Nikolov et al., ‘Metaform optics: Bridging nanophotonics and free-form optics’, Science Advances, Vol. 7, No. 18 (2021). DOI: 10.1126/sciadv.abe5112

--

Lidar-enabled AR heads-up display could improve road safety

Researchers from the universities of Cambridge, Oxford and University College London (UCL) have developed a lidar-based augmented reality (AR) heads-up display for use in vehicles.

Tests on a prototype version of the technology suggest that it could improve road safety by ‘seeing through’ objects to alert drivers to potential hazards without distracting them.

The technology uses lidar data to create ultra-high-definition holographic representations of road objects which are beamed directly to the driver’s eyes, instead of the 2D windscreen projections used in most head-up displays. 

While the technology has not yet been tested in a car, early tests, based on data collected from a busy street in central London, showed that the holographic images appear in the driver’s field of view according to their actual position, creating an augmented reality. This could be particularly useful where objects such as road signs are hidden by large trees or trucks, for example, allowing the driver to ‘see through’ visual obstructions. The results are reported in the journal Optics Express.

‘Head-up displays are being incorporated into connected vehicles, and usually project information such as speed or fuel levels directly onto the windscreen in front of the driver, who must keep their eyes on the road,’ said lead author Jana Skirnewskaja, a PhD candidate from Cambridge’s Department of Engineering. ‘However, we wanted to go a step further by representing real objects as panoramic 3D projections.’ 

The researchers scanned Malet Street in central London by sending out millions of lidar pulses from multiple positions on the street. The lidar data from each position was then combined into a single point cloud, building up a 3D model.
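
Stitching scans taken from several positions into one scene amounts to transforming each scan by its scanner pose and merging the points. A minimal sketch, with invented poses and toy data rather than the Malet Street dataset:

```python
import numpy as np

# Assumed workflow: merge lidar scans from several scanner positions into one
# point cloud by applying each scanner pose (rotation R, translation t).
def stitch_scans(scans, poses):
    """scans: list of (N_i, 3) arrays of points in scanner coordinates.
    poses: list of (R, t) pairs, R a 3x3 rotation, t a 3-vector, mapping
    scanner coordinates into a shared world frame."""
    world_points = [pts @ R.T + t for pts, (R, t) in zip(scans, poses)]
    return np.vstack(world_points)

# Two toy scans from scanners placed 10 m apart along the street
scan_a = np.random.rand(1000, 3)
scan_b = np.random.rand(800, 3)
identity = np.eye(3)
cloud = stitch_scans(
    [scan_a, scan_b],
    [(identity, np.zeros(3)), (identity, np.array([10.0, 0.0, 0.0]))],
)
print(cloud.shape)  # combined point cloud, (1800, 3)
```

In practice the poses would come from surveyed scanner positions or a registration algorithm rather than being known exactly.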

Image based on lidar data (left), converted to a hologram (right).

(Image: J. Skirnewskaja et al. / University of Cambridge)

‘This way, we can stitch the scans together, building a whole scene, which doesn’t only capture trees, but cars, trucks, people, signs and everything else you would see on a typical city street,’ said co-author Phil Wilkes, a geographer who normally uses lidar to scan tropical forests.

‘Although the data we captured was from a stationary platform, it’s similar to the sensors that will be in the next generation of autonomous or semi-autonomous vehicles.’ 

When the 3D model of Malet Street was completed, the researchers then transformed various objects on the street into holographic projections. The point cloud data was processed by separation algorithms to identify and extract the target objects. 

Another algorithm was used to convert the target objects into computer-generated diffraction patterns. These data points were implemented into the optical setup to project 3D holographic objects into the driver’s field of view.
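
The paper does not name the algorithm here, but a common way to compute a phase-only diffraction pattern for a spatial light modulator from a target image is the Gerchberg-Saxton iteration, sketched below on a toy target:

```python
import numpy as np

# Generic Gerchberg-Saxton sketch (not the authors' code): iterate between
# the hologram plane and the image plane, imposing the target amplitude in
# the image plane and keeping only the phase in the hologram plane.
def gerchberg_saxton(target_amplitude, iterations=20):
    rng = np.random.default_rng(0)
    phase = rng.uniform(0, 2 * np.pi, target_amplitude.shape)
    for _ in range(iterations):
        # Propagate a unit-amplitude, phase-modulated field to the image plane
        field = np.fft.fft2(np.exp(1j * phase))
        # Impose the target amplitude, keep the propagated phase
        field = target_amplitude * np.exp(1j * np.angle(field))
        # Propagate back; the SLM is phase-only, so keep only the phase
        phase = np.angle(np.fft.ifft2(field))
    return phase  # phase pattern to display on the modulator

target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0        # toy target: a bright square
hologram = gerchberg_saxton(target)
print(hologram.shape)
```

Displaying the resulting phase pattern on an illuminated phase modulator reconstructs an approximation of the target in the far field; the researchers’ setup additionally layers such holograms at positions matching the real objects.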

The optical setup is capable of projecting multiple layers of holograms with the help of advanced algorithms. The holographic projection can appear at different sizes and is aligned with the position of the represented real object on the street. For example, a hidden street sign would appear as a holographic projection relative to its actual position behind the obstruction, acting as an alert mechanism.

The researchers are now working to miniaturise the optical components used in their holographic setup so they can fit into a car. Once the setup is complete, vehicle tests on public roads in Cambridge will be carried out.

--

Featured product: LCOS spatial light modulators for colour-sequential holographic applications, from Holoeye

Holoeye offers two fast spatial light modulator (SLM) models optimised for colour-sequential phase operation. The devices are capable of addressing 3 x 8 bits within a frame (180Hz) using fast display versions for the visible range. The SLMs also feature an RGB light source sync connector for use with RGB colour-switchable laser sources. The Leto-3 SLM has a resolution of 1,920 x 1,080 pixels with a 6.4μm pixel pitch. The standard driver unit measures only 97 x 80 x 19mm, and the SLM can conveniently be integrated into optical setups.

The Luna SLM is based on a 0.39-inch LCOS microdisplay with a resolution of 1,920 x 1,080 pixels and a 4.5μm pixel pitch. The driver ASIC is embedded in the LCOS microdisplay itself, saving board space and enabling a very compact driver.

The display can also accept video data input via a 4-lane MIPI DSI interface, enabling even more compact electronics for industrial implementations.

For more information visit: https://holoeye.com/spatial-light-modulators