
Art enlightened

As sensors and lasers become more freely available to artists, many have jumped at the chance to make their art interactive. Thanks to devices such as consumer-priced depth sensors, there is now a plethora of installations and entertainment venues where members of the public can use their hands, arms and legs to manipulate lights and lasers and make pretty shapes.

But for some artists, ‘pretty’ just isn’t enough, and they want to push the available technology to realise their artistic ideas. When Christian Bauder from German design studio WhiteVoid was commissioned by car company Hyundai to create an installation at a car show, he knew he wanted to devise something 3D and interactive. The brief was to create something that reflected the car company’s fluid design process, so WhiteVoid built Fluidic: a seemingly random 3D distribution of spheres lit up by lasers.

‘There has been a lot of work done using projection technology, but we wanted to create a 3D piece of art using light,’ said Bauder. ‘In order to do this we had to use lasers because they can focus at different depths. We wanted to create a random 3D point cloud that would have the same density when viewed from any angle.’

To create the 3D pixels, Bauder and his colleagues found a bead manufacturer who was able to make solid spheres from a polymer with the ideal refractive index, so that when a laser hit a sphere, the light would scatter throughout its interior and not only off its surface. ‘The spheres were not allowed to contain any bubbles,’ explained Bauder, ‘because the bubbles would disturb the refraction.’

Bauder and his colleagues used 12,000 of these spheres, each around four centimetres in diameter, to create a huge 3D cloud measuring around 12 metres across. Then came the challenge of ensuring that each sphere could be illuminated by a laser at the right time. ‘We knew that each sphere needed to have a clear path to a laser without any other spheres blocking its path,’ said Bauder. ‘Initially we thought we could create a random cloud of spheres, scan it and then position the lasers accordingly, but we soon realised this would not work. So we had to calculate where each sphere had to be and position each sphere individually.’
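Bauder does not describe the placement algorithm itself, but the constraint he mentions lends itself to rejection sampling: propose a random position and keep it only if a straight beam from at least one laser can reach it without grazing an already-placed sphere. The Python sketch below is a hypothetical illustration of that idea only – the laser positions, clearance margin and function names are all assumptions, not WhiteVoid’s code.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def beam_is_clear(target, laser, placed, radius, margin=0.01):
    """True if a straight beam from `laser` to `target` misses every
    already-placed sphere, with a small safety margin."""
    d = target - laser
    length = np.linalg.norm(d)
    d = d / length
    for centre in placed:
        t = np.dot(centre - laser, d)            # beam parameter of closest approach
        if 0.0 < t < length:
            miss = np.linalg.norm(laser + t * d - centre)
            if miss < radius + margin:           # beam would clip this sphere
                return False
    return True

def place_spheres(n, lasers, radius=0.02, box=12.0, max_tries=500_000):
    """Rejection-sample sphere centres so each new sphere can be reached
    by at least one laser along an unobstructed straight line."""
    placed = []
    tries = 0
    while len(placed) < n and tries < max_tries:
        tries += 1
        candidate = rng.uniform(0.0, box, size=3)
        if any(beam_is_clear(candidate, laser, placed, radius) for laser in lasers):
            placed.append(candidate)
    return np.array(placed)
```

A production solver would also have to check that each newly placed sphere does not occlude the beam already assigned to an earlier one, and record which laser serves which sphere; the sketch captures only the forward line-of-sight test Bauder describes.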

The high-power laser projectors were supplied by German company LaserAnimation Sollinger. Michael Sollinger, managing director of the company, said: ‘Ensuring that the lasers hit the spheres accurately was essential not only for artistic reasons, but also for laser safety reasons – we could not have the laser missing a sphere and hitting a visitor. This was an extremely rewarding project to work on – we learned a lot about what our systems are capable of and the project pushed us to do things we had never tried before.’

Another artist has gone a step further. Dr David Glowacki, who describes himself as a scientist, artist and cultural theorist, conceived an art installation that has resulted in advances now enabling new work in science and engineering.

Dr Glowacki is a Royal Society Research Fellow based at the University of Bristol; his research covers a variety of fields including classical and quantum dynamics, biochemistry, computer programming, atmospheric chemistry, scientific instrument development, optics and spectroscopy.

It was while attempting to explain his work to friends and family that he realised molecular dynamics uses a number of metaphors and expressions that are also used in dance. This basic realisation turned into a multi-award-winning interactive art installation that has toured the world and has been experienced by more than 100,000 people.

Called Danceroom Spectroscopy, the concept uses a depth sensor array and rigorous physics to interpret people as energy fields. This enables dancers, children and members of the public to interact with simulations of molecules and gain an insight into the beauty of this microscopic world.

‘We use an array of consumer-priced infrared depth sensors that utilise structured light to carry out real-time 3D imaging,’ explained Glowacki. ‘We combine this with software that interprets the human form as an energy landscape; this provides an interactive interface for embedding users in a molecular simulation which responds to the real-time motion of their fields.’
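The article does not include the implementation, but the pipeline Glowacki describes – a depth frame reinterpreted as an energy landscape that exerts forces on simulated particles – can be sketched in a few lines. Everything below (the millimetre depth band, the smoothing width and the function names) is an illustrative assumption rather than the Danceroom Spectroscopy source code.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def depth_to_energy_field(depth_mm, near=500.0, far=4000.0, sigma=8.0):
    """Turn one structured-light depth frame into a smooth 2D potential.
    Pixels where a body is detected (inside the near/far band) become
    high-energy regions; Gaussian smoothing gives the landscape the soft
    gradients the simulated particles respond to."""
    occupied = (depth_mm > near) & (depth_mm < far)
    return gaussian_filter(occupied.astype(np.float32), sigma)

def forces_on_particles(field, xy_pixels):
    """Each simulated particle feels the negative gradient of the field,
    so it is deflected by the energy regions the dancers create."""
    gy, gx = np.gradient(field)                   # per-pixel slope of the landscape
    cols = xy_pixels[:, 0].astype(int)            # xy_pixels: (N, 2) column/row coords
    rows = xy_pixels[:, 1].astype(int)
    return -np.stack([gx[rows, cols], gy[rows, cols]], axis=1)
```

Each simulation timestep would then add this external force to the ordinary inter-particle forces before integrating the equations of motion.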

These interactions with the system result in feedback with both a visual and an audio component – users can see and hear how their energy fields are interacting with the molecular simulation using large projection displays and speakers. Essentially, they can play with atoms and molecules using their whole bodies.

In order to make the simulations responsive, with no perceptible delay between a dancer’s movement and the displayed interaction, Glowacki and his team had to work hard to speed up the simulations. ‘We have to process huge amounts of data very quickly, and we have been able to do this because our system uses GPU-accelerated computing,’ said Glowacki. ‘This means the code runs over multiple cores.’
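The quote does not name the toolkit, but the speed-up pattern is a familiar one: replace per-particle loops with batched array operations that the GPU evaluates across thousands of cores at once. As a hedged illustration – CuPy is an assumed stand-in here, not the engine the team used – this is what an all-pairs Lennard-Jones force evaluation looks like in that style.

```python
import cupy as cp  # NumPy-compatible arrays that live on the GPU

def lj_forces(pos, eps=1.0, sigma=1.0):
    """All-pairs Lennard-Jones forces in one batched pass.
    pos: (N, 3) array of particle positions on the GPU."""
    disp = pos[:, None, :] - pos[None, :, :]             # (N, N, 3) displacements
    r2 = cp.sum(disp ** 2, axis=-1)
    r2 += cp.eye(len(pos))                               # dummy self-distance, avoids 0/0
    inv6 = (sigma * sigma / r2) ** 3
    mag = 24.0 * eps * (2.0 * inv6 * inv6 - inv6) / r2   # (-dU/dr)/r for each pair
    return cp.sum(mag[..., None] * disp, axis=1)         # (N, 3) net force per particle
```

Swapping `cupy` for `numpy` runs the same code on the CPU, which makes the gain easy to measure. The O(N²) memory of this toy version is the usual next bottleneck – real engines use neighbour lists – but the principle of one data-parallel expression instead of nested loops is what lets a frame’s worth of physics fit inside the latency budget described below.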

For the brain to perceive the motion as fluid, Glowacki and his colleagues knew they had to achieve a latency of no more than 33ms – roughly the duration of a single frame at 30 frames per second. The system actually achieves 17ms.

‘We realised that we needed to make the algorithms run very fast so that the dancers and their images could move in real time,’ said Glowacki. ‘This was a big challenge. We had to build a platform fusing the state-of-the-art in hardware and software to pull it off on something that was also portable.’

The work that Dr Glowacki and his colleagues did in order to speed up the algorithms has in turn enabled new work on molecular dynamics. Visualising molecular simulations is an indispensable tool that can accelerate research insights and improve communication between researchers. For example, using technology developed through Danceroom Spectroscopy, researchers can now manipulate a protein molecule using their own bodies or hands, choosing to fold it or stretch it and analysing how it behaves. Programming a simulation to perform these tasks takes a long time, but using this technology it can be done in real time.

Dr Glowacki said: ‘This is a fascinating example of where I think it’s fair to say that artistic aims pushed the boundaries of science and engineering.’

London-based design studio Marshmallow Laser Feast (MLF) has created many interactive installations using lasers, sensors and other technologies such as lidar and CT scans.

The company recently teamed up with pop star Will.i.am to create a giant musical stave that used cars to play the notes. Filmed on a disused runway outside Madrid, the ambitious project had the singer and the drivers hit an intricate sequence of lights and sensors on the tarmac in time to the music, creating a choreographed laser light show. A team of sixty-five technicians worked for more than seven days to build the rigging for the lights and 350 motion-sensitive lasers, creating what was essentially a giant laser harp with cars instead of fingers.

MLF has also used drones, lidar and CT scans to create a virtual reality experience in Grizedale Forest in the Lake District, UK, famous for its sculptures. The project, called In the Eyes of the Animal, takes users on a journey that allows them to fly above the forest canopy, come face-to-face with high-definition creatures and embody various animals as they traverse the Grizedale landscape.

Artistically interpreting the sensory world of the animals, Marshmallow Laser Feast built a real-time system that dynamically visualises precise lidar scans of the forest and CT scans of the animals. Binaural sound design deepens the audience’s sensory experience of the virtual environment by mimicking the natural perception of sound in space. Visitors were not only able to hear the animals’ environment through headphone-delivered audio, but could also ‘feel’ the sounds thanks to a wearable Sub Pac device that turns the audio vibrations into a tactile experience.
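The article does not describe MLF’s audio pipeline, but the core idea behind binaural sound is straightforward: the brain localises a source largely from small timing and loudness differences between the two ears. The Python sketch below is a deliberately crude illustration – the Woodworth delay model, the gain figure and the function name are assumptions, and production systems instead convolve the signal with measured head-related transfer functions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air
HEAD_RADIUS = 0.0875     # m, an average adult head

def binaural_pan(mono, sample_rate, azimuth_deg):
    """Crudely binauralise a mono signal by delaying and attenuating the
    ear facing away from the source. Returns a (left, right) pair."""
    az = np.radians(azimuth_deg)                       # positive = source to the right
    theta = abs(az)
    itd = HEAD_RADIUS * (theta + np.sin(theta)) / SPEED_OF_SOUND  # Woodworth model
    lag = int(round(itd * sample_rate))                # interaural time difference
    gain = 10.0 ** (-3.0 * np.sin(theta) / 20.0)       # ~3dB head shadow at 90 degrees
    far = gain * np.concatenate([np.zeros(lag), mono])[: len(mono)]
    return (far, mono) if az > 0 else (mono, far)
```

Calling `binaural_pan(signal, 48_000, 60)` would make the source appear roughly 60 degrees to the listener’s right.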
