ANALYSIS & OPINION

Photonics-guided robots to transform operating rooms, experts say

Smart surgical tools will significantly improve the accuracy of operations, according to speakers at the European Photonic Industry Consortium’s recent AGM, and photonics will play a key role. Matthew Dale reports 

Photonics and robotics technologies will be crucial for helping healthcare authorities around the globe cope with ageing populations, experts claimed at this year’s EPIC AGM, held on the High Tech Campus in Eindhoven last month.

By 2030, it is estimated that half of the global population will suffer from chronic diseases, which will account for 83 per cent of total healthcare spending, said Dr Jacques Souquet, founder of SuperSonic Imagine, in his keynote talk, adding that fast, accurate and reliable diagnostics will be crucial for addressing this growing strain on medicine.

Other keynote speakers noted that smart instruments and highly dexterous photonics-guided robots could soon become commonplace in the ‘hybrid operating room’, an environment where surgeons will work alongside real-time imaging technologies to perform highly precise, minimally invasive procedures.

Hybrid operating rooms enable surgeons to work alongside real-time imaging technologies to perform highly precise, minimally invasive procedures. (Credit: Philips)

Dr Benno Hendriks, a research fellow at Philips Research, introduced the concept of the hybrid operating room after pointing out similarities between modern surgical procedures and those carried out in the 1950s, when surgeons had to rely on their own eyesight and skill alone, regardless of how many images or diagnostic reports they had looked at beforehand. Philips is considered to be a pioneer of these hybrid operating rooms, having installed over 750 of them globally.

‘In the past, the practice in surgery was first to cut, then see, whereas now surgeons are using diagnostics to first see, then cut, guided by their own eyes,’ explained Hendriks in his keynote presentation. ‘Now, the trend is moving towards first seeing, then navigating using a range of sophisticated navigation tools to bring the surgeon to the area being treated, then before treating, acquiring feedback on what is being done, and then finally performing minimally invasive cutting.’

According to Hendriks, imaging equipment is now not only used in the diagnostic phase, but is also being brought into surgeries themselves. ‘This allows surgeons to see, analyse and navigate tissue, meaning they can then perform these minimal cuts,’ he commented.

Real-time solutions are already being used to diagnose problems inside the heart or blood vessels, using catheters integrated with x-ray guidance, ultrasound and other imaging technologies. This enables multiple perspectives of the operation to be viewed in real time to ensure maximum surgical precision.

Existing technologies such as endoscopy will aid the development of hybrid operating rooms, said Hendriks, but there is a need for new instruments that can enable tissue tracking. ‘It’s not just enough to know the movements of the body, but also the movements of the components inside the body.’

Hendriks added that when using catheters, doctors can ‘follow the blood vessels like roads.’ However, performing surgery outside of the blood vessels can be difficult, as the surgeon has to go ‘off-road.’

The solution to this lies in the development of smart instruments that can ‘read tissues with light’, according to Hendriks. A ‘smart’ biopsy needle with an integrated spectrometer was shown as a way of targeting specific lesions inside the body. ‘The instrument uses a light source coupled with a fibre to carry the light through the instrument to the device tip,’ Hendriks explained. ‘The light shines inside the body and interacts with the tissue by being absorbed and scattered; it is then picked up by a second fibre which carries the signal back into the instrument.’ This would allow surgeons to determine the kind of tissue that they are about to treat.
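The final step Hendriks describes, turning the returned spectrum into a tissue identification, can be illustrated with a minimal sketch. The approach below, matching a measured reflectance spectrum against reference spectra by cosine similarity, is an assumption for illustration only; the reference values, wavelength bins and function names are hypothetical, not Philips’ actual method.

```python
import math

def _normalise(spectrum):
    """Scale a spectrum to unit energy so matching ignores overall intensity."""
    norm = math.sqrt(sum(v * v for v in spectrum))
    return [v / norm for v in spectrum]

def classify_tissue(measured, references):
    """Return the reference tissue whose spectrum best matches the measurement.

    `measured` is a list of reflectance values sampled at fixed wavelengths;
    `references` maps tissue names to spectra sampled at the same wavelengths.
    Similarity is the inner product of unit-normalised spectra (cosine similarity).
    """
    m = _normalise(measured)
    best, best_score = None, -1.0
    for name, ref in references.items():
        r = _normalise(ref)
        score = sum(a * b for a, b in zip(m, r))
        if score > best_score:
            best, best_score = name, score
    return best, best_score

# Hypothetical reference spectra (arbitrary units, five wavelength bins)
references = {
    "muscle": [0.2, 0.4, 0.7, 0.9, 0.8],
    "fat":    [0.9, 0.8, 0.6, 0.3, 0.2],
    "tumour": [0.5, 0.5, 0.5, 0.6, 0.9],
}

tissue, score = classify_tissue([0.85, 0.75, 0.55, 0.35, 0.25], references)
print(tissue)  # closest match among the reference spectra
```

In practice such systems use far richer spectral models of absorption and scattering, but the sketch captures the principle: the spectrum carried back by the second fibre is compared against known tissue signatures at the instrument tip.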

The miniaturisation of photonics technology is crucial to the development of smart instruments like the biopsy needle. ‘Photonics plays a very important role in image-guided surgery … this technology requires compact spectrometers and light sources with a broad spectral range,’ confirmed Hendriks.

Dr Chris Dainty, an associate at Cambridge Consultants, explained in his keynote that highly dexterous robots are being developed that will one day be able to perform operations that are currently only done by hand. Dainty noted that the company has produced such a robot for use in cataract operations, the most frequently performed surgery in the world, with 20 million carried out globally each year.

‘The whole operation currently takes about eight minutes,’ commented Dainty. ‘The surgeon has to go in and scrape away a cloudy lens that’s about 10mm in diameter and replace it with a plastic lens – this is currently performed by hand with tweezers.’ In a video shown as part of Dainty’s presentation, Chris Wagner, head of advanced surgical systems at Cambridge Consultants, explained that ‘this is an area that seems ripe for the sort of benefits robots could provide – motion scaling, tremor reduction, and image guidance to avoid sensitive tissue.’

The new 1.8mm diameter robot uses a parallel mechanism that minimises movement outside the body while granting full performance inside the body – a capability that is difficult to achieve with traditional surgical robots, which require large amounts of space to operate. ‘The new robot is small on the inside and on the outside, and is able to be integrated easily into the operation workflow,’ Wagner continued. ‘This demonstrates that robotics technology can be applied on a scale that has never been addressed before.’

The company envisions the new design being used not only in cataract and eye surgery, but also to enable a range of neurostimulator implants, early cancer diagnostics and flexible endoscopic treatments that were previously impossible because of the size of existing robotic mechanisms.

Dainty drew his keynote to a close by asking attendees to consider how photonics could be incorporated into this surgical tool, which could be used to deliver laser light, or be guided by imagers or spectroscopic sensing. ‘We need a lot of photonics to help us with this robot,’ Dainty remarked.

SuperSonic founder Dr Jacques Souquet added that the future of medicine needs to be preventative, predictive and personalised, and the continued miniaturisation of medical technology will help achieve this. In addition, the development of new battery and display technologies in the video gaming industry could also impact medicine, he said, as augmented reality and 3D imaging are already being considered for use in surgical operations.

In January Philips announced the development of its own augmented-reality surgical navigation technology, which extends the capabilities of the company’s low-dose x-ray system. The system uses high-resolution optical cameras mounted on a flat panel x-ray detector to image the surface of the patient. It then combines the external view captured by the cameras with the internal 3D view of the patient acquired by the x-ray system to construct a 3D augmented-reality view of the patient’s external and internal anatomy. This real-time 3D view of the patient’s spine in relation to the incision sites in the skin aims to improve procedure planning, surgical tool navigation and implant accuracy, as well as reducing procedure times.
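The key geometric step in such an overlay is projecting internal 3D landmarks from the x-ray volume into the optical camera image so the two views can be fused. The sketch below assumes a simple pinhole camera model with the x-ray volume already registered into the camera’s coordinate frame; the focal length, image centre and landmark coordinates are hypothetical values for illustration, not details of the Philips system.

```python
def project_point(point_3d, focal_px, principal_point):
    """Project a 3D point (camera coordinates, metres) onto the camera image.

    Pinhole model: a point (x, y, z) in front of the camera maps to pixel
    (cx + f*x/z, cy + f*y/z), where f is the focal length in pixels and
    (cx, cy) is the image centre.
    """
    x, y, z = point_3d
    u = principal_point[0] + focal_px * x / z
    v = principal_point[1] + focal_px * y / z
    return u, v

# Hypothetical camera: 1000 px focal length, image centre at (640, 512)
u, v = project_point((0.02, -0.01, 0.5), 1000.0, (640.0, 512.0))
print(round(u), round(v))  # pixel at which the internal landmark is drawn
```

Once each landmark from the x-ray-derived 3D model has an image position, the internal anatomy can be rendered on top of the live camera feed, which is the essence of the combined external/internal view described above.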

Philips' new augmented-reality surgical navigation system presents a combined 3D image of the external and internal views of a patient’s anatomy. (Credit: Philips)

Philips is working to install the new system in the hybrid operating rooms used by a network of ten clinical collaborators to advance the technology.

‘This new technology allows us to intraoperatively make a high-resolution 3D image of the patient’s spine, plan the optimal device path, and subsequently place pedicle screws using the system’s fully-automatic augmented-reality navigation,’ said Dr Skúlason of the Landspitali University Hospital, Reykjavik, Iceland. ‘We can also check the overall result in 3D in the operating room without the need to move the patient to a CT scanner. And all this can be done without any radiation exposure to the surgeon and with minimal dose to the patient.’

The results of the first pre-clinical study using the new technology have been published in the journal Spine, showing that the augmented-reality surgical navigation system can significantly increase the accuracy of pedicle screw placement in the spine.
