Surgery is all about precision. Physicians need extensive experience and specialist knowledge to navigate all the tasks they must perform during an operation. In the future, multisensory surgical robots with visual, auditory, and haptic sensors are expected to take on individual steps during operations – largely autonomously – and support the surgeons providing treatment. Researchers from the University of Zurich are working on this, alongside clinicians, as part of the “FAROS” project.
Let us imagine a situation in the near future – 2031, say. A patient at Balgrist University Hospital is due to undergo spinal surgery. She is lying on the operating table, and the anesthesia has taken effect without any problems. Now it is the surgical team’s turn. Besides the experienced physician, three robot arms are angled over the patient; they will take on part of the surgical work. The robots are made of rigid material, with ball joints providing the necessary agility. One has a small multifunctional tool attached to its so-called end effector at the front, allowing it to saw, cut, or drill. This robot arm interacts with the two others, which take on further tasks. Hyperspectral cameras integrated into the surgical light provide detailed images that the human eye cannot perceive.
As well as being highly precise and intelligent, the surgical robots feature numerous visual and non-visual sensors. With the help of these sensors, the robots can measure electrical and mechanical resistance when drilling through bone and deduce from this which body structure (bone or soft tissue) they are working on. Ultrasound helps with locating anatomical features and instruments and provides the surgeon with images in real time. It is important not to damage any nerves or vital organs, so extremely high-precision work is required. The robot can take on certain substeps of the operation where maximum precision is called for, making it a valuable assistant.
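To illustrate the idea, here is a minimal sketch of how a resistance reading might be mapped to a tissue class. The function name, units, and thresholds are invented for illustration – a real system such as the one FAROS is developing would calibrate against patient- and tool-specific data.

```python
# Illustrative sketch: map a mechanical-resistance reading taken while
# drilling to a coarse tissue class. The thresholds below are invented
# for illustration, not taken from the FAROS project.

def classify_structure(mechanical_resistance_n: float) -> str:
    """Map a mechanical-resistance reading (in newtons) to a tissue class."""
    if mechanical_resistance_n > 40.0:   # hard outer layer of bone
        return "cortical bone"
    if mechanical_resistance_n > 15.0:   # spongy inner bone
        return "cancellous bone"
    return "soft tissue"                 # low resistance: slow down or stop

# A drill passing from hard bone into soft tissue sees falling resistance.
for reading in [55.2, 48.1, 22.7, 18.3, 6.4]:
    print(f"{reading:5.1f} N -> {classify_structure(reading)}")
```

In practice such a hard-threshold rule would be far too crude on its own, which is why the project combines several sensor modalities and machine learning rather than fixed cut-offs.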
«It is important that the people developing the robots know what we surgeons need.»
Mazda Farshad, Medical Director of Balgrist University Hospital
Back to the present. Robots are already being used in the operating theater, although these are guided by visual information alone. “Modern imaging is very helpful when planning for an operation. But if something unexpected happens, the surgeon is reliant on their own skill and also often on their own sense of touch,” explains Mazda Farshad. He is a spinal specialist, Medical Director of Balgrist University Hospital, and a professor in the Faculty of Medicine at the University of Zurich.
Farshad is working with Philipp Fürnstahl, a basic research specialist, on new robots that can be used as assistants in the operating theater. Fürnstahl is a professor of Orthopedic Research specializing in the application of computer technologies at the University of Zurich. Both academics belong to the “FAROS” project team, which is funded by the European Union’s Horizon 2020 research and innovation program. “FAROS” (Functionally Accurate Robotic Surgery) was launched on January 1, 2021, and is set to run for three and a half years. Besides the University of Zurich, the other institutions involved are KU Leuven in Belgium, Sorbonne University in France, and King’s College London in England.
Balgrist University Hospital has taken on clinical, experimental, and interdisciplinary tasks and helps link up the areas of robotics, informatics, and clinical research. “We were fortunate that FAROS was approved before Switzerland was excluded from Horizon Europe,” points out Fürnstahl. The project received just short of 3 million euros from the European Commission over three years as part of Horizon 2020. Farshad stresses the value of collaboration with other universities: “We have pooled skills, avoided duplication, and work well together. It all flows very nicely.” The close collaboration between basic research specialists and clinicians at Balgrist itself is also going extremely well, since the hospital and the research building are located directly next to each other. “It is important that the people developing the robots know what we surgeons need. This is the only way to avoid producing things that prove useless or even dangerous in practice,” says Farshad.
The FAROS project, as described above, is concerned with developing surgical robots that – to all intents and purposes – can see, hear, and feel. This is made possible by numerous visual and non-visual sensors and the use of artificial intelligence (AI). For example, the robots learn how to hear and feel by analyzing vibroacoustic signals. Contact microphones play a part in this, and are applied to patients’ skin, where they measure the resonance of sound waves in their body. “We are currently validating the methods used in studies with tissue samples or human specimens,” says Fürnstahl. After many attempts, the robots learn, for example, when the drill leaves the hard layer of bone and starts to hit soft tissue. “The heterogeneous nature of bone or tissue alone means we need to gather lots of data as these structures are different in men, women, and children, and can also vary depending on age,” continues Fürnstahl.
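One simple way to picture what the robots learn from vibroacoustic signals is breakthrough detection: the vibration energy measured by a contact microphone drops sharply when the drill leaves hard bone and enters soft tissue. The sketch below illustrates that idea with a sliding-window RMS check; the function, window size, and threshold are invented for illustration and stand in for the learned models the project actually trains.

```python
# Illustrative sketch: detect the moment a drill leaves hard bone by
# watching for a sustained drop in the RMS energy of a vibroacoustic
# signal. Window size and drop ratio are invented for illustration.
import math

def rms(window):
    """Root-mean-square amplitude of a window of samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def detect_breakthrough(signal, window=4, drop_ratio=0.5):
    """Return the index where windowed RMS falls below drop_ratio times
    the running baseline, or None if no breakthrough is detected."""
    baseline = None
    for i in range(window, len(signal) + 1):
        energy = rms(signal[i - window:i])
        if baseline is None:
            baseline = energy
            continue
        if energy < drop_ratio * baseline:
            return i - window        # first sample of the quiet window
        baseline = max(baseline, energy)
    return None

# Strong vibration while drilling bone, then a sudden drop in soft tissue.
signal = [0.9, 1.1, 1.0, 0.95, 1.05, 0.2, 0.15, 0.1, 0.12]
print(detect_breakthrough(signal))
```

A fixed threshold like this would not cope with the anatomical variation Fürnstahl describes, which is exactly why the project gathers large amounts of data and uses AI instead of hand-tuned rules.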
«The OR-X offers ideal conditions for testing robots.»
Philipp Fürnstahl, Professor of Orthopedic Research
“Balgrist offers ideal conditions for testing robots, as we opened a state-of-the-art research and teaching center for surgery – known as OR-X – in August 2023,” explains Fürnstahl. OR-X consists of a fully fledged research-focused operating theater as well as several training laboratories where all manner of surgeries can be performed. “This is also very significant for future generations of clinicians,” argues Fürnstahl. “They get to learn in real conditions – including with the help of robots.” (See below for more about OR-X).
Researchers are able to practice with the robots on substeps that require a high level of precision – such as minimally invasive interventions. These are operations performed through an extremely small incision. A small camera (called an endoscope) and instruments are then introduced into the patient via this incision. Normally, the surgeon has to guide the endoscope and instruments at the same time. In FAROS, the robot is responsible for guiding the endoscope, allowing the surgeon to devote all their attention to the instruments.
Will robots be able to perform operations themselves in the future? “A standard operation consists of around 300 steps, with difficult operations involving 700 to 800 steps,” explains Farshad. During the operation, the physician is constantly analyzing the situation, which is something robots cannot yet do. Experts agree that technical support for surgeons makes sense and will be integral to what happens in the operating theater – there is definitely no going back. But they are also sure that robots will be restricted to an assistant-type role in the short term. “They will improve the way that physicians work and make this more precise, but the surgeon will still be irreplaceable in the operating theater of the future,” confirms Farshad.
With its 10th operating theater «OR-X», Balgrist University Hospital has created an innovative surgical research and teaching center, which has been operational since August 2023. The translational platform offers researchers and developers the opportunity to test and further develop new technologies and innovations like surgical robots. Students and physicians will be able to improve their practical surgical skills at OR-X as part of training and continuing education courses.
Endoscope: A medical device used to examine and visualize the inside of the body. It consists of a flexible or rigid tube fitted with a light source and a camera. Endoscopes provide views of the inside of the body without any need for major surgical interventions.
Haptic Sensor: A technological device that captures information through touch, pressure, and other forms of physical contact, enabling interaction with the environment.
Hyperspectral: Refers to the ability of cameras to pick up light in many different spectral ranges (colors) that are not normally visible to the human eye.