Medical Robotics, Visualization and Navigation

Group Leader: Professor Ole Jakob Elle

Research aims

Most minimally invasive procedures restrict access and direct vision to the regions that require surgery. Such procedures rely on intra-operative imaging modalities such as X-ray, ultrasound or endoscopy to monitor the procedure in real time. In many cases this information is not sufficient to perform the procedure accurately and safely. Merging information acquired pre-operatively, for instance from MRI, CT or PET, with intra-operative data can broaden the basis for decisions and thereby improve the safety and accuracy of the procedure.

The Medical Robotics, Visualization and Navigation group develops cutting-edge technological solutions that support minimally invasive procedures. In particular, the group focuses on developing real-time image segmentation and registration methods, including artificial intelligence (AI) based methods for segmentation and deformable registration. Visualization and navigation are required to present the medical images to the surgeon intra-operatively. Different visualization techniques, such as Mixed Reality (MR) visualization, are explored. This field of research is now focused on the use of HoloLens or other 3D goggles for visualizing 3D anatomical organ models for anatomical education, surgical planning, interventional guidance and communication among clinicians as well as with patients and relatives. 3D video will increasingly be cross-linked with medical image information, as in Augmented Reality (AR), and the field is moving toward robotics and automation of surgical procedures.

The research group works in all these fields of technology facilitating minimally invasive surgery. This includes the development of new monitoring technology, e.g. accelerometer and gyroscope sensors with advanced signal processing for detecting changes in heart condition, as well as being at the forefront of using AI and developing and applying machine learning algorithms for automation and decision support within patient monitoring and image processing/navigation.
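
As an illustration of the pre-operative to intra-operative image fusion described above, the following is a minimal sketch of rigid, intensity-based registration using the open-source SimpleITK toolkit. It is not the group's actual pipeline: the file names are hypothetical, the toolkit choice is ours, and only a rigid transform is shown, whereas the group also works on deformable and AI-based registration.

    import SimpleITK as sitk

    # Hypothetical file names; any two 3-D volumes readable by SimpleITK will do.
    fixed = sitk.ReadImage("intraop_volume.nii.gz", sitk.sitkFloat32)   # intra-operative image
    moving = sitk.ReadImage("preop_ct.nii.gz", sitk.sitkFloat32)        # pre-operative CT/MRI

    # Initialise with a centred rigid (6 degrees of freedom) transform.
    initial = sitk.CenteredTransformInitializer(
        fixed, moving, sitk.Euler3DTransform(),
        sitk.CenteredTransformInitializerFilter.GEOMETRY)

    reg = sitk.ImageRegistrationMethod()
    reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)  # multi-modal similarity metric
    reg.SetMetricSamplingStrategy(reg.RANDOM)
    reg.SetMetricSamplingPercentage(0.1)
    reg.SetInterpolator(sitk.sitkLinear)
    reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=200,
                                      convergenceMinimumValue=1e-6, convergenceWindowSize=10)
    reg.SetOptimizerScalesFromPhysicalShift()
    reg.SetInitialTransform(initial, inPlace=False)

    transform = reg.Execute(fixed, moving)

    # Resample the pre-operative volume into the intra-operative frame for overlay and navigation.
    aligned = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0, moving.GetPixelID())
    sitk.WriteImage(aligned, "preop_registered.nii.gz")

After registration, the aligned pre-operative model can be overlaid on the intra-operative view for navigation or AR visualization.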

Main research areas:

  • Develop new building blocks for navigation technology in different surgical disciplines such as laparoscopic liver resection, neurosurgery and catheter-based interventions
  • Develop artificial intelligence (AI) methods to automate processes and build decision support systems
  • Develop Extended Reality visualization, including Mixed Reality, Augmented Reality and Virtual Reality, for planning, guidance and communication
  • Develop robotic technology, ranging from haptic feedback and augmented reality in tele-surgical systems to semi-autonomous and autonomous AI-driven robotic systems for support in hybrid operating rooms
  • Develop advanced remote support solutions that bring experts to rural areas nationally and internationally, and support patient follow-up with monitoring in a new home-hospital setting
  • Develop biomedical modelling of organs such as the heart and liver, using advanced mathematical models such as the finite element method (FEM) to describe tissue properties and flow patterns for prediction and simulation
  • Develop new monitoring technology, e.g. accelerometer and gyroscope sensors, in addition to data from vital-signs monitoring, using advanced signal processing and AI for detection and prediction of changes in heart condition (a minimal signal-processing sketch follows this list)
  • Explore emerging research areas such as targeted treatment, new imaging techniques and microtechnology
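
To make the monitoring item above concrete, below is a minimal sketch of how cardiac motion might be extracted from a single accelerometer axis by band-pass filtering and peak detection, using SciPy. The sampling rate, frequency band and the synthetic input signal are assumptions for illustration only and do not represent the group's algorithms.

    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    fs = 500.0  # assumed sampling rate (Hz) of the accelerometer channel

    # Synthetic stand-in for one axis of a chest-mounted accelerometer (~72 beats/min plus noise).
    t = np.arange(0, 10, 1 / fs)
    acc = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.random.randn(t.size)

    # Band-pass filter around typical cardiac motion frequencies to suppress drift and noise.
    b, a = butter(4, [0.5, 20.0], btype="bandpass", fs=fs)
    acc_filt = filtfilt(b, a, acc)

    # Simple beat detection on the filtered motion signal.
    peaks, _ = find_peaks(acc_filt, distance=int(0.4 * fs), prominence=0.5)
    rate = 60.0 * fs / np.median(np.diff(peaks))
    print(f"Estimated rate: {rate:.1f} beats/min")

In practice such features would feed the AI and machine-learning methods mentioned above for detection and prediction of changes in heart condition.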

Contact information:
Group leader: Associate Professor Ole Jakob Elle, PhD
The Intervention Centre, Oslo University Hospital, Sognsvannsveien 20, D6, 3rd floor
Tel: 23070112 / 91171790
E-mail: oelle@ous-hf.no / oleje@ifi.uio.no