IMHOTEP: virtual reality framework for surgical applications

Pfeiffer, M., Kenngott, H., Preukschas, A. et al. IMHOTEP: virtual reality framework for surgical applications. Int J CARS (2018).

The data available to surgeons before, during and after surgery is steadily increasing in both quantity and diversity. When planning a patient’s treatment, this large amount of information can be difficult to interpret. To aid in processing the information, new methods are needed to present multimodal patient data, ideally combining textual, image, temporal and 3D data in a holistic and context-aware system.

We present an open-source framework which allows handling of patient data in a virtual reality (VR) environment. By using VR technology, the workspace available to the surgeon is maximized and 3D patient data is rendered in stereo, which increases depth perception. The framework organizes the data into workspaces and contains tools which allow users to control, manipulate and enhance the data. Due to the framework’s modular design, it can easily be adapted and extended for various clinical applications.

The framework was evaluated by clinical personnel (77 participants). The majority of the group stated that a complex surgical situation is easier to comprehend using the framework, and that it is very well suited for education. Furthermore, applying the framework to various clinical scenarios, including the simulation of excitation propagation in the human atrium, demonstrated its adaptability. As a feasibility study, the framework was used during the planning phase of the surgical removal of a large central carcinoma from a patient’s liver.

The clinical evaluation showed large potential and high acceptance for the VR environment in a medical context. The various applications confirmed that the framework is easily extended and can be used for real-time simulation as well as for the manipulation of complex anatomical structures.

IMHOTEP architecture overview. The framework builds on the Unity3D engine. It uses ITK to parse DICOM data, reads Blender3D surface meshes and communicates with VR hardware through OpenVR. The various data types can be manipulated by tools and visualized using various methods. Demanding tasks can be outsourced to separate threads. Asynchronous tasks can communicate across modules by firing events. Tested with Unity3D Version 2017.2, SimpleITK Version 1.0.1, OpenVR/SteamVR Version 1.2.1 and Blender3D Version 2.78.
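The caption describes two architectural patterns: outsourcing demanding tasks to separate threads, and letting asynchronous tasks communicate across modules by firing events. The following is a minimal language-agnostic sketch of how these two patterns combine (shown here in Python for brevity; the framework itself is built in Unity3D/C#). All names here (`EventBus`, `run_async`, the "MeshLoaded" event) are illustrative assumptions, not the framework's actual API.

```python
import threading

class EventBus:
    """Small publish/subscribe hub: modules subscribe to named events
    and are notified when another module fires that event."""

    def __init__(self):
        self._listeners = {}

    def subscribe(self, event_name, callback):
        self._listeners.setdefault(event_name, []).append(callback)

    def fire(self, event_name, payload=None):
        # Notify every subscriber of this event, passing the payload along.
        for callback in self._listeners.get(event_name, []):
            callback(payload)

def run_async(bus, event_name, task, *args):
    """Run a demanding `task` on a separate worker thread; when it
    finishes, fire `event_name` so other modules can react."""
    def worker():
        result = task(*args)
        bus.fire(event_name, result)
    t = threading.Thread(target=worker)
    t.start()
    return t
```

A module loading a surface mesh could then do `run_async(bus, "MeshLoaded", load_mesh, path)`, while the visualization module independently calls `bus.subscribe("MeshLoaded", display_mesh)`; neither module needs a direct reference to the other. Note that in a real Unity3D application, the callback would typically have to be marshalled back onto the main thread before touching scene objects, a detail this sketch omits.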