The Comprehensive Surgical Landscape Guidance System for Immersive Assistance in Minimally-invasive and Microscopic Interventions (COMPASS) is intended to recognize the surgeon’s navigation process and, by means of an understandable, immersive visualization and interaction, to guide the surgeon with foresight and accompany them through the surgical procedure.
Of the 15 million operations performed annually in Germany, approximately 15% are performed under endoscopic vision. Furthermore, a surgical navigation system is used in around 43,000 endoscopic procedures.
Although it is a de-facto standard in clinical care, minimally invasive surgery (MIS) still challenges surgeons and operating-room technology around the world due to a limited field of vision, the decoupling of hand-eye coordination, and the need to navigate outside the immediate field of vision. Surgeons traditionally meet these demands with training that increases their spatial orientation and their flexible adaptation to patient-specific anatomies in simulated and real surgical settings. Current navigation systems address this training state only to a limited extent and behave primarily as passive manual tools in the operating room.
The aim of the COMPASS project is, therefore, to develop an intelligent and cooperative assistance system that recognizes the surgeon’s individual navigation process and, through an understandable, immersive visualization and interaction, supports the surgeon in navigating the surgical procedure with foresight.
For this purpose, the COMPASS system integrates various functional components which, taken together, enable the comprehensive reconstruction, description, classification and support of the individual navigation task. The reconstruction is based on a patient-specific “anatomical map” extracted from stereoendoscopic imaging. On this map, relevant information such as distinctive anatomical regions, directions, or information on the surgical steps is located and adapted depending on the current in-situ orientation and position of the endoscope. The surgeon thus navigates with a constantly adapting patient-specific map. Similar to a GPS map in a car, where route planning adapts to the current traffic situation, the surgeon can interact with the navigation system according to the current surgical steps and retrieve context-sensitive information.

In addition, the system implements anatomy-related geometric measurement functions, such as wall distances or defect volumes, by means of spatial image acquisition. With the help of a documentation interface, all relevant mapping data are stored in a standardized format and made available for further use. The resulting system realizes a comprehensive and purposeful human-technology interaction for minimally invasive surgery.
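To make the measurement idea concrete, the following is a minimal sketch of one such geometric function: the shortest distance from an instrument tip to an anatomical wall reconstructed from stereoendoscopic imaging, with the wall approximated as a 3D point cloud. All names, coordinates, and the point-cloud representation are illustrative assumptions, not part of the actual COMPASS implementation.

```python
import math

def nearest_wall_distance(tip, wall_points):
    """Smallest Euclidean distance (in mm) from an instrument tip to a
    reconstructed wall, here approximated as a sparse 3D point cloud.
    A real system would measure against a dense surface mesh instead."""
    return min(math.dist(tip, p) for p in wall_points)

# Toy reconstruction: a few surface points in millimetres (illustrative only).
wall = [(0.0, 0.0, 10.0), (0.0, 5.0, 12.0), (3.0, 0.0, 15.0)]
tip = (0.0, 0.0, 0.0)
print(round(nearest_wall_distance(tip, wall), 1))  # 10.0
```

A defect-volume measurement would follow the same pattern, integrating over the reconstructed surface rather than taking a point-wise minimum.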
1. Multimodal acquisition of the endoscope view by sensor extension, 3D endoscopy and 3D reconstruction of anatomical structures. The system follows the surgeon’s receptivity.
2. Machine-understandable mapping of the surgical processes through semantic descriptions of the orientation and navigation data. The system follows the surgeon’s cognition.
3. Redefinition of the role of navigation systems with comprehensive mixed-reality visualizations of spatially- and temporally-associated anatomical maps. The system extends the surgeon’s receptivity.
4. Extension of the assistance principle through novel navigational and value-added functions based on the provided imaging and sensor technology. The system expands the surgeon’s range of actions.
5. Introduction of new interaction concepts for cooperative navigation assistance based on a situation-specific interaction management together with dynamic configuration profiles. The system reacts to the cognitive workload of the surgeon.
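The map-adaptation idea behind points 1–3 can be sketched as a simple data structure: landmarks annotated on the patient-specific map are filtered by the current endoscope pose, so only context-relevant information is shown. The class name, fields, range threshold, and visibility test below are illustrative assumptions, not the COMPASS data model.

```python
from dataclasses import dataclass

@dataclass
class Landmark:
    name: str
    position: tuple  # (x, y, z) in the patient-specific map frame, in mm
    note: str        # context-sensitive information to display

def visible_landmarks(landmarks, cam_pos, cam_dir, max_range_mm=60.0):
    """Return landmarks in front of the endoscope and within viewing range —
    a crude stand-in for adapting the map to the in-situ pose."""
    out = []
    for lm in landmarks:
        v = tuple(p - c for p, c in zip(lm.position, cam_pos))
        dist = sum(x * x for x in v) ** 0.5
        # keep only landmarks within range and on the camera's forward side
        if dist <= max_range_mm and sum(a * b for a, b in zip(v, cam_dir)) > 0:
            out.append(lm)
    return out

# Illustrative usage: one landmark ahead of the endoscope, one behind it.
lms = [Landmark("ureter", (0.0, 0.0, 40.0), "caution: close to wall"),
       Landmark("ostium", (0.0, 0.0, -40.0), "target region")]
for lm in visible_landmarks(lms, (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)):
    print(lm.name)  # prints "ureter"
```

In a full system the filter would also react to the current surgical step and to the interaction profile from point 5, rather than to geometry alone.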
Will be announced soon!