MICCAI 2018 Daily - Wednesday
Wednesday Oral Presentation

Closing the Calibration Loop: An Inside-out-tracking Paradigm for Augmented Reality in Orthopedic Surgery

We speak to Mathias Unberath, a postdoc at Johns Hopkins University; Jonas Hajek, a student at Friedrich-Alexander University; and Greg Osgood, an orthopaedic surgeon at Johns Hopkins University, about their joint work (left to right in the photo).

Their work is a flexible and dynamic calibration between a C-arm used interventionally and an augmented reality environment that provides virtual content to the surgeon through an optical see-through head-mounted display. Mathias tells us that the novelty is a completely marker-less, radiation-free and dynamic tracking paradigm that very effectively closes the calibration loop between the patient, the imaging modality, and the surgeon.

The main challenge is the calibration of the inside-out tracker attached to the C-arm relative to the C-arm camera, because the sensors have no overlap in their fields of view. Jonas explains how they solved this: “We used hand-eye calibration between the sensor on the C-arm and the X-ray source. With the help of cone-beam CT data, we had access to a known trajectory of the X-ray source and could therefore calibrate the tracker on the C-arm to the X-ray source.”

Mathias adds: “Essentially, one of the biggest challenges in providing augmented reality visualisation to the surgeon in image-guided procedures nowadays is how to elegantly and flexibly calibrate this environment to the imaging modality, such that we can in fact link these two different domains. This system that we have developed here is very elegant, because if we have cone-beam CT interventionally, then we do not actually …”

They invite anyone who wants to play with HoloLens … to come and play
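For readers who want to experiment with the hand-eye calibration step Jonas describes, here is a minimal, hypothetical Python sketch that poses it as the classic AX = XB problem and solves it with OpenCV's calibrateHandEye. The function name, frame conventions, and the choice of OpenCV are our own illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of hand-eye calibration between an inside-out tracker and
# an X-ray source (AX = XB). Assumptions, not from the article: poses are 4x4
# homogeneous matrices, tracker poses come from the tracker's own localisation,
# and source poses come from the known cone-beam CT orbit.

import numpy as np
import cv2


def calibrate_tracker_to_source(tracker_poses, source_poses):
    """Estimate the fixed transform X between the inside-out tracker and the
    X-ray source from paired pose sequences.

    tracker_poses: list of 4x4 tracker-to-world matrices (one per C-arm position)
    source_poses:  list of 4x4 volume-to-source matrices from the known CBCT orbit
                   (the "target-to-camera" convention OpenCV expects)
    """
    R_tracker, t_tracker, R_source, t_source = [], [], [], []
    for T_t, T_s in zip(tracker_poses, source_poses):
        R_tracker.append(T_t[:3, :3]); t_tracker.append(T_t[:3, 3])
        R_source.append(T_s[:3, :3]);  t_source.append(T_s[:3, 3])

    # The tracker plays the role of the robot "gripper", the X-ray source that
    # of the "camera"; OpenCV returns the camera-to-gripper transform, i.e. the
    # fixed source-to-tracker transform X we are after.
    R_x, t_x = cv2.calibrateHandEye(
        R_gripper2base=R_tracker, t_gripper2base=t_tracker,
        R_target2cam=R_source,    t_target2cam=t_source,
        method=cv2.CALIB_HAND_EYE_TSAI)

    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_x, t_x.ravel()
    return X
```

As with any hand-eye calibration, the pose pairs should cover a well-spread set of C-arm orientations; a single short arc constrains the unknown transform poorly.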