A Usability Analysis of Augmented Reality and Haptics for Surgical Planning

CARS 2024 Presentation by Negar Kazemipour

Negar Kazemipour graduated with a master's degree in computer science from Marta Kersten-Oertel's Applied Perception Lab at Concordia University in Montreal, Canada. Her master's research examined how mixed reality (MR) and haptics can be used in surgical planning. She currently works as a software engineer at Zimmer Biomet, developing MR surgical applications. She recently presented the results of her research at the Computer Assisted Radiology and Surgery (CARS) 2024 Conference, and the work was also published in IJCARS.

For decades, clinicians have used 2D slices of medical images to analyze patients' anatomy and plan surgeries. Progress in computer graphics has allowed clinicians to benefit from 3D visualizations, which provide a more comprehensive and intuitive representation of patient data. However, interacting with 3D patient data through 2D screens and input systems is challenging. To provide better spatial understanding of anatomical data and more degrees of freedom in the input system, devices such as haptic force feedback tools and MR headsets are increasingly being studied for surgical planning. Yet the usability of these technologies in different surgical planning contexts remains largely unexplored. In this work, we compared surgical planning using a Touch X force feedback haptic device and a Microsoft HoloLens 2 with a conventional planning system using a 2D monitor, mouse, and keyboard.