Computer Vision News
Detecting the Sensing Area of a …

defined as the intersection point between the gamma probe axis and the tissue surface in 3D space, then projected onto the 2D laparoscopic image. Determining this point with traditional methods is not trivial: tissues lack textural definition, and per-pixel ground-truth depth data is unavailable. Acquiring the probe pose during surgery is also challenging.

To address this challenge, Baoru redefined the problem from locating the intersection point in 3D space to finding it in 2D. “The problem is to infer the intersection point between the probe axis and the tissue surface,” she continues. “To provide the sensing-area visualization ground truth, we modified a non-functional SENSEI probe by adding a DAQ-controlled cylindrical miniaturized laser module. This laser module emitted a red beam, visible as red dots on the tissue surface, to optically show the sensing area on the laparoscopic images; those dots mark the intersection point between the probe axis and the tissue surface. This way, we kept the adapted tool visually identical to the real probe, because the laser module sits inside it. We made no modification to the probe shell itself.”

Baoru’s solution involves a multi-faceted approach. First, she modified the probe. Then, she built a hardware platform for data collection and a software platform for the learning algorithm that produces the final sensing-area detection results. With this setup, the laser dot can be projected onto the tissue surface, but it is too weak compared with the laparoscope’s illumination. To solve this, she used a shutter system to control that illumination, closing the shutter when the laser is turned on and opening it when the laser is turned off. This
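The alternation between normally lit frames and laser-only ground-truth frames can be sketched as follows. This is purely an illustration of the scheme described above, not the team’s actual acquisition code: the real system drives the laser and shutter through DAQ-controlled hardware, and the frame-level alternation here is an assumption for clarity.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    index: int
    laser_on: bool      # red laser dot visible on the tissue
    shutter_open: bool  # laparoscope illumination active

def acquire_sequence(n_frames: int) -> list[Frame]:
    """Alternate normally lit frames with laser-only frames.
    The laparoscope shutter is closed exactly when the laser fires,
    so the weak red dot is not washed out by the lamp."""
    frames = []
    for i in range(n_frames):
        laser = (i % 2 == 1)  # hypothetical choice: odd frames carry the laser dot
        frames.append(Frame(i, laser_on=laser, shutter_open=not laser))
    return frames

# Ground-truth frames are those with the laser on and the shutter closed.
seq = acquire_sequence(6)
gt = [f.index for f in seq if f.laser_on and not f.shutter_open]
print(gt)  # → [1, 3, 5]
```

In this toy version the laser and shutter states are always complementary, which is the property that makes the faint red dot recoverable from the ground-truth frames.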