Our team followed the recent Hamlyn Symposium with great interest. In particular, we enjoyed the excellent presentation by Professor Laura Marcu from UC Davis on fluorescence imaging during surgery and its use as a surgical aid.
Fluorescence imaging is very widely used in microscopy: a specific dye is injected into the specimen, and upon chemical interaction with specific target molecules the dye “shines” at a specific wavelength. This is used to detect the presence and quantity of the target molecule. It is also used for cancer detection: you can design a fluorescent molecule that binds to cancer cells, so that when injected it attaches to the tumor; fluorescence imaging of the sample then highlights the tumor.
This works very well in lab conditions, but during surgery it is more challenging: the scene is recorded continuously, the illumination spans all wavelengths, and a dedicated detector is needed for the specific wavelength of interest. The system presented enables fluorescence imaging during surgery and can locate the tumor even when it lies slightly beneath the visible surface.
This is very similar to a common practice in GI surgery: indocyanine green (ICG), a fluorescent molecule used to stain blood, tumors, or lymph nodes. By detecting the specific wavelength emitted by the molecule after it is injected into the bloodstream, you can see where the blood vessels are and avoid cutting them unnecessarily. It could potentially also be used for tumor detection, since tumor areas are more vascularized.
The surgeon can use dedicated imaging equipment to detect the fluorescent dye in use, or even gamma radiation for a similar purpose: detecting a molecule that binds to the target tissue in the body.
This solution is not simple: it requires special hardware, a second camera, and more. The proposed system provides real-time imaging of the surgery in addition to the fluorescence imaging, and its AI components register the fluorescence image to the real-time video acquired during the procedure.
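As a rough illustration of what such a registration step might look like, here is a minimal sketch in Python using OpenCV. It assumes the fluorescence frame and the white-light video frame are already grayscale images of comparable scale; the feature-based approach (ORB keypoints plus a RANSAC homography) is our own simplification for the example, not the method presented at the symposium.

```python
import cv2
import numpy as np

def register_fluorescence_to_video(fluo_gray, video_gray):
    """Estimate a homography mapping the fluorescence frame onto the video frame.

    A simplified, feature-based sketch: ORB keypoints + brute-force matching
    + RANSAC homography. A clinical system would likely use a learned or
    multimodal registration instead.
    """
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(fluo_gray, None)
    kp2, des2 = orb.detectAndCompute(video_gray, None)
    if des1 is None or des2 is None:
        return None

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]
    if len(matches) < 4:
        return None  # not enough correspondences to fit a homography

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H

# Usage: warp the fluorescence frame into the video frame's coordinate system.
# H = register_fluorescence_to_video(fluo_gray, video_gray)
# aligned = cv2.warpPerspective(fluo_gray, H, (video_gray.shape[1], video_gray.shape[0]))
```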
The green spot at the center of the image is the fluorescence signal overlaid on the regular real-time video. The surgeon now knows where the tumor is and can decide what needs to be cut out and how to execute that specific step. The same can be done with ICG: the ICG image could be overlaid onto the real-time feed so that the blood vessels in the field of view are visible and can be avoided. The same concept could be applied to gamma-radiation views, which can likewise be overlaid on the real-time view as a surgical aid.
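Once the auxiliary frame has been warped into the video frame's coordinates, the overlay itself can be as simple as tinting the pixels where the signal exceeds a threshold. A minimal sketch follows; the threshold, blending weight, and green tint are arbitrary choices for illustration.

```python
import cv2
import numpy as np

def overlay_fluorescence(video_bgr, aligned_fluo_gray, threshold=64, alpha=0.5):
    """Blend a green tint into the video wherever the aligned fluorescence
    signal exceeds `threshold`. All parameter values are illustrative only."""
    mask = aligned_fluo_gray > threshold
    green = np.zeros_like(video_bgr)
    green[..., 1] = 255  # pure green in BGR
    blended = cv2.addWeighted(video_bgr, 1 - alpha, green, alpha, 0)
    overlay = video_bgr.copy()
    overlay[mask] = blended[mask]  # tint only the pixels above threshold
    return overlay
```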
A related technology is hyperspectral imaging: with the appropriate hardware in place, images can be acquired at many different wavelengths and used to detect blood vessels hidden beneath the tissue surface. Overlaying that onto the real-time images from the surgery gives the surgeon valuable information about blood vessels, or about a cluster of vessels that may indicate a tumor, and more.
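To illustrate how a hyperspectral cube might be reduced to a single vessel-contrast map before overlay, here is a small NumPy sketch. The band indices and the normalized-difference formulation are assumptions made for the example, not parameters of any specific device or of the work discussed above.

```python
import numpy as np

def vessel_contrast_map(hsi_cube, band_a=25, band_b=60, eps=1e-6):
    """Reduce a hyperspectral cube (H x W x bands) to a single contrast map.

    Uses a normalized difference between two bands where blood absorption
    is assumed to differ; the band indices are placeholders.
    """
    a = hsi_cube[..., band_a].astype(np.float32)
    b = hsi_cube[..., band_b].astype(np.float32)
    ndi = (a - b) / (a + b + eps)  # normalized-difference index
    # Rescale to [0, 255] so it can be overlaid like a fluorescence frame.
    ndi = (ndi - ndi.min()) / (np.ptp(ndi) + eps)
    return (ndi * 255).astype(np.uint8)
```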
Our recommendation, once the specific hardware is in place, is to use deep learning algorithms to overlay the assisting images onto the real image in real time, with no delay, providing the surgeon an optimal viewpoint and optimal conditions to perform the surgery.
These different modalities are currently used separately; nobody combines them. In a way, hyperspectral even competes with ICG. But from an AI point of view, bringing each of these modalities (including fluorescence imaging) into the surgery is a similar concept: it requires taking an image that is not in the video feed and overlaying it onto the video feed. In addition to the fusion and registration, we also think that image enhancement could and should be performed at some point in the process. Traditional image enhancement methods can be combined with advanced deep learning algorithms; this should be done in the background, prior to the actual registration with the real-time image.
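As a small example of the traditional side of such an enhancement step, the sketch below denoises and contrast-stretches the auxiliary frame with CLAHE before it is handed to the registration stage; a learned enhancement network could be slotted in at the same point. The parameter values are illustrative assumptions.

```python
import cv2

def enhance_before_registration(frame_gray):
    """Classical pre-processing of the auxiliary (e.g. fluorescence) frame:
    denoise, then apply CLAHE contrast enhancement. Parameters are illustrative."""
    denoised = cv2.fastNlMeansDenoising(frame_gray, None, 10)  # filter strength h=10
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(denoised)
```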
It is obviously important to have access to relevant data. ICG is widely used, including in robotic surgery, so ICG data should be as accessible as any other medical imaging data. Hyperspectral and fluorescence imaging are not as common, and their data is therefore harder to obtain; gamma is further down the line. We must keep an eye on all of these modalities, in case any of them catches up with ICG. Regardless, RSIP Vision’s recommendation is to take the first step with standard ICG image registration onto the video feed. Once this is tailor-developed and ready to go, it can be adapted to other procedures and modalities.