Computer Vision News - August 2020

Challenge: SARAS

The €4.3m Smart Autonomous Robotic Assistant Surgeon (SARAS) project is a consortium of nine partners led by the University of Verona in Italy. We speak to Fabio Cuzzolin, Professor of Artificial Intelligence and Director of the Visual Artificial Intelligence Lab at Oxford Brookes, about his work on SARAS and find out more about the SARAS endoscopic vision challenge for surgeon action detection (SARAS-ESAD).

The aim of SARAS is to introduce full autonomy into surgical robotics for the first time, replacing the role of the assistant surgeon in laparoscopy. It focuses on two specific procedures: radical prostatectomy and nephrectomy. Oxford Brookes University is in charge of the cognitive and AI components, such as recognising what the surgeon is doing from endoscopic videos, predicting future surgeon actions, and understanding the surgical scene by segmenting it into organs and blood, for example.

SARAS is developing three platforms. In the first platform, the main surgeon and the assistant surgeon work in a shared environment by means of a telerobotics set-up. The second platform is about performing radical prostatectomy with the main surgeon operating entirely through a giant da Vinci robot and the SARAS arms playing the role of the assistant surgeon in a fully autonomous way; the surgeon is able to issue verbal commands to stop the arms or direct them if necessary. The third platform is about validating nephrectomy in smaller hospitals: the main surgeon operates manually using standard tools, not the da Vinci robot, and the SARAS arms again work fully autonomously to assist the main surgeon.

"One of the main tasks is to recognise from the endoscopic video what the surgeon is doing with their tools," Fabio explains.
"We at Oxford Brookes in the Visual AI Lab are experts in so-called action detection, which is detecting human actions from streaming video online."
