Computer Vision News - August 2021

MIDL Best Paper

Beyond Pixel-Wise Supervision: Semantic Segmentation with Higher-Order Shape Descriptors

Hoel Kervadec is a postdoctoral researcher at CRCHUM in Montréal. His work proposing a new approach to semantic segmentation has just won the coveted Best Paper award at MIDL 2021, which recognizes the highest-quality full-length paper presented at the conference. We speak to him to find out more about it.

In this paper, Hoel and the team explore a way to supervise a network by describing where an object of interest should be and what shape it should have, instead of micromanaging the network output by telling it the label of each pixel in an image. Existing loss functions used to train networks, like cross-entropy or Dice loss, treat image segmentation as independent pixel-wise classification and do not take the image space into account. You could shuffle the pixels and get the same computed value, because it is not affected by the shape of the object.

"We compute the location of the predicted segmentation and some descriptors of its shape," Hoel explains. "Supervision based on a description of the object, rather than the pixels, can be reused more easily across scans or patients in the future. However, with pixel-wise supervision for a specific scan, you might need to redo the complete annotation."

This opens up new ways to supervise networks and use labels. For example, one could directly define a problem by encoding anatomical information about it, rather than relearning what is already known through pixel-wise annotation.

The work has been three years in the making and not without its challenges. "It is an extension of our previous work on constrained deep networks, but it took some time to reframe the problem," Hoel tells us. "We went back and forth between a few papers, which were ultimately all related to the same application and methodologies. We needed to improve on the supervision method with the constraints, and it took us some time."
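To make the idea concrete, here is a minimal NumPy sketch of supervising a soft segmentation map through shape descriptors rather than per-pixel labels. It is an illustrative toy, not the authors' implementation: the particular descriptors (soft object size and centroid) and the hand-picked size bounds and target location are assumptions chosen for this example.

```python
import numpy as np

def shape_descriptors(prob):
    """Descriptors of a soft foreground probability map: the 'soft size'
    (sum of probabilities, i.e. an expected pixel count) and the centroid."""
    h, w = prob.shape
    size = prob.sum()
    ys, xs = np.mgrid[0:h, 0:w]
    cy = (prob * ys).sum() / max(size, 1e-8)  # centroid row
    cx = (prob * xs).sum() / max(size, 1e-8)  # centroid column
    return size, (cy, cx)

def descriptor_loss(prob, size_bounds, centroid_target):
    """Penalize a prediction whose size falls outside [lo, hi] or whose
    centroid drifts from a rough target location -- no pixel labels needed.
    Bounds and target here are illustrative prior knowledge, not learned."""
    size, (cy, cx) = shape_descriptors(prob)
    lo, hi = size_bounds
    size_pen = max(0.0, lo - size) ** 2 + max(0.0, size - hi) ** 2
    loc_pen = (cy - centroid_target[0]) ** 2 + (cx - centroid_target[1]) ** 2
    return size_pen + loc_pen

# A 2x2 blob centred at (3.5, 3.5) in an 8x8 map satisfies the prior exactly,
# so the loss is zero; shrinking the allowed size range makes it positive.
pred = np.zeros((8, 8))
pred[3:5, 3:5] = 1.0
print(descriptor_loss(pred, (2.0, 6.0), (3.5, 3.5)))
```

Note how the loss only "sees" the prediction through a few scalars, which is why shuffling pixels would change it (the centroid moves) even though cross-entropy or Dice would not care: that is the sense in which these descriptors bring the image space back into the supervision signal.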
