Computer Vision News - October 2020
To further study the difference between medical and natural images, the authors display both deep representations and attention maps, "critical regions in the input image that mostly activate the network output", computed with the Grad-CAM technique (a minimal sketch of how such maps are typically computed appears at the end of this article). The figure above shows that the attention of the DNN models is heavily disrupted by adversarial perturbations. However, the effect differs: "on natural images the attentions are only shifted to less important regions, on medical ones, the attentions are shifted from the lesion to regions that are completely irrelevant to the diagnosis of the lesion". This is an important hint to understanding why adversarial attacks are easier to detect on medical images.

Conclusions

This paper, Understanding Adversarial Attacks on Deep Learning Based Medical Image Analysis Systems, investigates the problem of adversarial attacks in deep learning based medical image analysis through a series of experiments on benchmark datasets. It leads to two main conclusions, summarised by the statements below:

1. Adversarial attacks on medical images can succeed more easily than those on natural images.
2. They can also be detected more easily.

As mentioned in the paper, studies of this type could be a "useful basis to approach the design of more explainable and secure medical deep learning systems", and it is desirable that we see many more of them in the near future in order to advance safely in this field. Works of this kind, which take a step back to analyse more carefully what we already have, are fundamental at this point. For example, this research suggests that more complex architectures, although they improve prediction performance, can also make the whole system more vulnerable to adversarial attacks. In conjunction with these powerful DNNs, researchers therefore need to focus on regularizations and training strategies that make their models' defences against adversarial attacks more robust.
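As an illustration of what such a training strategy can look like, below is a minimal sketch of adversarial training with FGSM perturbations, one common way to harden a classifier against attacks. It is not the method used in the paper: the model, data loader, epsilon budget and the assumption that inputs are scaled to [0, 1] are hypothetical placeholders.

```python
import torch
import torch.nn.functional as F

def fgsm_perturb(model, images, labels, epsilon=2.0 / 255):
    """Craft FGSM adversarial examples: one signed-gradient step on the input.

    Assumes inputs lie in [0, 1]; epsilon is a hypothetical perturbation budget.
    """
    images = images.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(images), labels)
    loss.backward()
    adv = images + epsilon * images.grad.sign()
    return adv.clamp(0.0, 1.0).detach()

def adversarial_training_epoch(model, loader, optimizer, epsilon=2.0 / 255):
    """One epoch that mixes clean and adversarially perturbed batches."""
    model.train()
    for images, labels in loader:
        # Generate the attack against the current model weights.
        adv_images = fgsm_perturb(model, images, labels, epsilon)
        optimizer.zero_grad()  # discard gradients accumulated while attacking
        loss = 0.5 * (F.cross_entropy(model(images), labels) +
                      F.cross_entropy(model(adv_images), labels))
        loss.backward()
        optimizer.step()
```

Training on perturbed copies of the data in this way is only one of many possible regularization choices; the point is simply that robustness has to be built into the training loop rather than added afterwards.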
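For readers who want to reproduce attention maps like the ones discussed above, here is a minimal Grad-CAM sketch in PyTorch. The choice of backbone (a torchvision ResNet-50) and the hooked layer are assumptions for illustration, not necessarily the models evaluated in the paper.

```python
import torch
import torch.nn.functional as F
from torchvision import models

# Hypothetical backbone; the paper's own models may differ.
model = models.resnet50(pretrained=True).eval()

activations, gradients = {}, {}

def save_activation(module, inputs, output):
    activations["value"] = output.detach()

def save_gradient(module, grad_input, grad_output):
    gradients["value"] = grad_output[0].detach()

# Hook the last convolutional block, whose feature maps Grad-CAM weights.
model.layer4.register_forward_hook(save_activation)
model.layer4.register_full_backward_hook(save_gradient)

def grad_cam(image):
    """image: a (1, 3, H, W) tensor already normalised for the backbone."""
    logits = model(image)
    class_idx = logits.argmax(dim=1).item()
    model.zero_grad()
    logits[0, class_idx].backward()

    # Average the gradients over the spatial dimensions to get one weight per
    # channel, then take a ReLU-ed weighted sum of the activation maps: the
    # "critical regions" that mostly activate the predicted class.
    weights = gradients["value"].mean(dim=(2, 3), keepdim=True)
    cam = F.relu((weights * activations["value"]).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=image.shape[2:], mode="bilinear",
                        align_corners=False)
    return (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)
```

Comparing such a map for a clean image with the map for its adversarially perturbed counterpart is, in essence, what the attention-map figure discussed above visualises.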