Computer Vision News - September 2022
Explainability in AI

EXPLAINABILITY WHEN AI LEARNS THE UNEXPECTED

Artificial intelligence (AI) and Deep Learning methods are pervasively used to improve decision-making processes throughout multiple industries. In the medical field, these tools assist physicians in diagnosing patients in every aspect of medicine. In Computer Vision, Deep Learning is used to train neural networks to classify medical conditions based on imaging data, across all imaging modalities.

The downside of neural networks is that they behave like a "black box". The algorithm can perform a given task accurately, but the mechanism of operation is hidden, and we do not always know what considerations lead to a prediction. This inhibits acceptance of AI-driven classifiers and raises doubts among users. The issue is particularly acute in clinical settings, where mistakes can cost human lives and physician confidence is key.

Assume a trained classifier that takes as input a video of a surgical scene. The output of this classifier is "success" if the surgery was conducted properly, or "fail" if there was an error during the procedure. Such a classifier could reduce errors that go undetected by the surgical staff by alerting in real time. However, understanding how the decision was derived may assist in preventing future errors.

To understand the decision-making process inside a neural network trained on imaging data, several approaches can be taken. Occlusion Sensitivity requires obscuring areas within the image-based inputs and learning how obscuring each area alters the final decision. This enables pinpointing the anatomical region that drives the classification. However, this method requires significant computational effort and is highly dependent on the window size. Additionally, regional occlusions may obscure larger-scale explanations that aggregate multiple inputs together.

Another method is called Grad-CAM. Implementations of Grad-CAM sample the network's gradients at some deep layer and use them to weight that layer's activation maps, producing a coarse heatmap of the image regions that contribute most to the predicted class.
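To make the Occlusion Sensitivity idea concrete, here is a minimal sketch in PyTorch. It assumes a generic image classifier and input tensor; the window size, stride, and fill value are illustrative choices, not a prescription from the article.

```python
# Occlusion Sensitivity sketch (illustrative; model/image/window are assumptions).
import torch

def occlusion_sensitivity(model, image, target_class, window=16, stride=8, fill_value=0.0):
    """Slide an occluding patch over the image and record how much the
    target-class score drops for each patch position."""
    model.eval()
    _, _, H, W = image.shape                        # image: (1, C, H, W)
    with torch.no_grad():
        base_score = model(image).softmax(dim=1)[0, target_class].item()

    rows = (H - window) // stride + 1
    cols = (W - window) // stride + 1
    heatmap = torch.zeros(rows, cols)
    for i, y in enumerate(range(0, H - window + 1, stride)):
        for j, x in enumerate(range(0, W - window + 1, stride)):
            occluded = image.clone()
            occluded[:, :, y:y + window, x:x + window] = fill_value
            with torch.no_grad():
                score = model(occluded).softmax(dim=1)[0, target_class].item()
            heatmap[i, j] = base_score - score      # large drop => important region
    return heatmap
```

The nested loop over window positions is what makes the method computationally heavy, and the choice of `window` directly controls the granularity of the resulting map, which is the dependence on window size noted above.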
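For Grad-CAM, a minimal sketch follows, assuming a CNN classifier and a chosen deep convolutional layer (for example, the last convolutional block); the function and variable names are illustrative, not taken from the article.

```python
# Grad-CAM sketch (illustrative; model, image and target_layer are assumptions).
import torch
import torch.nn.functional as F

def grad_cam(model, image, target_class, target_layer):
    """Weight the target layer's activation maps by the averaged gradients of
    the class score, yielding a coarse localization heatmap."""
    activations, gradients = {}, {}

    def fwd_hook(module, inp, out):
        activations["value"] = out.detach()

    def bwd_hook(module, grad_in, grad_out):
        gradients["value"] = grad_out[0].detach()

    h1 = target_layer.register_forward_hook(fwd_hook)
    h2 = target_layer.register_full_backward_hook(bwd_hook)

    model.eval()
    score = model(image)[0, target_class]           # forward pass, pick class score
    model.zero_grad()
    score.backward()                                # gradients reach target_layer
    h1.remove(); h2.remove()

    acts = activations["value"][0]                  # (C, h, w) feature maps
    grads = gradients["value"][0]                   # (C, h, w) gradients
    weights = grads.mean(dim=(1, 2))                # global-average-pooled gradients
    cam = F.relu((weights[:, None, None] * acts).sum(dim=0))
    cam = cam / (cam.max() + 1e-8)                  # normalize to [0, 1]
    # Upsample to input resolution so the heatmap can be overlaid on the image.
    cam = F.interpolate(cam[None, None], size=image.shape[-2:],
                        mode="bilinear", align_corners=False)[0, 0]
    return cam
```

Because the heatmap comes from a single backward pass rather than many occluded forward passes, Grad-CAM is far cheaper to compute, at the cost of a resolution limited by the chosen layer's feature-map size.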