Computer Vision News - July 2022
MedTech Research

used for semantic representation. It shows how a network without residual connections can fail to learn features correctly and how to remedy the issue. But let's now analyse the specific elements of the proposed deep learning network.

Batch Normalization (BN) is used to expedite training. It is a method that makes the training of artificial neural networks faster and more stable by normalizing the layers' inputs, re-centering and re-scaling them. A rectified linear unit (ReLU) does not squeeze the input value, which helps minimize the effect of the vanishing gradient problem. Dropout is a strategy often used with supervised learning algorithms. Specifically, it is a regularization technique for reducing overfitting in artificial neural networks by preventing complex co-adaptations on the training data. Global average pooling performs an extreme dimensionality reduction, collapsing each entire feature map to a single value, which also helps reduce the effect of overfitting.

The model starts with a traditional convolutional layer followed by a BN layer and a ReLU layer. This is followed by eight blocks of parallel convolutional layers with four distinct filter sizes (1x1, 3x3, 5x5, and 7x7). The outputs of these four layers are concatenated and fed into the following block to create the final model. The convolutional layers in MedNet are followed by BN and ReLU layers.

Just a reminder here: for different applications, such as remote sensing, a fully connected dense connectivity pattern has been proposed, shown below.

Back to MedNet though. There are twelve connections between the blocks of the convolutional layers. These connections preserve the model's ability to maintain different levels of features for the purpose of achieving a better representation of them. Both the parallel convolutions and the connections are extremely important for gradient propagation.
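To make the global-average-pooling step concrete, here is a tiny NumPy sketch (an illustration of the general operation, not the paper's code): each HxW feature map is collapsed to a single scalar, so a tensor of shape (H, W, C) becomes a C-dimensional vector.

```python
import numpy as np

def global_average_pool(feature_maps):
    """Collapse each HxW feature map to one scalar: (H, W, C) -> (C,)."""
    return feature_maps.mean(axis=(0, 1))

# Example: 4x4 spatial grid with 5 channels reduces to a 5-vector.
x = np.ones((4, 4, 5))
v = global_average_pool(x)
print(v.shape)  # (5,)
```

Because the output no longer depends on the spatial size, this layer also lets the network accept inputs of varying resolution before the final classifier.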
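The multi-scale parallel block described above can be sketched in plain NumPy (a minimal illustration, not the authors' implementation): four 'same'-padded convolutions with kernel sizes 1, 3, 5, and 7 run on the same input, each followed by a ReLU, and their outputs are concatenated along the channel axis. The weights here are random placeholders, and batch normalization is omitted for brevity.

```python
import numpy as np

def conv2d_same(x, w):
    """Naive 'same'-padded 2D convolution.
    x: (H, W, C_in), w: (kh, kw, C_in, C_out) -> (H, W, C_out)."""
    H, W, _ = x.shape
    kh, kw, _, c_out = w.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw), (0, 0)))
    out = np.zeros((H, W, c_out))
    for i in range(H):
        for j in range(W):
            patch = xp[i:i + kh, j:j + kw, :]           # (kh, kw, C_in)
            out[i, j] = np.tensordot(patch, w, axes=3)  # contract all but C_out
    return out

def parallel_block(x, c_out_per_branch=4, seed=0):
    """Four parallel convolutions (1x1, 3x3, 5x5, 7x7) whose ReLU outputs
    are concatenated channel-wise, in the spirit of the MedNet block."""
    rng = np.random.default_rng(seed)
    branches = []
    for k in (1, 3, 5, 7):
        w = rng.standard_normal((k, k, x.shape[-1], c_out_per_branch)) * 0.1
        y = np.maximum(conv2d_same(x, w), 0.0)  # ReLU; BN omitted for brevity
        branches.append(y)
    return np.concatenate(branches, axis=-1)

# A 6x6 input with 3 channels yields 4 branches x 4 channels = 16 channels.
x = np.random.default_rng(1).standard_normal((6, 6, 3))
print(parallel_block(x).shape)  # (6, 6, 16)
```

Stacking eight such blocks, with concatenated outputs feeding the next block and skip connections between blocks, mirrors the structure the article describes.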