Computer Vision News - June 2018
Matrix factorization is used to aggregate activations along channels and spatial positions. This reduces the overwhelming number of neurons to a small set of groups, distilling how the network arrives at its prediction. The figure below demonstrates factorization at two consecutive network layers: the first layer is factorized into 8 neuron groups, the second into 6. For the label ‘labrador retriever’, it shows how much each neuron group at each layer contributes to the image’s classification under that label. Figures in this last part are from this link, where you can also find further reading on interpretability.
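As a rough illustration of the idea, the sketch below factorizes a single layer’s activations into neuron groups using non-negative matrix factorization. It is a minimal sketch, not the article’s code: the layer shape, the choice of 8 groups, and the use of scikit-learn’s NMF are all assumptions made for demonstration.

```python
# Minimal sketch: grouping a layer's activations with non-negative matrix
# factorization (NMF). The activation shape and group count are assumptions.
import numpy as np
from sklearn.decomposition import NMF

# Suppose `acts` holds one layer's activations for a single image,
# shaped (height, width, channels), e.g. extracted from a Keras model.
# ReLU activations are non-negative, which is what NMF requires.
acts = np.random.rand(14, 14, 512).astype(np.float32)  # placeholder activations

h, w, c = acts.shape
flat = acts.reshape(h * w, c)  # rows = spatial positions, columns = channels

# Factorize into 8 neuron groups: flat ≈ spatial_factors @ channel_factors
nmf = NMF(n_components=8, init="nndsvd", max_iter=500, random_state=0)
spatial_factors = nmf.fit_transform(flat)   # (h*w, 8): where each group is active
channel_factors = nmf.components_           # (8, c): which channels form each group

# Reshape the spatial factors back into one activation map per group,
# which can then be visualized or attributed to an output label.
group_maps = spatial_factors.reshape(h, w, 8)
print(group_maps.shape)  # (14, 14, 8)
```

Each of the 8 group maps summarizes many channels at once, which is what makes the per-group attributions in the figure far easier to read than per-neuron ones.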