Computer Vision News - February 2018
To verify that the accuracy gain of the CNN with L2-SVM over the CNN with softmax was indeed due to the loss function and not to network optimization, the author checked what objective score each of the two networks produced for the other loss function as well as for the one it was optimizing. The table below shows that each network scored poorly on the other loss function; the loss functions themselves should therefore be credited (or blamed) for the difference in accuracy.

                       ConvNet+Softmax    ConvNet+SVM
  Test error           14.0%              11.9%
  Avg. cross entropy   0.072              0.353
  Hinge loss squared   213.2              0.313

Moreover, when the author took the weights of the CNN+L2-SVM that achieved the 11.9% error rate and used them to initialize a CNN+softmax, further training deteriorated the error rate towards 14%.

Conclusion: the author showed that a CNN with an L2-SVM loss function outperformed softmax on two popular benchmark classification datasets. This suggests that developers should evaluate the performance of both softmax and SVM before selecting which one to use, all the more so because switching between them is not complicated.

Further evidence that SVM can perform better than softmax may be found in other articles, including “DeCAF: A Deep Convolutional Activation Feature for Generic Visual Recognition” by Jeff Donahue et al. Donahue’s team tested the classification capabilities of AlexNet layers 5, 6 and 7 (each on its own) on a variety of databases different from those on which AlexNet was originally trained, evaluating their performance with either softmax or SVM as the loss function, and showed that at least in some cases SVM was preferable.

[Figure: the SVM-trained conv net appears to have more textured filters.]
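Since switching between the two objectives is indeed straightforward, a minimal NumPy sketch of both losses may help. This is not the author's code: the function names and the margin value are illustrative assumptions, with the L2-SVM written in the one-vs-rest squared-hinge form (targets of +1 for the true class and -1 for the rest).

```python
import numpy as np

def softmax_cross_entropy(scores, y):
    """Average softmax cross-entropy. scores: (N, K) class scores, y: (N,) labels."""
    shifted = scores - scores.max(axis=1, keepdims=True)  # numerical stability
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(y)), y].mean()

def l2_svm_loss(scores, y, margin=1.0):
    """Average one-vs-rest L2-SVM (squared hinge) loss.

    Each class is treated as a binary problem with target +1 for the true
    class and -1 otherwise; violations of the margin are penalized squared.
    """
    N, K = scores.shape
    targets = -np.ones((N, K))
    targets[np.arange(N), y] = 1.0
    hinge = np.maximum(0.0, margin - targets * scores)
    return (hinge ** 2).sum(axis=1).mean()

# Toy comparison on the same scores and labels.
scores = np.array([[2.0, 0.5, -1.0],
                   [0.2, 1.5, 0.3]])
labels = np.array([0, 1])
print("softmax CE:", softmax_cross_entropy(scores, labels))
print("L2-SVM:   ", l2_svm_loss(scores, labels))
```

In a CNN, either function would be applied to the final layer's scores, so swapping the objective leaves the rest of the network untouched, which is what makes the comparison in the article so cheap to run.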