Analysing Potential of ResNet for Transfer Learning with Stochastic Depth

This study compared the transfer learning capability of ResNet trained with stochastic depth against a regular ResNet. In particular, we examined the robustness of stochastic-depth ResNet models under a common transfer learning technique: pruning the final layers. Our hypothesis is that stochastic depth training discourages co-adaptation among sequential layers, so performance degrades less when several layers are pruned. To test this, we employed the pre-trained ResNet50 architecture in both stochastic-depth and constant-depth variants on two widely used datasets, Oxford-IIIT Pet and CIFAR-10. On both datasets, the stochastic-depth ResNet50 consistently outperformed the regular ResNet50 when the final layers were pruned, from the second layer of the fourth block back to the second layer of the third block. Moreover, the pruned stochastic-depth ResNet50 showed no significant decline in performance even when pruning extended as far as the second layer of the third block.
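As background for the hypothesis, the stochastic depth rule can be sketched in plain Python: during training, each residual block's branch is dropped with some probability (leaving only the identity shortcut), while at inference the branch is always kept but scaled by its survival probability. The linear-decay schedule and the function names below are illustrative assumptions, not code from the paper.

```python
import random

def survival_prob(l, L, p_L=0.5):
    # Illustrative linear-decay schedule: block l of L survives with
    # probability decreasing from 1.0 (first block) to p_L (last block).
    return 1.0 - (l / L) * (1.0 - p_L)

def residual_block(x, f, p_survive, training, rng=random):
    """One residual unit with stochastic depth (sketch, not the paper's code).

    x: block input; f: the block's residual function (assumed given).
    During training the residual branch is dropped with probability
    1 - p_survive; at test time it is kept but scaled by p_survive.
    """
    if training:
        if rng.random() < p_survive:
            return x + f(x)   # block survives: full residual update
        return x              # block dropped: identity shortcut only
    return x + p_survive * f(x)  # expected output at inference
```

Because a dropped block reduces to the identity, later layers cannot rely on any single earlier block always being present, which is the mechanism the study conjectures makes stochastic-depth networks more tolerant of final-layer pruning.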

Authors:
Calvin Linardy Candra, Anderies, Donghai Guan, Tjeng Wawan Cenggoro, Bens Pardamean

Big Data and Security
