
Detailed record

Self-distillation object segmentation via pyramid knowledge representation and transfer

Article written by: Zheng, Yunfei; Sun, Meng; Wang, Xiaobing; Cao, Tieyong; Zhang, Xiongwei; Xing, Lixing; Fang, Zheng

Abstract: Self-distillation methods transfer knowledge within the network itself to enhance its generalization ability. However, lacking spatially refined knowledge representations, current self-distillation methods can hardly be applied directly to object segmentation tasks. In this paper, we propose a novel self-distillation framework via pyramid knowledge representation and transfer for the object segmentation task. First, a lightweight inference network is built to perform pixel-wise prediction rapidly. Second, a novel self-distillation method is proposed: to derive refined pixel-wise knowledge representations, an auxiliary self-distillation network with multi-level pyramid representation branches is built and appended to the inference network. A synergy distillation loss, which utilizes top-down and consistency knowledge transfer paths, is presented to force more discriminative knowledge to be distilled into the inference network. Consequently, the performance of the inference network is improved. Experimental results on five object segmentation datasets demonstrate that the proposed self-distillation method helps our inference network achieve better segmentation effectiveness and efficiency than nine recent object segmentation networks. Furthermore, the proposed self-distillation method outperforms typical self-distillation methods. The source code is publicly available at https://github.com/xfflyer/SKDforSegmentation.
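The abstract does not give the exact form of the synergy distillation loss; a common building block for pixel-wise knowledge transfer of this kind is a temperature-scaled KL divergence between the per-pixel class distributions of an auxiliary (teacher) branch and the inference (student) network. Below is a minimal NumPy sketch of that building block; the function names, tensor layout (H x W x C logits), and temperature value are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(logits, T=1.0, axis=-1):
    # Numerically stable softmax with temperature scaling.
    z = logits / T
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def pixelwise_kd_loss(student_logits, teacher_logits, T=2.0):
    """Temperature-scaled KL divergence between per-pixel class
    distributions, averaged over all pixels.

    Both inputs are (H, W, C) logit maps; the teacher map would come
    from an auxiliary pyramid branch, the student map from the
    lightweight inference network (illustrative assumption)."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    # KL(p_t || p_s), summed over classes, averaged over pixels.
    kl = (p_t * (np.log(p_t + 1e-8) - np.log(p_s + 1e-8))).sum(axis=-1)
    # The T^2 factor keeps gradient magnitudes comparable across
    # temperatures, following standard distillation practice.
    return float(kl.mean()) * T * T
```

A top-down transfer path would apply this loss between successive pyramid levels (deeper branch as teacher), while a consistency path would tie each branch's prediction to the final inference output; the paper's actual weighting of these terms is not specified in the abstract.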


Language: English