
Computer Engineering & Science, 2025, Vol. 47, Issue (7): 1285-1294.

• Graphics and Images •

A category-aware semi-supervised knowledge distillation model for long-tailed classification

JI Lei, LI Xi, XU Dahong, LIU Hong, GUO Jianping

  1. (College of Information Science and Engineering, Hunan Normal University, Changsha 410081, China)
  • Received: 2024-02-27 Revised: 2024-05-02 Online: 2025-07-25 Published: 2025-08-25

Abstract: In classification-based pattern recognition tasks, training must handle a large number of category samples. In practice, these samples often follow a pronounced long-tailed distribution, which poses substantial challenges for such tasks. The challenges mainly manifest in two aspects: an imbalanced feature space, and difficulty in focusing on hard samples in the tail classes. To address these issues, a category-aware semi-supervised knowledge distillation model is proposed, comprising two core components: balanced semi-supervised knowledge distillation and balanced category-aware learning. The former employs semi-supervised knowledge distillation to obtain a more balanced feature space. The latter combines a category-aware margin loss with a loss that delays the activation of hard-sample learning, improving classifier performance and strengthening the focus on hard samples. Experiments were conducted on five benchmark datasets: CIFAR-10-LT, CIFAR-100-LT, ImageNet-LT, iNaturalist 2018, and Places-LT. Notably, on ImageNet-LT the proposed model achieves a Top-1 accuracy of 57.5%, outperforming the compared models.
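The abstract does not give the loss formulas, but the two named components follow well-known patterns in the long-tailed learning literature. The sketch below is a minimal, hypothetical illustration in PyTorch: Hinton-style soft-label knowledge distillation combined with an LDAM-style class-aware margin, where margins grow as class counts shrink. The function names, the margin formula, and the hyperparameters T, alpha, and max_margin are assumptions for illustration, not the paper's actual method; the delayed hard-sample activation loss is likewise not reproduced here.

import torch
import torch.nn.functional as F

def class_aware_margin_logits(logits, targets, class_counts, max_margin=0.5):
    # LDAM-style margins (an assumption, not the paper's formula): rarer
    # classes get larger margins (proportional to n_c^{-1/4}), rescaled so
    # the rarest class receives exactly max_margin.
    margins = 1.0 / class_counts.float().pow(0.25)
    margins = margins * (max_margin / margins.max())
    rows = torch.arange(logits.size(0), device=logits.device)
    adjusted = logits.clone()
    # Subtracting the margin from the target logit forces a larger
    # decision margin for that class before the loss is satisfied.
    adjusted[rows, targets] -= margins.to(logits.device)[targets]
    return adjusted

def distill_and_margin_loss(student_logits, teacher_logits, targets,
                            class_counts, T=2.0, alpha=0.5):
    # Soft-label distillation: KL divergence between temperature-softened
    # teacher and student distributions, scaled by T^2 (Hinton et al.).
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="batchmean") * (T * T)
    # Class-aware margin cross-entropy on the hard labels.
    ce = F.cross_entropy(
        class_aware_margin_logits(student_logits, targets, class_counts),
        targets)
    # alpha is a hypothetical mixing weight between the two terms.
    return alpha * kd + (1 - alpha) * ce

In a semi-supervised setting, the distillation term can additionally be applied to unlabeled images (teacher pseudo-targets only), which is one common way such a framework balances feature learning across head and tail classes.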

Key words: semi-supervised, category-aware, long-tailed classification