Computer Engineering & Science ›› 2025, Vol. 47 ›› Issue (7): 1285-1294.
• Graphics and Images •
JI Lei,LI Xi,XU Dahong,LIU Hong,GUO Jianping
Abstract: Classification-based pattern recognition tasks must be trained on large numbers of category samples. In practice, these samples follow a pronounced long-tailed distribution, which poses substantial challenges for such tasks. These challenges manifest in two main aspects: an imbalanced feature space, and difficulty focusing on hard samples in the tail classes. To address these issues, a category-aware semi-supervised knowledge distillation model is proposed, comprising two core components: balanced semi-supervised knowledge distillation and balanced category-aware learning. The former employs semi-supervised knowledge distillation to learn a more balanced feature space. The latter combines a category-aware margin loss with a delayed-activation hard-sample loss, improving classifier performance and sharpening the focus on hard samples. All experiments were conducted on five benchmark datasets: CIFAR-10-LT, CIFAR-100-LT, ImageNet-LT, iNaturalist 2018, and Places-LT. Notably, on ImageNet-LT the proposed model achieves a Top-1 accuracy of 57.5%, outperforming the compared models.
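The abstract does not give the exact form of the category-aware margin loss, but a common construction for long-tailed classification assigns each class a margin that grows as its sample count shrinks (e.g. m_j ∝ n_j^(-1/4), as in LDAM-style losses). The sketch below is an illustrative NumPy implementation under that assumption, not the paper's actual formulation; the function name, the constant `C`, and the exponent are all hypothetical.

```python
import numpy as np

def category_aware_margin_loss(logits, labels, class_counts, C=0.5):
    """Cross-entropy with a per-class margin m_j = C * n_j^(-1/4):
    rarer (tail) classes receive larger margins, pushing the decision
    boundary further from them. Illustrative sketch only -- the paper's
    exact category-aware margin loss may differ."""
    counts = np.asarray(class_counts, dtype=float)
    margins = C * counts ** -0.25                 # larger margin for tail classes
    z = np.array(logits, dtype=float)
    rows = np.arange(len(labels))
    z[rows, labels] -= margins[labels]            # subtract margin from target logit
    z -= z.max(axis=1, keepdims=True)             # numerical stability
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[rows, labels].mean()        # mean negative log-likelihood
```

Because the margin is subtracted from the target logit before the softmax, a tail-class example must be classified with a larger score gap to achieve the same loss as a head-class example, which counteracts the head classes' dominance during training.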
Key words: semi-supervised, category-aware, long-tailed classification
JI Lei, LI Xi, XU Dahong, LIU Hong, GUO Jianping. A category-aware semi-supervised knowledge distillation model for long-tailed classification[J]. Computer Engineering & Science, 2025, 47(7): 1285-1294.
URL: http://joces.nudt.edu.cn/EN/Y2025/V47/I7/1285