
2015, Vol. 37, Issue (04): 649-656.



Unsupervised feature selection based on low rank and sparse score 

YANG Guoliang,XIE Naijun,WANG Yanfang,LIANG Liming   

  1. (School of Electrical Engineering and Automation, Jiangxi University of Science and Technology, Ganzhou 341000, China)
  • Received: 2014-04-18  Revised: 2014-06-03  Online: 2015-04-25  Published: 2015-04-05
  • Supported by: the National Natural Science Foundation of China (51365017, 61305019) and the Youth Science Foundation of the Jiangxi Provincial Department of Science and Technology (20132bab211032)


Abstract:

Feature selection is an important dimensionality reduction step when dealing with high-dimensional data. The low-rank representation model can capture the global structure of the data and has a certain discriminative ability, while the sparse representation model can reveal the intrinsic structure of the data with fewer connections. By adding a sparse constraint term to the low-rank representation model, we construct a low-rank and sparse representation model that learns a low-rank and sparse affinity matrix over the data. With this affinity matrix, we then propose an unsupervised feature selection method based on a Low-Rank and Sparse Score (LRSS). Clustering and classification experiments with the selected features on different databases compare our method with traditional feature selection algorithms. The experimental results verify the effectiveness of the proposed method and show that it outperforms state-of-the-art feature selection approaches.
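
To make the two stages described above concrete, the following is a minimal sketch, in LaTeX, of what such a formulation could look like. It is an assumption rather than the authors' exact model: it uses the standard low-rank representation (LRR) objective with an added l1 sparsity term, hypothetical trade-off weights beta and lambda, and an illustrative Laplacian-score-style ranking criterion, since the abstract does not specify the precise objective or scoring formula.

    % Assumed representation-learning stage: LRR augmented with an l1 (sparsity) term,
    % where X is the data matrix and beta, lambda are hypothetical trade-off weights.
    \min_{Z,E}\ \|Z\|_{*} + \beta\|Z\|_{1} + \lambda\|E\|_{2,1}
    \quad \text{s.t.}\quad X = XZ + E
    % Affinity matrix built from the learned coefficient matrix Z^{*}:
    W = \tfrac{1}{2}\bigl(|Z^{*}| + |Z^{*}|^{\top}\bigr)
    % Illustrative graph-based score for the r-th feature f_r (lower is better),
    % with degree matrix D = \mathrm{diag}(W\mathbf{1}) and Laplacian L = D - W:
    \mathrm{score}(r) = \frac{f_{r}^{\top} L\, f_{r}}{f_{r}^{\top} D\, f_{r}}

Under this reading, the features with the smallest scores would be retained, analogous to Laplacian-score feature selection.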

Key words: low-rank representation; sparse constraint; low-rank and sparse score; feature selection