J4 ›› 2015, Vol. 37 ›› Issue (04): 649-656.
Unsupervised feature selection based on low rank and sparse score
YANG Guoliang, XIE Naijun, WANG Yanfang, LIANG Liming
Abstract:
Feature selection is an important dimensionality reduction step when dealing with high-dimensional data. The low-rank representation model has strong discriminative ability and can capture the global structure of the data, while the sparse representation model can reveal the intrinsic structure with fewer connections. We add sparse constraints to the low-rank representation model to construct a low-rank and sparse representation model, which is used to learn a low-rank and sparse affinity matrix among the data. With the learned affinity matrix, we then propose an unsupervised feature selection method based on the Low-Rank and Sparse Score (LRSS). We compare our method with traditional feature selection algorithms by clustering and classifying the selected features on several databases. Experimental results verify the effectiveness of our method and show that it outperforms state-of-the-art feature selection approaches.
Key words: low-rank representation; sparse constraints; low-rank and sparse score; feature selection
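The abstract describes a two-stage pipeline: learn a low-rank and sparse self-representation of the data, turn it into an affinity matrix, and score each feature against that affinity. The sketch below only illustrates this idea and is not the paper's exact algorithm: it assumes the representation is obtained with a simple ADMM solver for a nuclear-norm plus l1-regularized self-representation objective, and that features are ranked with a Laplacian-score-style criterion on the resulting affinity matrix. The function names (lrs_representation, lrss_scores) and all parameter values are hypothetical.

import numpy as np

def lrs_representation(X, alpha=1.0, beta=0.5, rho=1.0, n_iter=200):
    """Learn a low-rank and sparse self-representation Z with X ~ X Z.

    Minimizes 0.5*||X - X Z||_F^2 + alpha*||J||_* + beta*||S||_1
    subject to Z = J, Z = S, via a basic ADMM (a sketch, not the paper's solver).
    X is (d, n): d features, n samples; Z is (n, n).
    """
    d, n = X.shape
    XtX = X.T @ X
    Z = np.zeros((n, n))
    J = np.zeros((n, n)); S = np.zeros((n, n))
    U1 = np.zeros((n, n)); U2 = np.zeros((n, n))
    # Precompute the matrix inverse used in every Z-update.
    A_inv = np.linalg.inv(XtX + 2.0 * rho * np.eye(n))
    for _ in range(n_iter):
        # Z-update: least-squares step coupling the data term and both splits.
        Z = A_inv @ (XtX + rho * (J - U1) + rho * (S - U2))
        # J-update: singular value thresholding (proximal operator of the nuclear norm).
        Uj, s, Vt = np.linalg.svd(Z + U1, full_matrices=False)
        J = Uj @ np.diag(np.maximum(s - alpha / rho, 0.0)) @ Vt
        # S-update: soft thresholding (proximal operator of the l1 norm).
        T = Z + U2
        S = np.sign(T) * np.maximum(np.abs(T) - beta / rho, 0.0)
        # Dual variable updates (scaled form).
        U1 += Z - J
        U2 += Z - S
    return Z

def lrss_scores(X, Z):
    """Rank features with a Laplacian-score-style criterion on the learned affinity.

    Smaller scores indicate features that vary little across samples that the
    low-rank and sparse affinity matrix links strongly.
    """
    W = 0.5 * (np.abs(Z) + np.abs(Z).T)        # symmetric affinity matrix
    D = np.diag(W.sum(axis=1))                 # degree matrix
    L = D - W                                  # graph Laplacian
    ones = np.ones(X.shape[1])
    total_degree = ones @ D @ ones + 1e-12
    scores = np.empty(X.shape[0])
    for r in range(X.shape[0]):
        f = X[r, :]
        # Center the feature with respect to the degree-weighted mean.
        f_tilde = f - (f @ D @ ones) / total_degree * ones
        scores[r] = (f_tilde @ L @ f_tilde) / (f_tilde @ D @ f_tilde + 1e-12)
    return scores

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 120))         # toy data: 50 features, 120 samples
    Z = lrs_representation(X)
    scores = lrss_scores(X, Z)
    selected = np.argsort(scores)[:10]         # keep the 10 lowest-scoring features
    print("selected feature indices:", selected)

In this sketch the lowest-scoring features are kept, following the usual convention for Laplacian-style scores; the paper's actual scoring rule and solver may differ.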
YANG Guoliang, XIE Naijun, WANG Yanfang, LIANG Liming. Unsupervised feature selection based on low rank and sparse score [J]. J4, 2015, 37(04): 649-656.
URL: http://joces.nudt.edu.cn/EN/Y2015/V37/I04/649