Computer Engineering & Science
LIU Bo1, LIU Xiaoguang1, WANG Gang1, WU Di2
Abstract:
An enormous amount of training data is generated on Headlines Today's servers, and these data are formatted for machine learning. We observe that no common data compression method fully satisfies the business requirement of a better compression ratio. We present two compression methods for the training data from Headlines Today's servers: hierarchical cluster compression (HCC) and hash recoding compression (HRC). HCC combined with Gzip compression quadruples the compression speed compared with pure Gzip compression, indicating that the first proposed method effectively improves compression speed while still guaranteeing the compression ratio. HRC combined with Snappy compression halves the compression ratio in comparison with pure Snappy compression, showing that HRC can reduce the compression ratio while lowering the compression speed as little as possible. In summary, either method is a meaningful choice for decreasing operating costs, improving the efficiency of business processes, and providing a better user experience.
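The abstract does not detail the algorithms themselves. As a rough illustration of the general idea of pairing a recoding/dictionary step with a general-purpose compressor, the following minimal Python sketch (not the authors' implementation; the sample data and function names are hypothetical, and gzip stands in for either back-end compressor) recodes repeated categorical feature values into short integer IDs before compression:

```python
# Illustrative sketch only: dictionary-based recoding of repeated string
# feature values, followed by a general-purpose compressor (gzip here;
# Snappy could be substituted via a library such as python-snappy).
import gzip
import json

def hash_recode(records):
    """Map repeated string feature values to small integer codes."""
    table = {}          # value -> integer code (the recoding dictionary)
    recoded = []
    for rec in records:
        recoded.append([table.setdefault(v, len(table)) for v in rec])
    return recoded, table

# Hypothetical training samples: rows of categorical feature values.
records = [
    ["click", "news",  "android", "cn"],
    ["skip",  "video", "android", "cn"],
    ["click", "news",  "ios",     "us"],
]

recoded, table = hash_recode(records)

raw = json.dumps(records).encode("utf-8")
compact = (json.dumps(recoded) + json.dumps(list(table))).encode("utf-8")

print(len(gzip.compress(raw)), "bytes: gzip alone")
print(len(gzip.compress(compact)), "bytes: recoding + gzip")
```

On realistic training logs with many repeated feature values, the recoded stream is both smaller and faster to compress, which is the effect the paper's methods aim to exploit.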
Key words: hierarchical cluster compression, hash recoding compression, dictionary compression, training data, Gzip, Snappy
LIU Bo1, LIU Xiaoguang1, WANG Gang1, WU Di2. Two data compression methods for recommender systems[J]. Computer Engineering & Science.
URL: http://joces.nudt.edu.cn/EN/
http://joces.nudt.edu.cn/EN/Y2016/V38/I11/2183