
J4, 2008, Vol. 30, Issue (11): 129-133.

• Papers •

Research on a Grouping-Based Language Model Compression Method Combining Count Cutoff and Rule Pruning

吴晓春 吴娴 李培峰 朱巧明   

  • Online:2008-11-01 Published:2010-05-19

Abstract:

Because of the scale of their training corpora, statistical language models are often too large for the storage capacity of handheld devices. With the rapid development of resource-constrained devices, research on language model compression has become increasingly important. This paper proposes a language model compression method that combines count cutoff with rule pruning, and uses grouping to compress the model without reducing the number of cells. The perplexity of the language model obtained with the new algorithm is compared against the models obtained with count cutoff alone and with rule pruning alone. Experimental results show that, at the same model size, the new method yields a language model with better performance, i.e. lower perplexity.
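For reference, the perplexity used in the comparison is the standard test-set measure: for a test sequence of N words,

\[
\mathrm{PPL} = \exp\!\Big(-\frac{1}{N}\sum_{i=1}^{N}\ln p(w_i \mid w_1,\dots,w_{i-1})\Big),
\]

so at the same model size a compressed model is better when its perplexity is lower.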

Key words: language model compression, count cutoff, rule pruning, grouping, perplexity
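The abstract does not spell out the exact cutoff threshold, pruning rule, or grouping scheme, so the following Python sketch is only one plausible reading, not the authors' algorithm: count cutoff drops rare bigrams, a simple stand-in pruning rule drops bigrams whose probability is close to the unigram backoff (so removing them barely changes the model), and grouping quantizes the surviving probabilities to a small set of shared values so that every remaining cell stores only a short group index. All names and thresholds here are hypothetical.

    import math

    def compress_bigram_model(bigram_counts, unigram_counts,
                              cutoff=1, prune_eps=1e-4, n_groups=256):
        """Hypothetical sketch: count cutoff -> rule pruning -> grouping."""
        total = sum(unigram_counts.values())

        # Steps 1-2: count cutoff, then a simple stand-in pruning rule.
        kept = {}
        for (w1, w2), count in bigram_counts.items():
            if count <= cutoff:                      # count cutoff
                continue
            p = count / unigram_counts[w1]           # ML bigram probability
            p_backoff = unigram_counts[w2] / total   # unigram fallback
            if abs(p - p_backoff) < prune_eps:       # assumed pruning rule
                continue
            kept[(w1, w2)] = p
        if not kept:
            return [], {}

        # Step 3: grouping -- quantize log-probs to n_groups shared values,
        # keeping every surviving cell but shrinking it to a group index.
        logs = [math.log(p) for p in kept.values()]
        lo, hi = min(logs), max(logs)
        step = (hi - lo) / (n_groups - 1) or 1.0     # avoid a zero step
        centroids = [lo + i * step for i in range(n_groups)]
        indices = {bg: round((math.log(p) - lo) / step)
                   for bg, p in kept.items()}
        return centroids, indices                    # shared values + per-cell ids

With 256 groups, each surviving bigram costs one byte of index storage instead of a full floating-point probability, and the grouping step itself leaves the number of cells unchanged, matching the abstract's description of compression without cell reduction.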