• Journal of the China Computer Federation
  • Chinese Science and Technology Core Journal
  • Chinese Core Journal

Computer Engineering & Science ›› 2022, Vol. 44 ›› Issue (05): 894-900.

• Artificial Intelligence and Data Mining •

Sparse autoencoder based on earth mover distance

FAN Yun   

  1. (School of Politics, National Defense University, Shanghai 200433, China)
  • Received: 2020-07-05  Revised: 2020-12-04  Accepted: 2022-05-25  Online: 2022-05-25  Published: 2022-05-24

Abstract: KL divergence is widely adopted in machine learning to measure the distance between distributions in a model's loss function. In the sparse autoencoder, KL divergence serves as the penalty term of the loss function, measuring the distance between the average neuron activation and the sparsity parameter; driving the activation toward the sparsity parameter suppresses neuron activation and yields a sparse coding. In WGAN, the Wasserstein distance is used to alleviate the gradient vanishing and mode collapse problems of GAN, making its training more stable. Motivated by this, the earth mover distance (EMD, i.e., the Wasserstein distance) is adopted here as the sparsity penalty term of the sparse autoencoder. The experimental results show that, compared with sparse autoencoders using KL divergence or JS divergence, the sparse autoencoder using EMD as the penalty term achieves a lower reconstruction error between real and reconstructed samples, and the coding becomes sparser as the penalty parameter increases.
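The two penalty terms the abstract contrasts can be sketched as follows. This is a minimal illustration, not the paper's implementation: it treats each hidden neuron's average activation rho_hat as the mean of a Bernoulli distribution and compares it with the sparsity parameter rho, using the Bernoulli KL divergence and, for the EMD case, the closed-form 1-Wasserstein distance between two Bernoulli distributions on the support {0, 1}, which reduces to |rho - rho_hat|. The function names are illustrative.

```python
import numpy as np

def kl_penalty(rho, rho_hat, eps=1e-12):
    """KL divergence between Bernoulli(rho) and Bernoulli(rho_hat).

    This is the classic sparsity penalty of the sparse autoencoder.
    rho_hat is clipped away from 0 and 1 to avoid log(0).
    """
    rho_hat = np.clip(rho_hat, eps, 1.0 - eps)
    return (rho * np.log(rho / rho_hat)
            + (1.0 - rho) * np.log((1.0 - rho) / (1.0 - rho_hat)))

def emd_penalty(rho, rho_hat):
    """Earth mover (1-Wasserstein) distance between Bernoulli(rho)
    and Bernoulli(rho_hat); on the support {0, 1} it is |rho - rho_hat|.
    """
    return np.abs(rho - rho_hat)

# Sparsity parameter and some example average activations per neuron.
rho = 0.05
rho_hat = np.array([0.05, 0.2, 0.9])

print(kl_penalty(rho, rho_hat))   # zero where rho_hat equals rho
print(emd_penalty(rho, rho_hat))  # grows linearly with |rho - rho_hat|
```

Either penalty, summed over the hidden neurons and weighted by a sparsity coefficient, is added to the reconstruction error to form the full loss; unlike KL divergence, the EMD term stays bounded and well-behaved even when an activation saturates at 0 or 1.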

Key words: sparse autoencoder, regularization, earth mover distance, KL divergence