Open Access

ARTICLE

Parameters Compressing in Deep Learning

Shiming He1, Zhuozhou Li1, Yangning Tang1, Zhuofan Liao1, Feng Li1, *, Se-Jung Lim2

1 School of Computer and Communication Engineering, Hunan Provincial Key Laboratory of Intelligent Processing of Big Data on Transportation, Changsha University of Science and Technology, Changsha, 410114, China.
2 Liberal Arts & Convergence Studies, Honam University, Gwangju, 62399, Korea.

* Corresponding Author: Se-Jung Lim.

Computers, Materials & Continua 2020, 62(1), 321-336. https://doi.org/10.32604/cmc.2020.06130

Abstract

With the popularity of deep learning tools in image decomposition and natural language processing, storing and serving the large number of parameters required by deep learning algorithms has become an urgent problem: a single model can contain millions of parameters. A feasible direction is to compress the parameter matrix with sparse representation techniques, such as matrix decomposition and tensor decomposition, to reduce both the parameter count and the storage pressure. To let vectors benefit from the compression performance of matrix decomposition and tensor decomposition, we use reshaping and unfolding so that vectors can serve as the input and output of Tensor-Factorized Neural Networks. We analyze how reshaping achieves the best compression ratio: from the relationship between the shape of the tensor and the number of parameters, we derive a lower bound on the number of parameters, and we verify this lower bound on several data sets.
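The reasoning behind such a shape-dependent lower bound can be sketched as follows, under the assumption of a rank-R CP-style factorization of the reshaped tensor (the paper's exact decomposition may differ). Suppose a dense layer's weight matrix W of size M x N is reshaped into a d-way tensor with mode sizes n_1, ..., n_d, where the product of the n_i equals MN:

```latex
% Sketch, assuming a rank-R CP factorization of the reshaped tensor.
\[
  \underbrace{R \sum_{i=1}^{d} n_i}_{\text{CP storage cost}}
  \;\ge\; R\, d \Bigl( \prod_{i=1}^{d} n_i \Bigr)^{1/d}
  \;=\; R\, d\, (MN)^{1/d},
\]
% by the AM--GM inequality, with equality iff every n_i = (MN)^{1/d}.
```

Under this assumption, a balanced reshaping (all mode sizes equal) minimizes the parameter count, which is consistent with the abstract's claim that the tensor shape determines the lower bound.

A minimal runnable sketch of the same idea is below. It compares storing a dense weight matrix directly against storing rank-R CP factors of a reshaped tensor; the layer sizes M and N, the tensor order d, and the rank R are illustrative assumptions, not settings from the paper.

```python
import numpy as np

# Hedged sketch: compare storing a dense weight matrix directly with
# storing rank-R CP factors of the same weights reshaped into a d-way
# tensor. M, N, d, and R are illustrative assumptions, not values
# taken from the paper.

M, N = 1024, 1024            # dense layer: M * N = 1,048,576 weights
d = 4                        # reshape the M * N entries into a 4-way tensor
R = 8                        # assumed CP rank of the factorization

# Balanced mode sizes n_i = (M * N) ** (1 / d) minimize sum(n_i) for a
# fixed product (AM-GM), and hence minimize the CP cost R * sum(n_i).
n = round((M * N) ** (1.0 / d))   # 32 here, since 32 ** 4 = 2 ** 20
modes = [n] * d
assert np.prod(modes) == M * N    # the reshape must preserve all entries

dense_params = M * N
cp_params = R * sum(modes)        # one R x n_i factor matrix per mode
print(f"dense storage: {dense_params} parameters")
print(f"CP storage   : {cp_params} parameters")
print(f"compression  : {dense_params / cp_params:.0f}x")

# Rebuilding the layer: sum R outer products of the factor rows, then
# unfold the resulting tensor back into an M x N weight matrix.
factors = [np.random.randn(R, m) for m in modes]
tensor = np.zeros(modes)
for r in range(R):
    outer = factors[0][r]
    for f in factors[1:]:
        outer = np.multiply.outer(outer, f[r])
    tensor += outer
W = tensor.reshape(M, N)          # usable as the dense layer's weights
print("reconstructed weight matrix:", W.shape)
```

With these illustrative values, the factors hold 8 * (32 + 32 + 32 + 32) = 1,024 numbers versus 1,048,576 for the dense matrix, a 1024x reduction; the trade-off is that the rank R caps the expressiveness of the reconstructed layer.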

Cite This Article

S. He, Z. Li, Y. Tang, Z. Liao, F. Li et al., "Parameters compressing in deep learning," Computers, Materials & Continua, vol. 62, no. 1, pp. 321–336, 2020. https://doi.org/10.32604/cmc.2020.06130

This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
