Statistics and Its Interface
Volume 16 (2023)
Number 3
Compressing recurrent neural network models through principal component analysis
Pages: 397 – 407
DOI: https://dx.doi.org/10.4310/22-SII727
Abstract
Deep learning-based neural network models, such as recurrent neural networks (RNNs) and the long short-term memory (LSTM) architecture, are currently considered state-of-the-art solutions to most tasks in natural language processing (NLP). However, such models require a large number of parameters and considerable memory, which makes them difficult to deploy in embedded systems such as mobile devices and tablets. In this study, we propose a technique for compressing RNN-based models through principal component analysis (PCA). The proposed compression approach begins at the embedding layer and proceeds to the final output layer. For each target layer, we apply PCA to reduce the dimensions of the two-dimensional (2D) estimated weight matrix, yielding a reduced model structure with fewer parameters than the benchmark model. The proposed approach also achieves improved prediction accuracy compared with the benchmark model. Moreover, we propose a novel parameter-initialization method based on the score matrix of the principal components. We evaluate the effectiveness of the proposed method through experiments on various NLP tasks, such as text classification and language translation, across multiple datasets. The experimental results are highly encouraging for the compression of RNN models through principal component analysis.
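As a rough illustration of the idea described above, the sketch below shows how a single 2D weight matrix could be compressed through PCA into a score matrix and a loading matrix, with the score matrix available to initialize the compressed layer's parameters. This is a minimal sketch assuming NumPy; the function name `pca_compress`, the target rank `k`, and the matrix sizes are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of PCA-based weight-matrix compression (assumed setup, not
# the paper's exact procedure). Requires only NumPy.
import numpy as np

def pca_compress(W: np.ndarray, k: int):
    """Factor a 2D weight matrix W (m x n) into a score matrix T (m x k)
    and a loading matrix V (n x k) via PCA, so that W ~= T @ V.T + mu.

    Parameter count drops from m*n to roughly k*(m + n)."""
    mu = W.mean(axis=0)                     # center the columns
    Wc = W - mu
    # SVD of the centered matrix; rows of Vt are principal directions
    U, S, Vt = np.linalg.svd(Wc, full_matrices=False)
    V = Vt[:k].T                            # top-k loadings, shape (n, k)
    T = Wc @ V                              # score matrix, shape (m, k)
    return T, V, mu

# Hypothetical example: compress a 512 x 1024 weight matrix to rank 64.
rng = np.random.default_rng(0)
W = rng.standard_normal((512, 1024))
T, V, mu = pca_compress(W, k=64)
W_hat = T @ V.T + mu                        # low-rank reconstruction of W
print(W.size, T.size + V.size)              # 524288 vs. 98304 parameters
print(np.linalg.norm(W - W_hat) / np.linalg.norm(W))  # relative error
```

In this low-rank view, the score matrix `T` plays the role the abstract attributes to its parameter-initialization method: it provides starting values for the smaller layer in place of random initialization.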
Keywords
model compression, principal component analysis, RNN compression
2010 Mathematics Subject Classification
Primary 68T50. Secondary 68U15, 68W25.
This research was supported by the Fundamental Research Funds for the Central Universities, and by the Research Funds of Renmin University of China, No. 21XNA027.
Received 24 November 2021
Accepted 3 February 2022
Published 14 April 2023