Residential College | false |
Status | Published |
MA-LSTM: A Multi-Attention Based LSTM for Complex Pattern Extraction
Jingjie Guo 1,2; Kelang Tian 1; Kejiang Ye 1; Cheng-Zhong Xu 3
2021-05
Conference Name | 25th International Conference on Pattern Recognition, ICPR 2020 |
Source Publication | Proceedings - International Conference on Pattern Recognition |
Pages | 3605 - 3611 |
Conference Date | 10-15 Jan. 2021 |
Conference Place | Milan |
Country | Italy |
Publication Place | IEEE COMPUTER SOC, 10662 LOS VAQUEROS CIRCLE, PO BOX 3014, LOS ALAMITOS, CA 90720-1264 USA |
Publisher | IEEE |
Abstract | With the growth of data volume, computing power, and algorithms, deep learning has developed rapidly and shown excellent performance, and many deep learning models have recently been proposed to solve problems in different areas. A recurrent neural network (RNN) is a class of artificial neural networks in which connections between nodes form a directed graph along a temporal sequence. This allows it to exhibit temporal dynamic behavior, which makes it applicable to tasks such as handwriting recognition or speech recognition. However, the RNN relies heavily on automatic parameter learning that concentrates on the data flow and seldom considers the feature extraction capability of the gate mechanism. In this paper, we propose a novel architecture in which the forget gate is generated by multiple bases. Instead of the traditional single-layer fully-connected network, we use a Multiple Attention (MA) based network to generate the forget gate, which refines the optimization space of the gate function and improves the granularity with which the recurrent neural network approximates the mapping in the ground truth. Owing to the benefit of the MA structure on the gate mechanism, the proposed MA-LSTM model achieves better feature extraction capability than other known models. |
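The record does not include the paper's equations, so the following is only a minimal PyTorch sketch of the idea described in the abstract: the forget gate is produced by mixing several candidate gates ("bases") with attention weights, in place of the usual single fully-connected gate layer. The class names (MultiAttentionForgetGate, MALSTMCell), the number of bases, and the exact attention form are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class MultiAttentionForgetGate(nn.Module):
    """Hypothetical multi-attention forget gate: several candidate ("basis")
    gates are produced and combined with learned attention weights, replacing
    the single fully-connected layer of a standard LSTM forget gate."""

    def __init__(self, input_size: int, hidden_size: int, num_bases: int = 4):
        super().__init__()
        # One candidate forget gate per basis.
        self.bases = nn.ModuleList(
            nn.Linear(input_size + hidden_size, hidden_size) for _ in range(num_bases)
        )
        # Attention scorer over the bases, conditioned on [x_t, h_{t-1}].
        self.attn = nn.Linear(input_size + hidden_size, num_bases)

    def forward(self, x_t: torch.Tensor, h_prev: torch.Tensor) -> torch.Tensor:
        z = torch.cat([x_t, h_prev], dim=-1)                   # (batch, input+hidden)
        candidates = torch.stack(
            [torch.sigmoid(b(z)) for b in self.bases], dim=1
        )                                                       # (batch, bases, hidden)
        weights = torch.softmax(self.attn(z), dim=-1)           # (batch, bases)
        # Attention-weighted combination of the candidate gates.
        return (weights.unsqueeze(-1) * candidates).sum(dim=1)  # (batch, hidden)


class MALSTMCell(nn.Module):
    """LSTM cell whose forget gate comes from the multi-attention module above;
    the input, output, and candidate paths keep the standard single-layer form."""

    def __init__(self, input_size: int, hidden_size: int, num_bases: int = 4):
        super().__init__()
        self.forget_gate = MultiAttentionForgetGate(input_size, hidden_size, num_bases)
        self.input_gate = nn.Linear(input_size + hidden_size, hidden_size)
        self.output_gate = nn.Linear(input_size + hidden_size, hidden_size)
        self.cell_cand = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x_t, state):
        h_prev, c_prev = state
        z = torch.cat([x_t, h_prev], dim=-1)
        f_t = self.forget_gate(x_t, h_prev)                     # multi-attention forget gate
        i_t = torch.sigmoid(self.input_gate(z))
        o_t = torch.sigmoid(self.output_gate(z))
        c_t = f_t * c_prev + i_t * torch.tanh(self.cell_cand(z))
        h_t = o_t * torch.tanh(c_t)
        return h_t, (h_t, c_t)
```

As a usage sketch, `MALSTMCell(input_size=8, hidden_size=16)` can be stepped over a sequence exactly like `nn.LSTMCell`, passing `(h, c)` from one time step to the next; the only behavioral difference is how `f_t` is formed.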
Keyword | Multi-attention; Traffic Prediction; Handwriting Recognition; Language Model
DOI | 10.1109/ICPR48806.2021.9412402 |
URL | View the original |
Indexed By | CPCI-S |
Language | English
WOS Research Area | Computer Science ; Engineering ; Imaging Science & Photographic Technology |
WOS Subject | Computer Science, Artificial Intelligence ; Engineering, Electrical & Electronic ; Imaging Science & Photographic Technology |
WOS ID | WOS:000678409203094 |
Scopus ID | 2-s2.0-85110420487 |
Document Type | Conference paper |
Collection | THE STATE KEY LABORATORY OF INTERNET OF THINGS FOR SMART CITY (UNIVERSITY OF MACAU); Faculty of Science and Technology
Corresponding Author | Kejiang Ye |
Affiliation | 1. Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences; 2. Sun Yat-sen University; 3. State Key Lab of IoTSC, Faculty of Science and Technology, University of Macau
Recommended Citation GB/T 7714 | Jingjie Guo, Kelang Tian, Kejiang Ye, et al. MA-LSTM: A Multi-Attention Based LSTM for Complex Pattern Extraction[C]. Los Alamitos, CA, USA: IEEE, 2021: 3605-3611.
APA | Guo, J., Tian, K., Ye, K., & Xu, C.-Z. (2021). MA-LSTM: A Multi-Attention Based LSTM for Complex Pattern Extraction. Proceedings - International Conference on Pattern Recognition, 3605-3611.