Status: Published
MA-LSTM: A Multi-Attention Based LSTM for Complex Pattern Extraction
Jingjie Guo1,2; Kelang Tian1; Kejiang Ye1; Cheng-Zhong Xu3
2021-05
Conference Name: 25th International Conference on Pattern Recognition, ICPR 2020
Source Publication: Proceedings - International Conference on Pattern Recognition
Pages: 3605-3611
Conference Date: 10-15 Jan. 2021
Conference Place: Milan
Country: Italy
Publication Place: Los Alamitos, CA, USA
Publisher: IEEE
Abstract

With the growth of data volume, computing power, and algorithms, deep learning has developed rapidly and shown excellent performance. Recently, many deep learning models have been proposed to solve problems in different areas. A recurrent neural network (RNN) is a class of artificial neural networks in which connections between nodes form a directed graph along a temporal sequence. This allows it to exhibit temporal dynamic behavior, making it applicable to tasks such as handwriting recognition and speech recognition. However, the RNN relies heavily on automatic parameter learning that concentrates on the data flow but seldom considers the feature extraction capability of the gate mechanism. In this paper, we propose a novel architecture in which the forget gate is generated from multiple bases. Instead of the traditional single-layer fully-connected network, we use a Multiple Attention (MA) based network to generate the forget gate, which refines the optimization space of the gate function and improves the granularity with which the recurrent neural network approximates the ground-truth mapping. Owing to the benefit of the MA structure in the gate mechanism, the proposed MA-LSTM model achieves better feature extraction capability than other known models.
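The abstract does not give the paper's exact formulation, so the following is only a minimal NumPy sketch of the stated idea: the forget gate is produced as an attention-weighted mixture of several candidate gate "bases" rather than by a single fully-connected layer. All names, shapes, and the scoring scheme here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
hidden, inp, n_bases = 8, 4, 3

# One projection per basis (hypothetical: K candidate forget-gate generators).
W = rng.standard_normal((n_bases, hidden, inp + hidden)) * 0.1
b = np.zeros((n_bases, hidden))
# Attention scoring vectors over the bases (hypothetical).
v = rng.standard_normal((n_bases, inp + hidden)) * 0.1

def ma_forget_gate(x_t, h_prev):
    """Forget gate as a convex mixture of multiple candidate gates."""
    z = np.concatenate([x_t, h_prev])   # [x_t; h_{t-1}]
    cand = sigmoid(W @ z + b)           # (n_bases, hidden) candidate gates
    alpha = softmax(v @ z)              # attention weights over the bases
    return alpha @ cand                 # mixture stays in (0, 1)

x_t = rng.standard_normal(inp)
h_prev = rng.standard_normal(hidden)
f_t = ma_forget_gate(x_t, h_prev)
print(f_t.shape)
```

Because each candidate gate passes through a sigmoid and the attention weights are a convex combination, the mixed gate remains a valid forget gate in (0, 1) and can be dropped into a standard LSTM cell in place of the usual single-layer gate.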

Keywords: Multi-attention; Traffic Prediction; Handwriting Recognition; Language Model
DOI: 10.1109/ICPR48806.2021.9412402
Indexed By: CPCI-S
Language: English
WOS Research Area: Computer Science; Engineering; Imaging Science & Photographic Technology
WOS Subject: Computer Science, Artificial Intelligence; Engineering, Electrical & Electronic; Imaging Science & Photographic Technology
WOS ID: WOS:000678409203094
Scopus ID: 2-s2.0-85110420487
Document Type: Conference paper
Collection: THE STATE KEY LABORATORY OF INTERNET OF THINGS FOR SMART CITY (UNIVERSITY OF MACAU); Faculty of Science and Technology
Corresponding Author: Kejiang Ye
Affiliations:
1. Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences
2. Sun Yat-sen University
3. State Key Lab of IoTSC, Faculty of Science and Technology, University of Macau
Recommended Citation
GB/T 7714:
Jingjie Guo, Kelang Tian, Kejiang Ye, et al. MA-LSTM: A Multi-Attention Based LSTM for Complex Pattern Extraction[C]. Los Alamitos, CA, USA: IEEE, 2021: 3605-3611.
APA:
Jingjie Guo, Kelang Tian, Kejiang Ye, & Cheng-Zhong Xu (2021). MA-LSTM: A Multi-Attention Based LSTM for Complex Pattern Extraction. Proceedings - International Conference on Pattern Recognition, 3605-3611.
Files in This Item: There are no files associated with this item.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.