Status: Published
Sentence-State LSTMs For Sequence-to-Sequence Learning
Xuefeng Bai1; Yafu Li1; Zhirui Zhang2; Mingzhou Xu3; Boxing Chen2; Weihua Luo2; Derek Wong3; Yue Zhang1,4
2021
Conference Name: CCF International Conference on Natural Language Processing and Chinese Computing
Source Publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 13028 LNAI
Pages: 104-115
Conference Date: October 13-17, 2021
Conference Place: Qingdao
Country: China
Publisher: Springer Science and Business Media Deutschland GmbH
Abstract

Transformer is currently the dominant method for sequence-to-sequence problems. In contrast, RNNs have become less popular due to their lack of parallelization capability and relatively lower performance. In this paper, we propose to use a parallelizable variant of bi-directional LSTMs (BiLSTMs), namely sentence-state LSTMs (S-LSTM), as an encoder for sequence-to-sequence tasks. The complexity of S-LSTM is only O(n), as compared to O(n^2) for Transformer. On four neural machine translation benchmarks, we empirically find that S-LSTM can achieve significantly better performance than BiLSTM and convolutional neural networks (CNNs). When compared to Transformer, our model gives competitive performance while being 1.6 times faster during inference.
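The abstract describes the S-LSTM encoder updating all word states in parallel at each recurrent step, together with a shared sentence-level state. The following is a minimal illustrative sketch of that idea, not the paper's implementation: the single-tanh update (no gates or cell memory), circular padding, and all parameter names are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 8, 5  # toy hidden size and sentence length

def s_lstm_step(h, g, params):
    """One S-LSTM step: all n word states are refreshed in parallel from
    their left/right neighbours, themselves, and the shared sentence
    state g; g then re-aggregates the new word states.
    Simplified sketch: a single tanh update stands in for the full
    gated LSTM cell of the actual model."""
    Wl, Wx, Wr, Wg, Vh, Vg = params
    left = np.roll(h, 1, axis=0)    # h_{i-1} (circular padding for brevity)
    right = np.roll(h, -1, axis=0)  # h_{i+1}
    h_new = np.tanh(left @ Wl + h @ Wx + right @ Wr + g @ Wg)
    g_new = np.tanh(h_new.mean(axis=0) @ Vh + g @ Vg)
    return h_new, g_new

params = [rng.normal(scale=0.1, size=(d, d)) for _ in range(6)]
h = rng.normal(size=(n, d))  # word states, one row per token
g = np.zeros(d)              # sentence state

# A small, fixed number of recurrent steps, independent of sentence
# length n -- this is what makes the per-step work parallelizable.
for _ in range(3):
    h, g = s_lstm_step(h, g, params)

print(h.shape, g.shape)
```

Because each step touches every position at once, the number of steps is a constant hyperparameter rather than growing with n, which is the source of the O(n) total complexity claimed above.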

Keywords: Bi-directional LSTMs; CNN; Neural Machine Translation; Sentence-state LSTMs; Transformers
DOI: 10.1007/978-3-030-88480-2_9
Language: English
Scopus ID: 2-s2.0-85118098345
Document Type: Conference paper
Collection: Department of Computer and Information Science
Affiliation:
1. School of Engineering, Westlake University, Hangzhou, China
2. Alibaba DAMO Academy, Hangzhou, China
3. NLP2CT Lab, Department of Computer and Information Science, University of Macau, Macao
4. Institute of Advanced Technology, Westlake Institute for Advanced Study, Hangzhou, China
Recommended Citation
GB/T 7714
Xuefeng Bai, Yafu Li, Zhirui Zhang, et al. Sentence-State LSTMs For Sequence-to-Sequence Learning[C]. Springer Science and Business Media Deutschland GmbH, 2021: 104-115.
APA Bai, X., Li, Y., Zhang, Z., Xu, M., Chen, B., Luo, W., Wong, D., & Zhang, Y. (2021). Sentence-State LSTMs For Sequence-to-Sequence Learning. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 13028 LNAI, 104-115.
Files in This Item:
There are no files associated with this item.
Related Services
Google Scholar
Similar articles in Google Scholar
[Xuefeng Bai]'s Articles
[Yafu Li]'s Articles
[Zhirui Zhang]'s Articles
Baidu academic
Similar articles in Baidu academic
[Xuefeng Bai]'s Articles
[Yafu Li]'s Articles
[Zhirui Zhang]'s Articles
Bing Scholar
Similar articles in Bing Scholar
[Xuefeng Bai]'s Articles
[Yafu Li]'s Articles
[Zhirui Zhang]'s Articles

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.