Residential College: false
Status: Published
Convolutional self-attention networks
Yang, Baosong (1); Wang, Longyue (2); Wong, Derek F. (1); Chao, Lidia S. (1); Tu, Zhaopeng (2)
2019
Conference Name: 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL HLT 2019
Source Publication: NAACL HLT 2019 - 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies - Proceedings of the Conference
Volume: 1
Pages: 4040-4045
Conference Date: 2 June 2019 through 7 June 2019
Conference Place: Minneapolis
Abstract

Self-attention networks (SANs) have drawn increasing interest due to their high parallelization in computation and flexibility in modeling dependencies. SANs can be further enhanced with multi-head attention, which allows the model to attend to information from different representation subspaces. In this work, we propose novel convolutional self-attention networks, which offer SANs the ability to 1) strengthen dependencies among neighboring elements, and 2) model the interaction between features extracted by multiple attention heads. Experimental results on machine translation across different language pairs and model settings show that our approach outperforms both the strong Transformer baseline and other existing models that enhance the locality of SANs. Compared with prior studies, the proposed model is parameter-free, introducing no additional parameters.
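For illustration, the sketch below shows the locality idea behind the first contribution: restricting each query's attention to a fixed window of neighboring positions, analogous to a 1D convolutional receptive field. The names (local_self_attention, window_size) and the single-head NumPy formulation are assumptions made for clarity, not the paper's implementation; the second contribution (interaction across attention heads) is omitted here.

# Minimal sketch of windowed ("convolutional") self-attention, single head.
# Illustrative only; not the authors' code.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def local_self_attention(x, w_q, w_k, w_v, window_size=3):
    """Self-attention where each position attends only to positions within
    +/- window_size, giving a convolution-like local receptive field."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v        # each (seq_len, d_k)
    scores = q @ k.T / np.sqrt(k.shape[-1])    # (seq_len, seq_len)

    # Mask out key positions outside the local window around each query.
    seq_len = x.shape[0]
    idx = np.arange(seq_len)
    outside = np.abs(idx[:, None] - idx[None, :]) > window_size
    scores = np.where(outside, -1e9, scores)

    return softmax(scores, axis=-1) @ v        # (seq_len, d_k)

# Toy usage: 6 tokens, model dim 8, head dim 4, window of +/- 2 positions.
rng = np.random.default_rng(0)
x = rng.normal(size=(6, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 4)) for _ in range(3))
out = local_self_attention(x, w_q, w_k, w_v, window_size=2)
print(out.shape)  # (6, 4)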

URL: View the original
Language: English
WOS ID: WOS:000900116904019
Scopus ID: 2-s2.0-85085557354
Document Type: Conference paper
Collection: Department of Computer and Information Science
Affiliation:
1. NLP2CT Lab, Department of Computer and Information Science, University of Macau, Macao
2. Tencent AI Lab
First Author Affiliation: University of Macau
Recommended Citation
GB/T 7714: Yang, Baosong, Wang, Longyue, Wong, Derek F., et al. Convolutional self-attention networks[C]. 2019: 4040-4045.
APA: Yang, Baosong, Wang, Longyue, Wong, Derek F., Chao, Lidia S., & Tu, Zhaopeng (2019). Convolutional self-attention networks. NAACL HLT 2019 - 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies - Proceedings of the Conference, 1, 4040-4045.