Status: Published
Context-aware Self-Attention Networks for Natural Language Processing
Yang, Baosong1; Wang, Longyue2; Wong, Derek F.1; Shi, Shuming2; Tu, Zhaopeng2
2021-10-11
Source Publication: Neurocomputing
ISSN: 0925-2312
Volume: 458, Pages: 157-169
Abstract

Recently, Self-Attention Networks (SANs) have shown their flexibility in parallel computation and their effectiveness in modeling both short- and long-term dependencies. However, SANs face two problems: 1) the weighted averaging inhibits relations among neighboring words (i.e., local context); and 2) dependencies between representations are calculated without considering contextual information (i.e., global context). Both local and global contexts have proven useful for modeling dependencies among neural representations in a variety of natural language processing tasks. Accordingly, we augment SANs with the ability to capture useful local and global context while maintaining their simplicity and flexibility. First, we cast local context modeling as a learnable Gaussian bias, which indicates the center and scope of the local region that should receive more attention. The bias is then incorporated into the original attention distribution to form a revised version. Second, we leverage internal representations that embed sentence-level information as the global context. Specifically, we propose to contextualize the transformations of the query and key layers, which are used to calculate the relevance between elements. Since the two approaches are potentially complementary, we propose to combine them to further improve performance. Empirical results on machine translation and linguistic probing tasks demonstrate the effectiveness and universality of the proposed approaches. Further analyses confirm that our approaches successfully capture contextual information as expected.
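
To make the two mechanisms concrete, below is a minimal single-head PyTorch sketch of the ideas described in the abstract. It is an illustration under stated assumptions, not the authors' implementation: the gating scheme, the mean-pooled sentence summary standing in for the paper's internal representations, and all layer names (q_proj, center_pred, etc.) are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ContextAwareSelfAttention(nn.Module):
    """Minimal single-head sketch (not the authors' exact formulation) of:
    (1) a learnable Gaussian bias concentrating attention on a predicted
        local window, and
    (2) query/key transformations contextualized with a sentence-level
        summary (mean pooling is an assumed stand-in for the paper's
        internal representations)."""

    def __init__(self, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.g_proj = nn.Linear(d_model, d_model)   # projects the global summary
        self.gate_q = nn.Linear(d_model, d_model)   # mixing gates for Q and K
        self.gate_k = nn.Linear(d_model, d_model)
        self.center_pred = nn.Linear(d_model, 1)    # center of the local window
        self.window_pred = nn.Linear(d_model, 1)    # width of the local window
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        B, L, _ = x.shape

        # Global context: blend a sentence-level summary into the Q/K
        # transformations via learned element-wise gates.
        g = self.g_proj(x.mean(dim=1, keepdim=True))       # (B, 1, D)
        lam_q = torch.sigmoid(self.gate_q(x))              # (B, L, D)
        lam_k = torch.sigmoid(self.gate_k(x))
        q = (1 - lam_q) * self.q_proj(x) + lam_q * g
        k = (1 - lam_k) * self.k_proj(x) + lam_k * g
        v = self.v_proj(x)

        logits = torch.matmul(q, k.transpose(-2, -1)) * self.scale  # (B, L, L)

        # Local context: each query predicts a center and scope, giving a
        # Gaussian penalty that is 0 at the center and grows with distance.
        center = torch.sigmoid(self.center_pred(q)) * L             # (B, L, 1)
        window = torch.sigmoid(self.window_pred(q)) * L + 1e-6      # (B, L, 1)
        pos = torch.arange(L, device=x.device, dtype=x.dtype).view(1, 1, L)
        bias = -((pos - center) ** 2) / (2 * (window / 2) ** 2)

        # Revised attention distribution: the bias is added to the
        # pre-softmax logits, reshaping the original distribution.
        attn = F.softmax(logits + bias, dim=-1)
        return torch.matmul(attn, v)


# Example usage:
layer = ContextAwareSelfAttention(d_model=512)
out = layer(torch.randn(2, 10, 512))   # -> shape (2, 10, 512)
```

Adding the bias before the softmax means the local window reweights rather than replaces the learned attention, consistent with the abstract's "revised version" of the original distribution.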

Keywords: Context Modeling; Inductive Bias; Natural Language Processing; Self-Attention Networks
DOI: 10.1016/j.neucom.2021.06.009
Indexed By: SCIE
Language: English
WOS Research Area: Computer Science
WOS Subject: Computer Science, Artificial Intelligence
WOS ID: WOS:000691559800013
Publisher: Elsevier, Radarweg 29, 1043 NX Amsterdam, Netherlands
Scopus ID: 2-s2.0-85108367543
Document Type: Journal article
Collection: Department of Computer and Information Science
Corresponding Author: Tu, Zhaopeng
Affiliation1.University of Macau, China
2.Tencent AI Lab, China
First Author Affiliation: University of Macau
Recommended Citation
GB/T 7714
Yang, Baosong, Wang, Longyue, Wong, Derek F., et al. Context-aware Self-Attention Networks for Natural Language Processing[J]. Neurocomputing, 2021, 458: 157-169.
APA Yang, Baosong, Wang, Longyue, Wong, Derek F., Shi, Shuming, & Tu, Zhaopeng (2021). Context-aware Self-Attention Networks for Natural Language Processing. Neurocomputing, 458, 157-169.
MLA Yang, Baosong, et al. "Context-aware Self-Attention Networks for Natural Language Processing." Neurocomputing 458 (2021): 157-169.
Files in This Item:
There are no files associated with this item.
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.