Status: Published
Rejuvenating low-frequency words: Making the most of parallel data in non-autoregressive translation
Liang Ding [1]; Longyue Wang [2]; Xuebo Liu [3]; Derek F. Wong [3]; Dacheng Tao [4]; Zhaopeng Tu [2]
2021
Conference Name: The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021)
Source Publication: ACL-IJCNLP 2021 - 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Proceedings of the Conference
Pages: 3431-3441
Conference Date: August 2021
Conference Place: Virtual
Publication Place: USA
Publisher: Association for Computational Linguistics (ACL)
Abstract

Knowledge distillation (KD) is commonly used to construct synthetic data for training non-autoregressive translation (NAT) models. However, there is a discrepancy in low-frequency words between the distilled and the original data, leading to more errors in predicting low-frequency words. To alleviate this problem, we directly expose the raw data to NAT models through pretraining. By analyzing directed alignments, we find that KD makes low-frequency source words align to targets more deterministically, but fails to align sufficient low-frequency words from target to source. Accordingly, we propose reverse KD to rejuvenate more alignments for low-frequency target words. To make the most of authentic and synthetic data, we combine these two complementary approaches into a new training strategy that further boosts NAT performance. We conduct experiments on five translation benchmarks over two advanced architectures. Results demonstrate that the proposed approach significantly and universally improves translation quality by reducing translation errors on low-frequency words. Encouragingly, our approach achieves 28.2 and 33.9 BLEU points on the WMT14 English-German and WMT16 Romanian-English datasets, respectively. Our code, data, and trained models are available at https://github.com/longyuewangdcu/RLFW-NAT.
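
To make the data construction concrete, below is a minimal Python sketch of sequence-level KD and the reverse KD described in the abstract. The function names, the placeholder "teacher" lambdas, and the training-strategy comments are illustrative assumptions, not the authors' API; the real implementation is at https://github.com/longyuewangdcu/RLFW-NAT.

    # Minimal sketch of the data pipeline described in the abstract, using
    # hypothetical stand-ins for trained autoregressive (AT) teacher models.

    from typing import Callable, List, Tuple

    ParallelData = List[Tuple[str, str]]  # (source, target) sentence pairs

    def distill(raw: ParallelData,
                teacher: Callable[[str], str],
                reverse: bool = False) -> ParallelData:
        """Sequence-level KD: replace one side of the corpus with teacher output.

        Forward KD (reverse=False): keep sources, re-generate targets with a
        src->tgt AT teacher. Reverse KD (reverse=True): keep targets,
        re-generate sources with a tgt->src AT teacher, which the paper uses
        to rejuvenate alignments for low-frequency target words.
        """
        if reverse:
            return [(teacher(tgt), tgt) for _, tgt in raw]
        return [(src, teacher(src)) for src, _ in raw]

    # Toy usage: string transforms stand in for trained AT teachers.
    raw_data = [("ein Haus", "a house"), ("ein Auto", "a car")]
    fwd_teacher = lambda s: s.upper()   # placeholder for a src->tgt AT model
    rev_teacher = lambda t: t.lower()   # placeholder for a tgt->src AT model

    kd_data = distill(raw_data, fwd_teacher)                 # forward-distilled corpus
    rkd_data = distill(raw_data, rev_teacher, reverse=True)  # reverse-distilled corpus

    # Training strategy per the abstract (step names are mine, not the paper's):
    # 1) pretrain the NAT model on the raw (and reverse-distilled) data to
    #    expose it to authentic, low-frequency-rich text;
    # 2) finetune on the forward-distilled data for the usual KD benefit.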

DOI: 10.48550/arXiv.2106.00903
Language: English
WOS Research Area: Computer Science; Linguistics
WOS Subject: Computer Science, Artificial Intelligence; Computer Science, Interdisciplinary Applications; Linguistics
WOS ID: WOS:000698679200066
Scopus ID: 2-s2.0-85117843085
Document Type: Conference paper
Collection: Department of Computer and Information Science
Corresponding Author: Liang Ding
Affiliation1.The University of Sydney, Australia
2.Tencent AI Lab,
3.University of Macau, Macao
4.JD Explore Academy, JD.com,
Recommended Citation
GB/T 7714: Liang Ding, Longyue Wang, Xuebo Liu, et al. Rejuvenating low-frequency words: Making the most of parallel data in non-autoregressive translation[C]. USA: Association for Computational Linguistics, 2021: 3431-3441.
APA: Ding, L., Wang, L., Liu, X., Wong, D. F., Tao, D., & Tu, Z. (2021). Rejuvenating low-frequency words: Making the most of parallel data in non-autoregressive translation. ACL-IJCNLP 2021 - 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Proceedings of the Conference, 3431-3441.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.