
Browse/Search Results: 1–10 of 14

A Paradigm Shift: The Future of Machine Translation Lies with Large Language Models Conference paper
Lyu, Chenyang; Du, Zefeng; Xu, Jitao; Duan, Yitao; Wu, Minghao; Lynn, Teresa; Aji, Alham Fikri; Wong, Derek F.; Liu, Siyou; Wang, Longyue. A Paradigm Shift: The Future of Machine Translation Lies with Large Language Models[C]. European Language Resources Association (ELRA), 2024, 1339-1352.
TC[Scopus]: 1 | Submit date: 2024/07/04
Keywords: Large Language Models; Machine Translation; New Trends
Findings of the WMT 2023 Shared Task on Discourse-Level Literary Translation: A Fresh Orb in the Cosmos of LLMs Conference paper
Wang, Longyue; Tu, Zhaopeng; Gu, Yan; Liu, Siyou; Yu, Dian; Ma, Qingsong; Lyu, Chenyang; Zhou, Liting; Liu, Chao Hong; Ma, Yufeng; Chen, Weiyu; Graham, Yvette; Webber, Bonnie; Koehn, Philipp; Way, Andy; Yuan, Yulin; Shi, Shuming. Findings of the WMT 2023 Shared Task on Discourse-Level Literary Translation: A Fresh Orb in the Cosmos of LLMs[C]. Philipp Koehn, Barry Haddow, Tom Kocmi, Christof Monz (Eds.). Association for Computational Linguistics, 2023, 55-67.
TC[Scopus]: 5 | Submit date: 2024/02/22
A Benchmark Dataset and Evaluation Methodology for Chinese Zero Pronoun Translation Journal article
Xu, Mingzhou; Wang, Longyue; Liu, Siyou; Wong, Derek F.; Shi, Shuming; Tu, Zhaopeng. A Benchmark Dataset and Evaluation Methodology for Chinese Zero Pronoun Translation[J]. Language Resources and Evaluation, 2023, 57, 1263–1293.
TC[WOS]: 1 | TC[Scopus]: 2 | IF: 1.7/2.0 | Submit date: 2023/08/03
Keywords: Benchmark Dataset; Discourse; Evaluation Metric; Machine Translation; Zero Pronoun
Recent Advances in Dialogue Machine Translation Journal article
Liu, Siyou; Sun, Yuqi; Wang, Longyue. Recent Advances in Dialogue Machine Translation[J]. Information (Switzerland), 2021, 12(11), 484.
Adobe PDF | TC[WOS]: 2 | TC[Scopus]: 6 | IF: 2.4/2.6 | Submit date: 2022/08/28
Keywords: Dialogue; Neural Machine Translation; Discourse Issue; Benchmark Data; Existing Approaches; Real-life Applications; Building Advanced System
Context-aware Self-Attention Networks for Natural Language Processing Journal article
Yang, Baosong; Wang, Longyue; Wong, Derek F.; Shi, Shuming; Tu, Zhaopeng. Context-aware Self-Attention Networks for Natural Language Processing[J]. Neurocomputing, 2021, 458, 157-169.
TC[WOS]: 32 | TC[Scopus]: 39 | IF: 5.5/5.5 | Submit date: 2021/12/08
Keywords: Context Modeling; Inductive Bias; Natural Language Processing; Self-attention Networks
On the Copying Behaviors of Pre-Training for Neural Machine Translation Conference paper
Liu, Xuebo; Wang, Longyue; Wong, Derek F.; Ding, Liang; Chao, Lidia S.; Shi, Shuming; Tu, Zhaopeng. On the Copying Behaviors of Pre-Training for Neural Machine Translation[C]. Zong C., Xia F., Li W., Navigli R. (Eds.). Association for Computational Linguistics (ACL), 2021, 4265-4275.
TC[Scopus]: 23 | Submit date: 2022/05/13
On the Complementarity between Pre-Training and Back-Translation for Neural Machine Translation Conference paper
Liu, Xuebo; Wang, Longyue; Wong, Derek F.; Ding, Liang; Chao, Lidia S.; Shi, Shuming; Tu, Zhaopeng. On the Complementarity between Pre-Training and Back-Translation for Neural Machine Translation[C]. Moens M.-F., Huang X., Specia L., Yih S.W.-T. (Eds.). Association for Computational Linguistics (ACL), 2021, 2900-2907.
TC[Scopus]: 22 | Submit date: 2022/05/13
Progressive Multi-Granularity Training for Non-Autoregressive Translation Conference paper
Ding, Liang; Wang, Longyue; Liu, Xuebo; Wong, Derek F.; Tao, Dacheng; Tu, Zhaopeng. Progressive Multi-Granularity Training for Non-Autoregressive Translation[C], 2021, 2797-2803.
TC[Scopus]: 29 | Submit date: 2022/05/13
Convolutional self-attention networks Conference paper
Yang, Baosong; Wang, Longyue; Wong, Derek F.; Chao, Lidia S.; Tu, Zhaopeng. Convolutional Self-Attention Networks[C], 2019, 4040-4045.
TC[WOS]: 51 | TC[Scopus]: 88 | Submit date: 2021/03/11
Modeling recurrence for transformer Conference paper
Hao, Jie; Wang, Xing; Yang, Baosong; Wang, Longyue; Zhang, Jinfeng; Tu, Zhaopeng. Modeling Recurrence for Transformer[C], 2019, 1198-1207.
TC[WOS]: 29 | TC[Scopus]: 48 | Submit date: 2022/05/23