Semantic Perception-Oriented Low-Resource Neural Machine Translation
…training methods (BERT) use an attention mechanism based on Levenshtein distance (LD) to extract language features, which ignores syntax-related information. In this paper, we propose a machine translation pre-training method with semantic perception which depends on the traditional position-based model…
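The excerpt above refers to Levenshtein distance (LD). As a point of reference only, here is a minimal sketch of the standard dynamic-programming edit distance between two sequences; it is not the paper's attention variant, just the quantity the excerpt names.

```python
# Textbook Levenshtein (edit) distance between two sequences.
def levenshtein(a, b):
    # prev[j] holds the distance between a[:i-1] and b[:j]; curr builds row i.
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, start=1):
        curr = [i]
        for j, y in enumerate(b, start=1):
            cost = 0 if x == y else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

print(levenshtein("kitten", "sitting"))  # -> 3
```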
Semantic-Aware Deep Neural Attention Network for Machine Translation Detection
…of the data collected comes from machine-translated texts rather than native speakers or professional translators, severely reducing the benefit of data scale. Traditional machine translation detection methods generally require hand-crafted feature engineering and have difficulty distinguishing the fine-…
Routing Based Context Selection for Document-Level Neural Machine Translation
…encoding. Usually, the sentence-level representation is incorporated (via an attention or gate mechanism) in these methods, which makes them straightforward but coarse, and it is difficult to distinguish useful contextual information from noise. Furthermore, the longer the encoding length is, the more…
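For context, the gate mechanism mentioned in the excerpt is typically a learned element-wise mixture between the current sentence's encoder states and a pooled context vector. The sketch below shows that generic pattern under assumed tensor shapes; the module name `GatedContext` is illustrative and this is not the paper's routing-based selection method.

```python
# Hedged sketch: gating a pooled context representation into encoder states.
import torch
import torch.nn as nn

class GatedContext(nn.Module):
    def __init__(self, d_model):
        super().__init__()
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, hidden, context):
        # hidden:  (batch, seq_len, d_model) current-sentence encoder states
        # context: (batch, d_model) pooled representation of context sentences
        ctx = context.unsqueeze(1).expand_as(hidden)
        g = torch.sigmoid(self.gate(torch.cat([hidden, ctx], dim=-1)))
        return g * hidden + (1 - g) * ctx  # element-wise gated mixture

h = torch.randn(2, 10, 512)
c = torch.randn(2, 512)
print(GatedContext(512)(h, c).shape)  # torch.Size([2, 10, 512])
```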
Generating Diverse Back-Translations via Constraint Random Decoding
…performance of Neural Machine Translation (NMT), especially in low-resource scenarios. Previous research shows that the diversity of the synthetic source sentences is essential for back-translation. However, the frequently used random methods such as sampling or noised beam search, although they can output d…
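The "sampling" decoding the excerpt contrasts with beam search draws each token from the model's softmax distribution instead of taking the most probable one, which is what produces diverse synthetic source sentences for back-translation. A minimal sketch follows; `step_logits`, the special token ids, and the toy vocabulary are assumptions standing in for any seq2seq decoder, not part of the paper's constraint random decoding.

```python
# Ancestral sampling decode loop: draw the next token instead of argmax.
import torch

def sample_translation(step_logits, bos_id=1, eos_id=2, max_len=50, temperature=1.0):
    tokens = [bos_id]
    for _ in range(max_len):
        logits = step_logits(tokens)              # (vocab,) logits for the next token
        probs = torch.softmax(logits / temperature, dim=-1)
        nxt = torch.multinomial(probs, 1).item()  # sample rather than argmax
        tokens.append(nxt)
        if nxt == eos_id:
            break
    return tokens

# Toy stand-in decoder: random logits over a 100-token vocabulary.
print(sample_translation(lambda toks: torch.randn(100)))
```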
ISTIC's Neural Machine Translation System for CCMT'2021
…the Institute of Scientific and Technical Information of China (ISTIC) for the 17th China Conference on Machine Translation (CCMT'2021). ISTIC participated in the following four machine translation (MT) evaluation tasks: the MT task of Mongolian-to-Chinese daily expressions, the MT task of Tibetan-to-Chinese government documents, the MT task o…
…October 2021. The 10 papers presented in this volume were carefully reviewed and selected from 25 submissions and focus on all aspects of machine translation, including preprocessing, neural machine translation models, hybrid models, evaluation methods, and post-editing. ISBN 978-981-16-7511-9, 978-981-16-7512-6. Series ISSN 1865-0929, Series E-ISSN 1865-0937.
978-981-16-7511-9, Springer Nature Singapore Pte Ltd. 2021