Title: Machine Translation; 18th China Conference. Tong Xiao, Juan Pino (eds.). Conference proceedings, 2022. © The Editor(s) (if applicable) and The Author(s).

Thread starter: invigorating
Posted on 2025-3-23 12:07:39
Dynamic Mask Curriculum Learning for Non-Autoregressive Neural Machine Translation

…just the amount of information input at any time by way of curriculum learning. The fine-tuning and inference phases disable the module in the same way as a normal NAT model. In this paper, we experiment on two WMT16 translation datasets, and the BLEU improvement reaches 4.4 without speed reduction.
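The abstract above describes controlling, via curriculum learning, how much target-side information the NAT model sees during training. A minimal sketch of one such idea, using hypothetical names and a simple linear mask-ratio schedule (not the paper's actual module):

```python
import random

def mask_schedule(step, total_steps, start_ratio=0.2, end_ratio=1.0):
    """Linearly increase the fraction of masked target tokens as training
    progresses (easy -> hard curriculum). Hypothetical schedule."""
    t = min(step / total_steps, 1.0)
    return start_ratio + t * (end_ratio - start_ratio)

def apply_mask(tokens, ratio, mask_token="<mask>", rng=random):
    """Replace a `ratio` fraction of target positions with the mask token."""
    n_mask = max(1, int(round(len(tokens) * ratio)))
    idx = set(rng.sample(range(len(tokens)), n_mask))
    return [mask_token if i in idx else tok for i, tok in enumerate(tokens)]
```

With this schedule, early steps leave most target tokens visible (easy), while later steps approach fully masked inputs as in standard NAT training.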
…Post-editing Advancement Cookbook

…t progress since 2015; however, whether APE models really perform well on domain samples remains an open question, and achieving this is still a hard task. This paper provides a mobile-domain APE corpus with 50.1 TER / 37.4 BLEU for the En-Zh language pair. This corpus is much more practical…
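The corpus statistics above are reported in TER and BLEU. As a reminder of what TER measures, here is a toy approximation (word-level edit distance over reference length; real TER, as computed by the tercom tool, also counts phrase shifts):

```python
def ter_approx(hyp, ref):
    """Approximate TER: word-level Levenshtein distance divided by the
    reference length. Phrase shifts are omitted for brevity."""
    h, r = hyp.split(), ref.split()
    # single-row Levenshtein dynamic program
    dp = list(range(len(r) + 1))
    for i, hw in enumerate(h, 1):
        prev, dp[0] = dp[0], i
        for j, rw in enumerate(r, 1):
            cur = min(dp[j] + 1,            # deletion
                      dp[j - 1] + 1,        # insertion
                      prev + (hw != rw))    # substitution / match
            prev, dp[j] = dp[j], cur
    return dp[len(r)] / max(len(r), 1)
```

Lower is better: 0.0 means the hypothesis matches the reference exactly.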
Posted on 2025-3-23 23:45:23
Hot-Start Transfer Learning Combined with Approximate Distillation for Mongolian-Chinese Neural Machine Translation

…is very important, and the use of a pre-trained model can also alleviate the shortage of data. However, the good performance of common cold-start transfer learning methods is limited to cognate languages realized by sharing a vocabulary. Moreover, when using the pre-trained model, the combinat…
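The abstract contrasts cold-start transfer (which relies on a shared vocabulary) with a hot-start approach. One common ingredient of vocabulary transfer is copying parent-model embeddings for overlapping entries; a toy sketch with hypothetical names, not the paper's actual method:

```python
def transfer_embeddings(parent_emb, child_vocab, dim, init=0.0):
    """Hot-start sketch: build the child model's embedding table, copying
    rows for words that also appear in the parent's vocabulary and
    constant-initialising the rest. (Illustrative only.)"""
    return {w: list(parent_emb[w]) if w in parent_emb else [init] * dim
            for w in child_vocab}
```

In practice the child model would then continue training, so only the initialisation (not the final parameters) comes from the parent.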
Posted on 2025-3-24 03:18:30
Review-Based Curriculum Learning for Neural Machine Translation

…from simple to difficult to adapt the general NMT model to a specific domain. However, previous curriculum learning methods suffer from catastrophic forgetting and learning inefficiency. In this paper, we introduce a review-based curriculum learning method that targetedly selects the curriculum according to…
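A review-based curriculum orders samples from easy to hard but replays earlier material to counter catastrophic forgetting. A toy sketch of one possible scheme (the paper's actual selection criterion is truncated above, so the replay rule here is purely illustrative):

```python
import random

def review_curriculum(samples, difficulty, n_stages=3, review_frac=0.3, seed=0):
    """Split samples into easy-to-hard stages by a difficulty score; each
    later stage replays a random fraction of earlier samples as review."""
    rng = random.Random(seed)
    ranked = sorted(samples, key=difficulty)
    stage_size = (len(ranked) + n_stages - 1) // n_stages
    stages, seen = [], []
    for s in range(n_stages):
        cur = ranked[s * stage_size:(s + 1) * stage_size]
        review = rng.sample(seen, int(len(seen) * review_frac)) if seen else []
        stages.append(cur + review)
        seen.extend(cur)
    return stages
```

Each stage's batch thus mixes new, harder samples with a refresher drawn from what the model has already seen.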
Posted on 2025-3-24 09:15:29
Multi-strategy Enhanced Neural Machine Translation for Chinese Minority Languages

…-Chinese Daily Conversation Translation, Tibetan-Chinese Government Document Translation, and Uighur-Chinese News Translation. We train our models using the Deep Transformer architecture and adopt enhancement strategies such as Regularized Dropout, Tagged Back-Translation, Alternated Training, an…
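Among the listed strategies, Tagged Back-Translation is straightforward to illustrate: synthetic source sentences produced by back-translation get a special tag so the model can distinguish them from genuine parallel data. A minimal sketch (tag name is an arbitrary choice):

```python
def tag_back_translation(synthetic_pairs, tag="<BT>"):
    """Prepend a tag to each synthetic (back-translated) source sentence,
    leaving the target side untouched."""
    return [(f"{tag} {src}", tgt) for src, tgt in synthetic_pairs]
```

The tagged synthetic pairs are then simply concatenated with the genuine parallel data for training.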
Posted on 2025-3-24 12:10:41
Target-Side Language Model for Reference-Free Machine Translation Evaluation

…evaluation, where source texts are directly compared with system translations. In this paper, we design a reference-free metric based only on a target-side language model for segment-level and system-level machine translation evaluation, and it is found that promising resul…
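A reference-free, target-side-LM metric scores a system translation purely by how probable it is under a language model of the target language. A toy unigram illustration (the paper presumably uses a far stronger LM; this only shows the scoring idea):

```python
import math
from collections import Counter

class UnigramLM:
    """Toy target-side language model: unigram counts with add-one
    smoothing. Illustrative only."""
    def __init__(self, corpus):
        tokens = [w for sent in corpus for w in sent.split()]
        self.counts = Counter(tokens)
        self.total = len(tokens)
        self.vocab = len(self.counts) + 1  # +1 slot for unseen words

    def log_prob(self, word):
        return math.log((self.counts[word] + 1) / (self.total + self.vocab))

    def score(self, sentence):
        """Mean per-word log-probability; higher suggests more fluent output."""
        words = sentence.split()
        return sum(self.log_prob(w) for w in words) / max(len(words), 1)
```

Note that fluency under a target-side LM says nothing about adequacy with respect to the source, which is the metric's main limitation.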
Posted on 2025-3-24 16:31:03
Life Is Short, Train It Less: Neural Machine Tibetan-Chinese Translation Based on mRASP and Dataset…

…selection. The multilingual pre-trained model is designed to increase the performance of low-resource machine translation by bringing in more common information. Instead of repeatedly training several checkpoints from scratch, this study proposes a checkpoint selection strategy that uses a cl…
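The paper's checkpoint selection criterion is truncated above, so no attempt is made to reproduce it. A related, well-known way to reuse several saved checkpoints without retraining is parameter averaging; a minimal sketch on plain Python dicts:

```python
def average_checkpoints(checkpoints):
    """Average the parameters of several checkpoints elementwise. Each
    checkpoint is a dict mapping parameter name -> list of floats."""
    n = len(checkpoints)
    return {k: [sum(vals) / n
                for vals in zip(*(c[k] for c in checkpoints))]
            for k in checkpoints[0]}
```

In real toolkits the same operation runs over model state dicts of tensors rather than lists.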
Posted on 2025-3-24 21:48:57
Improving the Robustness of Low-Resource Neural Machine Translation with Adversarial Examples

…added to the input sentence, the model will produce a completely different translation with high confidence. Adversarial examples are currently a major tool for improving model robustness, and how to generate adversarial examples that degrade the model's performance while ensuring semantic consiste…
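Generating a perturbation that preserves semantics while changing the model's output is the core difficulty the abstract names. A toy word-substitution perturbation (real attacks search for the substitution that most degrades the model; that search is omitted here):

```python
import random

def perturb(sentence, synonyms, rng=None):
    """Produce a simple adversarial candidate by swapping one word for a
    listed synonym, roughly preserving meaning. Illustrative only."""
    rng = rng or random.Random(0)
    words = sentence.split()
    swappable = [i for i, w in enumerate(words) if w in synonyms]
    if not swappable:
        return sentence  # nothing we know how to substitute
    i = rng.choice(swappable)
    words[i] = rng.choice(synonyms[words[i]])
    return " ".join(words)
```

Candidates like these can then be fed back into training (adversarial training) to make the model less sensitive to such input noise.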