Views: 22454 | Replies: 42
Posted on 2025-3-21 16:12:43 | Show all posts
Title: Joint Training for Neural Machine Translation
Author: Yong Cheng
Overview: Nominated by Tsinghua University as an outstanding Ph.D. thesis. Reports on current challenges and important advances in neural machine translation. Addresses the joint training of bidirectional neural machine translation models.
Series: Springer Theses
Cover: Joint Training for Neural Machine Translation; Yong Cheng; Book 2019; Springer Nature Singapore Pte Ltd. 2019
Description: This book presents four approaches to jointly training bidirectional neural machine translation (NMT) models. First, in order to improve the accuracy of the attention mechanism, it proposes an agreement-based joint training approach that helps the two complementary models agree on the word alignment matrices for the same training data. Second, it presents a semi-supervised approach that uses an autoencoder to reconstruct monolingual corpora, so as to incorporate these corpora into neural machine translation. It then introduces a joint training algorithm for pivot-based neural machine translation, which can be used to mitigate the data-scarcity problem. Lastly, it describes an end-to-end bidirectional NMT model that connects the source-to-target and target-to-source translation models, allowing the interaction of parameters between these two directional models.
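The common thread in these four approaches is that the source-to-target and target-to-source models are no longer trained independently but are tied together in a single objective. As an illustrative sketch only (not the book's exact formulation; the coupling term Δ and weight λ are placeholders), the joint objective over a parallel corpus D can be written as:

```latex
\mathcal{J}(\overrightarrow{\theta}, \overleftarrow{\theta}) =
    \sum_{(\mathbf{x},\mathbf{y}) \in D} \log P(\mathbf{y} \mid \mathbf{x}; \overrightarrow{\theta})
  + \sum_{(\mathbf{x},\mathbf{y}) \in D} \log P(\mathbf{x} \mid \mathbf{y}; \overleftarrow{\theta})
  - \lambda \, \Delta(\overrightarrow{\theta}, \overleftarrow{\theta})
```

Here Δ stands for whatever ties the two directions together: disagreement between their word alignment matrices in the agreement-based approach, a reconstruction error over monolingual corpora in the semi-supervised approach, or direct parameter interaction in the end-to-end bidirectional model.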
Publication date: Book 2019
Keywords: Machine Translation; Neural Machine Translation; Joint Training; Joint Modeling; Bidirectional Model
Edition: 1
DOI: https://doi.org/10.1007/978-981-32-9748-7
ISBN (eBook): 978-981-32-9748-7
Series ISSN: 2190-5053 | Series E-ISSN: 2190-5061
Copyright: Springer Nature Singapore Pte Ltd. 2019
Publication information is being updated.

[Bibliometric panels for "Joint Training for Neural Machine Translation": impact factor, web visibility, citation count, annual citations, and reader feedback, each with a subject ranking; no data displayed.]
Poll (single choice, 1 participant):
Perfect with Aesthetics — 1 vote (100.00%)
Better Implies Difficulty — 0 votes (0.00%)
Good and Satisfactory — 0 votes (0.00%)
Adverse Performance — 0 votes (0.00%)
Disdainful Garbage — 0 votes (0.00%)
Posted on 2025-3-21 23:27:18 | Show all posts
Agreement-Based Joint Training for Bidirectional Attention-Based Neural Machine Translation: agreement-based joint training encourages the source-to-target and target-to-source attention models to agree on the word alignment matrices for the same training data. Experiments on Chinese-English and English-French translation tasks show that agreement-based joint training significantly improves both alignment and translation quality over independent training.
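A minimal sketch of the agreement idea, assuming the two directional models expose their attention (soft alignment) matrices; the function names, the squared-error penalty, and the weight below are illustrative placeholders, not the chapter's exact formulation:

```python
# Illustrative sketch: penalize disagreement between the two directions'
# attention matrices for the same sentence pair (PyTorch assumed available).
import torch

def agreement_loss(attn_s2t: torch.Tensor, attn_t2s: torch.Tensor) -> torch.Tensor:
    """attn_s2t: (tgt_len, src_len) attention from the source-to-target model.
    attn_t2s: (src_len, tgt_len) attention from the target-to-source model.
    Transposing one matrix makes the two soft alignments directly comparable."""
    return ((attn_s2t - attn_t2s.transpose(0, 1)) ** 2).sum()

# Toy example: random soft alignments for a 5-word source / 4-word target pair.
src_len, tgt_len = 5, 4
a_s2t = torch.softmax(torch.randn(tgt_len, src_len), dim=-1)
a_t2s = torch.softmax(torch.randn(src_len, tgt_len), dim=-1)

# Joint objective = both directions' translation losses plus the agreement penalty.
lambda_agree = 1.0                                        # illustrative weight
nll_s2t, nll_t2s = torch.tensor(2.3), torch.tensor(2.1)  # stand-in per-sentence losses
joint = nll_s2t + nll_t2s + lambda_agree * agreement_loss(a_s2t, a_t2s)
print(joint.item())
```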
Posted on 2025-3-22 00:39:12 | Show all posts
Semi-supervised Learning for Neural Machine Translation: the source-to-target and target-to-source translation models serve as the encoder and decoder, respectively, of an autoencoder that reconstructs monolingual corpora. Our approach can exploit monolingual corpora not only of the target language but also of the source language. Experiments on the Chinese-English dataset show that our approach achieves significant improvements over state-of-the-art SMT and NMT systems.
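A rough sketch of the autoencoder view described above, assuming the target-to-source model acts as the encoder and the source-to-target model as the decoder; the helper names and toy stand-ins below are hypothetical, and only the loss composition is the point:

```python
# Illustrative sketch: reconstruct a monolingual target sentence through a
# round trip (target -> latent source -> target) and add that loss to the
# usual bilingual training loss. Model internals are stubbed out.
from typing import Callable, List

Sentence = List[str]

def reconstruction_loss(y: Sentence,
                        translate_t2s: Callable[[Sentence], Sentence],
                        nll_s2t: Callable[[Sentence, Sentence], float]) -> float:
    """-log P(y | x_hat), where x_hat is the t->s model's translation of y."""
    x_hat = translate_t2s(y)   # latent source-language "code" produced by the encoder
    return nll_s2t(x_hat, y)   # how well the s->t decoder rebuilds the original y

# Toy stand-ins so the sketch runs end to end (hypothetical, not real models).
toy_t2s = lambda y: ["src_" + w for w in y]
toy_nll = lambda x, y: 0.5 * len(y)

mono_y = ["the", "cat", "sat"]
bilingual_loss = 3.2           # stand-in for the loss on parallel data
semi_supervised_loss = bilingual_loss + reconstruction_loss(mono_y, toy_t2s, toy_nll)
print(semi_supervised_loss)
```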
Posted on 2025-3-23 08:01:13 | Show all posts
Related Work: this chapter first reviews the background of NMT. Next, we summarize a number of works that incorporate additional data resources, such as monolingual corpora and pivot-language corpora, into machine translation systems. Finally, we give a brief review of studies on contrastive learning, which is a key technique in our fourth work.