Book: Machine Translation: 18th China Conference on Machine Translation (CCMT 2022). Edited by Tong Xiao and Juan Pino. Conference proceedings, 2022. The Editor(s) (if applicable) and The Author(s).

Optimizing Deep Transformers for Chinese-Thai Low-Resource Translation: "…explore the experiment settings (including the number of BPE merge operations, dropout probability, embedding size, etc.) for the low-resource scenario with the 6-layer Transformer. Considering that increasing the number of layers also increases the regularization on new model parameters (dropout mod…"
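The excerpt above treats the number of BPE merge operations as a key hyperparameter for low-resource translation. As a minimal illustration of what that knob controls, here is a toy BPE learner; the papers themselves would use a tool such as subword-nmt or SentencePiece, and the corpus and merge counts below are invented:

```python
from collections import Counter

def learn_bpe(words, num_merges):
    """Learn up to `num_merges` BPE merge operations from a word-frequency
    dict. Each word is a tuple of symbols ending with the '</w>' marker."""
    vocab = {tuple(w) + ("</w>",): f for w, f in words.items()}
    merges = []
    for _ in range(num_merges):
        # Count all adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break  # every word is a single symbol; nothing left to merge
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Apply the merge everywhere it occurs.
        merged = {}
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            merged[tuple(out)] = freq
        vocab = merged
    return merges

corpus = {"low": 5, "lower": 2, "newest": 6, "widest": 3}
print(learn_bpe(corpus, 3))
```

Fewer merge operations leave words split into smaller, more frequent subwords, which is why the merge count matters so much when training data is scarce.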
CCMT 2022 Translation Quality Estimation Task: "…effort estimation in the 18th China Conference on Machine Translation (CCMT) 2022. This method is based on a predictor-estimator model. The predictor is an XLM-RoBERTa model pre-trained on a large-scale parallel corpus and extracts features from the source language text and machine-translated text. Th…"
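The predictor-estimator split described above can be caricatured in a few lines: a feature extractor over (source, MT) pairs feeding a small regression head. In this sketch a fixed random projection stands in for the XLM-RoBERTa predictor and ridge regression for the estimator; all data, dimensions, and labels are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "predictor": in the paper this is XLM-RoBERTa; here a fixed
# random embedding table maps token ids to 32-dimensional vectors.
EMB = rng.normal(size=(1000, 32))

def extract_features(src_ids, mt_ids):
    # Mean-pool token embeddings of each segment and concatenate: 64 dims.
    return np.concatenate([EMB[src_ids].mean(0), EMB[mt_ids].mean(0)])

def fit_estimator(X, y, lam=1.0):
    # "Estimator": closed-form ridge regression from features to a score.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Toy training set: random (source, MT) id sequences with synthetic labels.
pairs = [(rng.integers(0, 1000, 8), rng.integers(0, 1000, 8)) for _ in range(50)]
X = np.stack([extract_features(s, t) for s, t in pairs])
y = rng.uniform(0, 1, size=50)
w = fit_estimator(X, y)
scores = X @ w  # predicted quality scores for the training pairs
```

The real systems fine-tune the predictor end to end rather than freezing it; the point here is only the two-stage shape of the model.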
Effective Data Augmentation Methods for CCMT 2022: "…Translation (CCMT 2022) evaluation tasks. We submitted the results of two bilingual machine translation (MT) evaluation tasks in CCMT 2022. One is the Chinese-English bilingual MT task from the news field; the other is the Chinese-Thai bilingual MT task in low-resource languages. Our system is based on Transf…"
NJUNLP's Submission for CCMT 2022 Quality Estimation Task: "…CCMT 2022 quality estimation sentence-level task for English-to-Chinese (EN-ZH). We follow the DirectQE framework, whose target is bridging the gap between pre-training on parallel data and fine-tuning on QE data. We further combine DirectQE with the pre-trained language model XLM-RoBERTa (XLM-R) w…"
ISTIC's Thai-to-Chinese Neural Machine Translation System for CCMT' 2022: "…China (ISTIC) for the 18th China Conference on Machine Translation (CCMT' 2022). ISTIC participated in a low-resource evaluation task: the Thai-to-Chinese MT task. The paper mainly illuminates its system framework based on Transformer, data preprocessing methods, and some strategies adopted in this system…"
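The excerpt mentions data preprocessing methods but cuts off before detailing them. As a generic sketch of two cleaning steps that are common for parallel MT corpora (Unicode normalization and length-ratio filtering), with an invented threshold and example pairs, and assuming nothing about ISTIC's actual pipeline:

```python
import unicodedata

def normalize(text):
    # NFKC-normalize and collapse runs of whitespace.
    text = unicodedata.normalize("NFKC", text)
    return " ".join(text.split())

def length_ratio_ok(src, tgt, max_ratio=8.0):
    # Drop pairs whose character-length ratio suggests misalignment.
    # Thai script is much longer per word than Chinese characters, so the
    # threshold is deliberately loose; 8.0 is an invented value.
    a, b = max(len(src), 1), max(len(tgt), 1)
    return max(a, b) / min(a, b) <= max_ratio

corpus = [("สวัสดี  ครับ", "你好"), ("ก" * 50, "好")]
clean = [(normalize(s), normalize(t)) for s, t in corpus if length_ratio_ok(s, t)]
print(clean)  # the 50-to-1 pair is filtered out
```

Real pipelines would add deduplication, language identification, and tokenization (Thai in particular needs word segmentation, since it is written without spaces between words).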
Pengcong Wang, Hongxu Hou, Shuo Sun, Nier Wu, Weichen Jian, Zongheng Yang, Yisong Wang