Title: Natural Language Processing and Chinese Computing; 13th National CCF Conference. Editors: Derek F. Wong, Zhongyu Wei, Muyun Yang. Conference proceedings, 2025.

Posted on 2025-3-27 02:07:23
What is the Best Model? Application-Driven Evaluation for Large Language Models
… and industry as they generalize foundation models to various practical tasks in a prompt manner. To assist users in selecting the best model in practical application scenarios, i.e., choosing the model that meets the application requirements while minimizing cost, we introduce A-Eval, an applicatio…
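A minimal sketch of the kind of application-driven selection the abstract describes: pick the cheapest model whose measured score still meets the application's requirement. The model names, scores, and costs below are purely illustrative and are not taken from the A-Eval paper.

# Hypothetical candidates: (name, task score on the application's benchmark, serving cost)
candidates = [
    {"name": "model-7B",  "score": 0.71, "cost_per_1k_tokens": 0.2},
    {"name": "model-13B", "score": 0.78, "cost_per_1k_tokens": 0.5},
    {"name": "model-70B", "score": 0.86, "cost_per_1k_tokens": 2.0},
]

def select_best_model(candidates, required_score):
    """Return the cheapest candidate whose score meets the requirement, or None."""
    feasible = [m for m in candidates if m["score"] >= required_score]
    return min(feasible, key=lambda m: m["cost_per_1k_tokens"]) if feasible else None

print(select_best_model(candidates, required_score=0.75))  # -> the 13B entry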
Posted on 2025-3-27 09:15:38
Sparse Mixture of Experts Language Models Excel in Knowledge Distillation
…n distilling large language models have primarily focused on loss functions and training methodologies, with limited attention given to structural improvements of student models. This is largely due to the challenges posed by cross-architecture distillation and the substantial computational resource…
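For context on the loss functions the abstract mentions, here is a standard knowledge-distillation objective, not the paper's specific method: the student matches the teacher's softened output distribution (KL term) while also fitting the ground-truth labels (CE term). Temperature T and weight alpha are the usual KD hyperparameters; the sparse-MoE student architecture itself is out of scope here.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Softened distributions at temperature T
    soft_student = F.log_softmax(student_logits / T, dim=-1)
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    # KL term, rescaled by T^2 as is conventional in distillation
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)
    # Hard-label cross-entropy on the ground truth
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce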
Posted on 2025-3-27 14:47:48
Reparameterization-Based Parameter-Efficient Fine-Tuning Methods for Large Language Models: A System…
…ning objectives to achieve unprecedented performance. To fully exploit the potential of LLMs, fine-tuning LLMs on specific downstream tasks is essential. However, traditional full fine-tuning methods pose significant computational challenges, prompting the emergence of Parameter-Efficient Fine-Tunin…
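As an illustration of the reparameterization-based PEFT family surveyed here, below is a sketch of LoRA, its best-known member: the frozen pretrained weight W is reparameterized as W + (alpha/r) * B @ A, where A and B are small trainable low-rank matrices. The ranks, initializations, and shapes are illustrative defaults, not values from the survey.

import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, in_features, out_features, r=8, alpha=16):
        super().__init__()
        # Frozen stand-in for the pretrained weight; in practice this is loaded, not initialized
        self.weight = nn.Parameter(torch.empty(out_features, in_features), requires_grad=False)
        nn.init.normal_(self.weight, std=0.02)
        # Trainable low-rank factors: A gets a small random init, B starts at zero,
        # so training begins exactly at the pretrained behavior
        self.lora_A = nn.Parameter(torch.zeros(r, in_features))
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        nn.init.normal_(self.lora_A, std=0.01)
        self.scaling = alpha / r

    def forward(self, x):
        frozen = x @ self.weight.T
        update = (x @ self.lora_A.T) @ self.lora_B.T * self.scaling
        return frozen + update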
Posted on 2025-3-28 06:50:27
FIRP: Faster LLM Inference via Future Intermediate Representation Prediction
…nature of LLM decoding, which generates only a single token per forward propagation, fails to fully exploit the parallel computational power of GPUs, leading to considerable latency. To address this, we introduce a novel speculative decoding method named FIRP which generates multiple tokens instead…
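To show the kind of parallelism speculative decoding exploits, here is a generic greedy draft-and-verify loop. Note that FIRP itself predicts future intermediate hidden states inside one model rather than relying on a separate draft model; that detail is simplified away, and draft_next / target_next are assumed helpers that return the greedy next token id for a given prefix.

def speculative_decode(prefix, draft_next, target_next, k=4, max_new_tokens=32):
    tokens = list(prefix)
    while len(tokens) - len(prefix) < max_new_tokens:
        # 1) the cheap draft proposes k tokens autoregressively
        draft = []
        for _ in range(k):
            draft.append(draft_next(tokens + draft))
        # 2) the target model checks the proposals position by position
        #    (in a real system this verification is one batched forward pass)
        accepted = 0
        for i in range(k):
            if target_next(tokens + draft[:i]) == draft[i]:
                accepted += 1
            else:
                break
        tokens.extend(draft[:accepted])
        if accepted < k:
            # 3) on the first mismatch, fall back to the target model's own token
            tokens.append(target_next(tokens))
    return tokens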