树上结蜜糖 posted on 2025-3-27 06:57:23

Robust Representation Learning. This chapter identifies different robustness needs and characterizes important robustness problems in NLP representation learning, including backdoor robustness, adversarial robustness, out-of-distribution robustness, and interpretability. We also discuss current solutions and future directions for each problem.

stress-response posted on 2025-3-27 11:52:25

Biomedical Knowledge Representation Learning. In this chapter, with biomedical knowledge as the core, we launch a discussion on knowledge representation and acquisition as well as biomedical knowledge-guided NLP tasks, and explain them in detail with practical scenarios. We also discuss current research progress and several future directions.

松果 posted on 2025-3-27 15:49:07

Pre-trained Models for Representation Learning. This chapter covers pre-training for representation learning, from pre-training tasks to adaptation approaches for specific tasks. After that, we discuss several advanced topics toward better pre-trained representations, including better model architectures, multilingual and multi-task learning, efficient representations, and chain-of-thought reasoning.

UTTER posted on 2025-3-27 18:33:44

OpenBMB: Big Model Systems for Large-Scale Representation Learning. OpenBMB is an open-source suite of big model systems built to address the computation and expertise demands of big model applications. In this chapter, we will introduce the core toolkits in OpenBMB, including BMTrain for efficient training, OpenPrompt and OpenDelta for efficient tuning, BMCook for efficient compression, and BMInf for efficient inference.
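As a rough illustration of the prompt-tuning workflow that OpenPrompt targets, here is a minimal sketch using its public building blocks (load_plm, ManualTemplate, ManualVerbalizer, PromptForClassification); the backbone model name and label words are illustrative assumptions, not taken from the chapter.

# Minimal OpenPrompt sketch: wrap a pre-trained LM for prompt-based classification.
# The model name and label words below are placeholder choices for illustration only.
from openprompt.plms import load_plm
from openprompt.prompts import ManualTemplate, ManualVerbalizer
from openprompt import PromptForClassification

# Load a pre-trained language model together with its tokenizer and wrapper class.
plm, tokenizer, model_config, WrapperClass = load_plm("bert", "bert-base-cased")

# Cloze-style template: the input text followed by a masked slot to be filled.
template = ManualTemplate(
    text='{"placeholder":"text_a"} It was {"mask"}.',
    tokenizer=tokenizer,
)

# Map vocabulary words predicted at the masked position to task labels.
verbalizer = ManualVerbalizer(
    classes=["negative", "positive"],
    label_words={"negative": ["bad"], "positive": ["good"]},
    tokenizer=tokenizer,
)

# Combine backbone, template, and verbalizer into a prompt-based classifier,
# which can then be tuned efficiently or used zero-shot.
prompt_model = PromptForClassification(plm=plm, template=template, verbalizer=verbalizer)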

View full version: Titlebook: Representation Learning for Natural Language Processing; Zhiyuan Liu, Yankai Lin, Maosong Sun; Book, 2023, latest edition; The Editor(s)