Title | Foundation Models for Natural Language Processing
Subtitle | Pre-trained Language Models Integrating Media
Editors | Gerhard Paaß, Sven Giesselbach
Overview | Offers an overview of pre-trained language models such as BERT, GPT, and sequence-to-sequence Transformers. Explains the key techniques to improve the performance of pre-trained models. Presents advanced
Series Title | Artificial Intelligence: Foundations, Theory, and Algorithms
Description | This open access book provides a comprehensive overview of the state of the art in research and applications of Foundation Models and is intended for readers familiar with basic Natural Language Processing (NLP) concepts. In recent years, a revolutionary new paradigm has been developed for training models for NLP. These models are first pre-trained on large collections of text documents to acquire general syntactic knowledge and semantic information. Then, they are fine-tuned for specific tasks, which they can often solve with superhuman accuracy. When the models are large enough, they can be instructed by prompts to solve new tasks without any fine-tuning. Moreover, they can be applied to a wide range of different media and problem domains, ranging from image and video processing to robot control learning. Because they provide a blueprint for solving many tasks in artificial intelligence, they have been called Foundation Models. After a brief introduction to basic NLP models, the main pre-trained language models BERT, GPT, and sequence-to-sequence Transformers are described, as well as the concepts of self-attention and context-sensitive embedding. Then, different approaches
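As a minimal illustration of the pre-train/fine-tune paradigm the description refers to (not an example from the book itself), the sketch below loads a pre-trained BERT model with the Hugging Face transformers library and runs a few fine-tuning steps on a tiny, hypothetical sentiment dataset; the model name, data, and hyperparameters are assumptions chosen only for illustration.

```python
# Sketch of the pre-train / fine-tune paradigm, assuming the Hugging Face
# `transformers` and `torch` packages are installed. Data and settings are
# hypothetical and chosen only to keep the example short.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load a model that was pre-trained on large collections of text documents.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# A tiny, made-up labelled dataset for a specific downstream task (sentiment).
texts = ["the movie was great", "the movie was terrible"]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, return_tensors="pt")

# Fine-tune: a few gradient steps on the task-specific data.
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
for _ in range(3):
    outputs = model(**batch, labels=labels)
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```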
Publication Date | Book 2023
Keywords | Pre-trained Language Models; Deep Learning; Natural Language Processing; Transformer Models; BERT; GPT; At
Edition | 1
doi | https://doi.org/10.1007/978-3-031-23190-2 |
isbn_softcover | 978-3-031-23192-6 |
isbn_ebook | 978-3-031-23190-2
issn_series | 2365-3051 (print); 2365-306X (electronic)
copyright | The Editor(s) (if applicable) and The Author(s) 2023 |