Title | Advancements in Knowledge Distillation: Towards New Horizons of Intelligent Systems
Editors | Witold Pedrycz, Shyi-Ming Chen
Series | Studies in Computational Intelligence
Type | Book, 2023
Highlights | Comprehensive and up-to-date treatise of knowledge distillation cast in a general framework of transfer learning. Focuses on a spectrum of methodological and algorithmic issues. Includes recent developments…
About this book | The book provides timely coverage of the paradigm of knowledge distillation, an efficient way of model compression. Knowledge distillation is positioned in the general setting of transfer learning, in which a lightweight student model is learned from a large teacher model. The book covers a variety of training schemes, teacher–student architectures, and distillation algorithms, as well as recent developments in vision and language learning, relational architectures, multi-task learning, and representative applications to image processing, computer vision, edge intelligence, and autonomous systems. The book is relevant to a broad audience, including researchers and practitioners in machine learning pursuing fundamental and applied research on advanced learning paradigms.
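The abstract describes knowledge distillation as learning a lightweight student model from a large teacher model. The sketch below illustrates that idea with the widely used soft-label distillation objective (cross-entropy on hard labels plus a temperature-scaled KL term toward the teacher's predictions). It is a generic illustration, not code from the book; the layer sizes, temperature `T`, mixing weight `alpha`, and the random stand-in batch are assumptions chosen for brevity.

```python
# Minimal sketch of soft-label knowledge distillation, assuming a frozen
# teacher and a smaller student; all sizes and hyperparameters are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(784, 1200), nn.ReLU(), nn.Linear(1200, 10))  # large teacher (assumed pretrained)
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))      # lightweight student to deploy

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Weighted sum of the usual cross-entropy and a KL term that pulls the
    student's temperature-softened distribution toward the teacher's."""
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # T^2 keeps gradient magnitudes comparable across temperatures
    return alpha * hard + (1.0 - alpha) * soft

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))  # stand-in batch

with torch.no_grad():          # teacher is frozen during distillation
    t_logits = teacher(x)
s_logits = student(x)
loss = distillation_loss(s_logits, t_logits, y)
loss.backward()
optimizer.step()
```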