Title: Building Generative AI-Powered Apps: A Hands-on Guide for…; Aarushi Kansal; Book, 2024; © Aarushi Kansal 2024. Keywords: Artificial Intelligence, Generative A…

Thread starter: estradiol
Posted 2025-3-25 03:21:48
Introduction to Generative AI: …and at some point, a manager is probably going to ask you "can we do generative AI too?", or you're going to get tempted and hack together an LLM-powered bot at 2 a.m. This chapter introduces you, a software engineer, to the booming world of AI by cutting through all the hype and demystifying AI. I…
Posted 2025-3-25 08:32:31
LangChain: Your Swiss Army Knife: …This chapter introduces you to LangChain, your Swiss Army knife for building robust applications on top of LLMs and other models. As you build applications beyond just making API calls, you're going to need various components to connect a model to your own data, to external data, and to services, and that's wh…
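LangChain's actual API surface changes quickly, so as a rough illustration of the core idea only, the "chain" pattern the abstract alludes to (small composable steps that pipe a prompt template into a model and then into an output parser) can be sketched in plain Python. Everything below is a hypothetical stand-in, not LangChain's real classes:

```python
# Toy illustration of the "chain" idea: composable steps, each piping
# its output into the next via the | operator. Not LangChain's API.

class Step:
    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):          # allows: step1 | step2
        return Step(lambda x: other.fn(self.fn(x)))

    def invoke(self, x):
        return self.fn(x)

# Hypothetical components standing in for a prompt template,
# an LLM call, and an output parser.
prompt = Step(lambda d: f"Translate to French: {d['text']}")
model = Step(lambda p: f"[model output for: {p}]")   # fake LLM call
parser = Step(lambda s: s.strip("[]"))

chain = prompt | model | parser
print(chain.invoke({"text": "hello"}))
```

The point of the pattern is that each component stays independently testable while the composition reads left to right.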
Posted 2025-3-25 13:04:47
Chains, Tools and Agents: …bot that answered your questions and could remember the rest of your conversation. This allowed the LLM to become "smarter" by getting context from history. Your chatbot also had access to up-to-date, personal information via a vector database, meaning it was able to answer questions beyond what it wa…
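The two ingredients this abstract describes, conversation memory plus retrieval from a vector store, can be sketched without any framework. The "embedding" below is a deliberately naive bag-of-words vector and every name is illustrative; real applications use a learned embedding model:

```python
import math
from collections import Counter

def embed(text):
    """Naive bag-of-words 'embedding' (real apps use a model)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class Chatbot:
    def __init__(self, documents):
        self.history = []                                # conversation memory
        self.store = [(d, embed(d)) for d in documents]  # toy vector store

    def retrieve(self, query):
        q = embed(query)
        return max(self.store, key=lambda item: cosine(q, item[1]))[0]

    def ask(self, question):
        context = self.retrieve(question)   # ground the answer in stored data
        self.history.append(question)       # remember the turn
        return f"Based on: {context!r}"

bot = Chatbot(["Alice's birthday is in May", "The office is in Berlin"])
print(bot.ask("Where is the office?"))
```

Grounding answers in retrieved documents rather than the model's weights is what lets the bot answer beyond its training data, and it is also why retrieval helps reduce hallucination.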
Posted 2025-3-25 15:58:52
Guardrails and AI: Building Safe + Controllable Apps: …your day for you. This agent was able to reason and had access to "the world" via API integrations (the so-called tools). This was a fairly simple application, but it was still autonomous, and when AI is autonomous, there is always space for things to go wrong if proper safeguards are not in place…
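At its core, the agent pattern this abstract refers to is a loop: a model picks a tool, the tool runs against "the world", and the observation comes back as the answer. A stripped-down sketch with a hard-coded keyword "reasoner" standing in for an LLM; the tool names and APIs here are entirely illustrative:

```python
# Minimal agent sketch: a "reasoner" chooses a tool, the tool executes,
# and the observation becomes the answer. A keyword matcher stands in
# for the real LLM that would normally do the reasoning.

def get_weather(city):
    return f"Sunny in {city}"          # stand-in for a weather API

def get_time(city):
    return f"09:00 in {city}"          # stand-in for a time API

TOOLS = {"weather": get_weather, "time": get_time}

def reason(question):
    """Fake LLM: map the question to a (tool, argument) pair."""
    for name in TOOLS:
        if name in question.lower():
            return name, question.split()[-1].rstrip("?")
    return None, None

def run_agent(question):
    tool, arg = reason(question)
    if tool is None:
        return "I don't have a tool for that."
    observation = TOOLS[tool](arg)     # the agent acts on "the world"
    return f"Tool '{tool}' says: {observation}"

print(run_agent("What is the weather in Paris?"))
```

Even in this toy form the safety concern is visible: whatever `reason` returns gets executed, so an unguarded agent will happily call any tool with any argument, which is exactly the gap guardrails are meant to close.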
Posted 2025-3-25 20:44:36
Finetuning: The Theory: …guardrails around ensuring your LLM stays on topic, executes the right flow, and is able to block users. You looked into NeMo and understood how it combines LLMs, Colang, and embedding models to create a generalized set of rules, based on natural-language rules you give it.
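For context on the "natural-language rules" mentioned here: NeMo Guardrails rules are written in Colang, where you define canonical user and bot messages by example and then tie them together in flows. A minimal Colang 1.0-style fragment (the message names and example utterances are illustrative, not from the book) looks roughly like this:

```colang
define user ask politics
  "what do you think about the government?"
  "who should I vote for?"

define bot refuse politics
  "I can only help with questions about our product."

define flow politics
  user ask politics
  bot refuse politics
```

The embedding model generalizes from the example utterances, so inputs similar in meaning to the listed phrases also trigger the flow, even if they match none of them word for word.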
Posted 2025-3-26 03:27:45
Finetuning: Hands on: …models. You learned about the whys, whats, and hows of fine-tuning. You learned that fine-tuning can be less resource- and time-consuming than building and training a model from scratch. The previous chapter talked to you about what happens to the neural network during the fine-tuning process, spe…
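The resource argument can be made concrete with a toy example: gradient descent started from "pretrained" weights that are already close to the target reaches a given loss in far fewer steps than gradient descent started from scratch. A pure-Python sketch fitting y = 2x with a one-parameter model; everything here is illustrative, not the book's code:

```python
# Toy "fine-tuning": gradient descent on a 1-parameter model y = w * x.
# Starting from a pretrained weight near the optimum converges in far
# fewer steps than starting from zero, mirroring the resource argument.

data = [(x, 2.0 * x) for x in range(1, 6)]   # target weight: w = 2.0

def loss(w):
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

def train(w, lr=0.01, tol=1e-6):
    steps = 0
    while loss(w) > tol:
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
        steps += 1
    return w, steps

_, steps_scratch = train(w=0.0)    # "from scratch"
_, steps_finetune = train(w=1.9)   # "pretrained" weight near 2.0
print(steps_scratch, steps_finetune)
```

Real fine-tuning updates millions of parameters (or a low-rank subset of them), but the intuition is the same: the pretrained starting point does most of the work.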
https://doi.org/10.1007/978-3-540-24785-2
…history. Your chatbot also had access to up-to-date, personal information via a vector database, meaning it was able to answer questions beyond what it was trained on. This also helped prevent hallucination.