Overview
Learn how to fine-tune an OpenAI GPT-3.5 Turbo model using enterprise web data in this hands-on Python workshop. Explore the importance of fine-tuning LLMs with ad-hoc data and discover how to leverage open-source libraries such as LangChain and ChromaDB. Follow along as the instructor demonstrates setting up the OpenAI API key, configuring embeddings, creating a knowledge base, and implementing a ChatGPT-like interface for querying customer web data. Gain practical insights into working with LLM application frameworks, embedding databases, and continuous querying techniques. By the end of this 35-minute tutorial, you'll have the skills to create a customized language model tailored to your specific enterprise data needs.
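The steps named in the overview map onto a short ingestion-and-query pipeline. The sketch below is a minimal illustration, assuming the classic (pre-1.0) LangChain Python API together with ChromaDB and the OpenAI API; the URL, persist directory, API key value, and sample question are placeholders, not values taken from the workshop.

```python
import os

from langchain.chains import RetrievalQA
from langchain.chat_models import ChatOpenAI
from langchain.document_loaders import WebBaseLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.vectorstores import Chroma

# 1. Set the OpenAI API key used by both the embeddings and the chat model.
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder

# 2. Load enterprise web pages as source documents.
loader = WebBaseLoader(["https://example.com/product-docs"])  # placeholder URL
documents = loader.load()

# 3. Split the pages into overlapping chunks suitable for embedding.
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
chunks = splitter.split_documents(documents)

# 4. Embed the chunks and persist them as a local ChromaDB knowledge base.
embeddings = OpenAIEmbeddings()
knowledge_base = Chroma.from_documents(
    documents=chunks,
    embedding=embeddings,
    persist_directory="./kb_db",  # placeholder directory
)
knowledge_base.persist()

# 5. Wire GPT-3.5 Turbo to the knowledge base for question answering.
qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0),
    chain_type="stuff",
    retriever=knowledge_base.as_retriever(),
)

print(qa.run("What does the product documentation say about pricing?"))
```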
Syllabus
Content Intro
The Problem
The Solution
Working Solution Demo
Understanding Solution
Open-source libs
Web Data as source content
Testing UI without action
Python Libs Installed
Python Coding Starts
Setting OpenAI API Key
Setting Embeddings
Setting Chunk Splitter
Setting Embeddings Model
Create & Persist Embeddings
Test Embeddings Code
Setting Langchain App
Adding Query to KB DB
Testing Query with KB
Continuous Queries (see the code sketch after this list)
Open-source Libs Review
OpenAI Billing
Source code
Recap
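The "Adding Query to KB DB", "Testing Query with KB", and "Continuous Queries" steps in the syllabus amount to repeatedly passing user questions to the retrieval chain. A minimal sketch of such a loop, reusing the hypothetical `qa` chain built in the earlier snippet:

```python
# Continuous-query loop: keep asking the knowledge base until the user exits.
# Assumes `qa` is the RetrievalQA chain created in the previous sketch.
while True:
    question = input("Ask about your web data (or type 'exit'): ").strip()
    if question.lower() in {"exit", "quit", ""}:
        break
    print(qa.run(question))
```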
Taught by
Prodramp