Overview
Learn how to fine-tune a local Mistral 7B model step-by-step in this comprehensive tutorial video. Follow along as the instructor demonstrates the entire process, from creating a self-generated dataset to testing the final fine-tuned model. Discover techniques for building the dataset, converting it to JSONL format, uploading it, launching the fine-tuning job, downloading the resulting model, converting it to GGUF format, and testing the result. Gain insights into using GitHub resources, llama.cpp, and other tools to deepen your understanding of local language model fine-tuning.
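The overview mentions converting the self-generated dataset to JSONL before uploading it. As a rough illustration only (the video's actual script and schema may differ), a conversion could look like the sketch below; the file name train.jsonl and the messages/role/content fields are assumptions for this example, not taken from the video.

```python
# Minimal sketch: write a small self-generated Q&A dataset as JSONL,
# one training example per line. The chat-style "messages" schema is
# an assumption; adjust it to whatever your fine-tuning tool expects.
import json

# Hypothetical self-generated question/answer pairs.
examples = [
    {"question": "What is Mistral 7B?",
     "answer": "Mistral 7B is an open-weight 7-billion-parameter language model."},
    {"question": "What is the GGUF format?",
     "answer": "GGUF is the model file format used by llama.cpp for local inference."},
]

with open("train.jsonl", "w", encoding="utf-8") as f:
    for ex in examples:
        record = {
            "messages": [
                {"role": "user", "content": ex["question"]},
                {"role": "assistant", "content": ex["answer"]},
            ]
        }
        # json.dumps keeps each record on a single line, as JSONL requires.
        f.write(json.dumps(record, ensure_ascii=False) + "\n")
```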
Syllabus
Local Fine-Tune Intro
Flowchart
Create Mistral 7B Dataset
Check Dataset
Upload Dataset
Start Fine-Tune Job
Convert Model to GGUF
Testing Our Fine-Tuned Model
Conclusion
Taught by
All About AI