
YouTube

Using MPT-7B in Hugging Face and LangChain

James Briggs via YouTube

Overview

Explore the implementation of MosaicML's MPT-7B language model in Hugging Face Transformers and LangChain. Learn how to use the different MPT-7B variants, including the Instruct, Chat, and StoryWriter-65k+ versions, and how to connect them to tooling such as AI agents and chatbots. Follow along with the Python setup, model initialization, tokenizer configuration, stopping criteria, and text generation steps. Discover the potential of open-source LLMs and how they integrate with popular NLP libraries.
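
The Hugging Face side of that workflow looks roughly like the sketch below. It assumes the mosaicml/mpt-7b-instruct checkpoint, the GPT-NeoX-20B tokenizer that MPT-7B was trained with, and a GPU; the stop strings and generation settings are illustrative placeholders, and the exact values used in the video may differ.

```python
import torch
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    StoppingCriteria,
    StoppingCriteriaList,
    pipeline,
)

model_id = "mosaicml/mpt-7b-instruct"
device = "cuda" if torch.cuda.is_available() else "cpu"

# MPT-7B ships custom modeling code on the Hub, so trust_remote_code is required.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
)
model.eval()
model.to(device)

# MPT-7B reuses the GPT-NeoX-20B tokenizer.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

# Stop generation when the model starts a new dialogue turn
# (illustrative stop strings; pick ones that match your prompt format).
stop_token_ids = [tokenizer(s)["input_ids"] for s in ["Human:", "AI:"]]

class StopOnTokens(StoppingCriteria):
    def __call__(self, input_ids, scores, **kwargs) -> bool:
        # Return True as soon as the newest tokens match any stop sequence.
        for ids in stop_token_ids:
            if input_ids[0][-len(ids):].tolist() == ids:
                return True
        return False

generate = pipeline(
    task="text-generation",
    model=model,
    tokenizer=tokenizer,
    stopping_criteria=StoppingCriteriaList([StopOnTokens()]),
    max_new_tokens=128,
    repetition_penalty=1.1,
)

result = generate("Explain the difference between nuclear fission and fusion.")
print(result[0]["generated_text"])
```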

Syllabus

Open Source LLMs like MPT-7B
MPT-7B Models in Hugging Face
Python setup
Initializing MPT-7B-Instruct
Initializing the MPT-7B tokenizer
Stopping Criteria and HF Pipeline
Hugging Face Pipeline
Generating Text with Hugging Face
Implementing MPT-7B in LangChain (see the sketch below)
Final Thoughts on Open Source LLMs
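
For the LangChain step, a minimal sketch along these lines wraps the `generate` pipeline from the snippet above in LangChain's HuggingFacePipeline wrapper. The import paths assume a 2023-era langchain release; newer versions expose the same wrapper from langchain_community or langchain_huggingface.

```python
from langchain.llms import HuggingFacePipeline
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

# Wrap the Hugging Face pipeline so it behaves like any other LangChain LLM.
llm = HuggingFacePipeline(pipeline=generate)

# A simple prompt template and chain to exercise the wrapped model.
prompt = PromptTemplate(
    input_variables=["query"],
    template="Question: {query}\n\nAnswer:",
)
chain = LLMChain(llm=llm, prompt=prompt)

print(chain.run(query="What tooling does LangChain add on top of an LLM?"))
```

From here the wrapped LLM can be dropped into LangChain agents, memory, and other chains, which is where the agent and chatbot tooling mentioned in the overview comes in.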

Taught by

James Briggs
