

Building a Chat Assistant with Canopy and Anyscale Endpoints

Anyscale via YouTube

Overview

Explore the challenges of building a chat assistant and discover how Canopy and Anyscale Endpoints offer a fast and easy way to create RAG-based applications for free in this 45-minute webinar. Dive into the architecture, examine a real-life example, and follow a guide to getting started with building your own chat assistant. Learn about Canopy, a flexible framework built on top of the Pinecone vector database that provides libraries and a simple API for chunking, embedding, chat history management, query optimization, and context retrieval. Gain insights into Anyscale Endpoints, a performant LLM API for building AI-based applications, offering a serverless service for serving and fine-tuning open LLMs like Llama-2 and Mistral. Discover how Anyscale Endpoints now provides an embedding endpoint and allows fine-tuning of the largest Llama-2 model (70B), giving you flexibility with open LLMs through an API.
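
As a rough illustration of the "open LLMs through an API" idea, the sketch below calls Anyscale Endpoints with the standard OpenAI Python client, since the service exposes an OpenAI-compatible chat completions API. The base URL, model name, and environment variable are illustrative assumptions rather than details taken from the webinar; in a full Canopy application, Canopy would sit in front of this call and inject context retrieved from Pinecone into the messages.

import os

from openai import OpenAI

# Minimal sketch: point the OpenAI client at Anyscale Endpoints.
# Base URL, model name, and env var name are assumptions; check Anyscale's docs.
client = OpenAI(
    base_url="https://api.endpoints.anyscale.com/v1",
    api_key=os.environ["ANYSCALE_API_KEY"],
)

completion = client.chat.completions.create(
    model="meta-llama/Llama-2-70b-chat-hf",  # example open model; Mistral models are also served
    messages=[
        {"role": "system", "content": "You are a helpful chat assistant."},
        {"role": "user", "content": "What does Canopy add on top of Pinecone?"},
    ],
)
print(completion.choices[0].message.content)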

Syllabus

Build a chat assistant fast using Canopy from Pinecone and Anyscale Endpoints

Taught by

Anyscale

