Class Central Classrooms (beta): YouTube videos curated by Class Central.
Classroom Contents
Embeddings vs Fine-Tuning - Part 1: Understanding and Implementing Embeddings
- 1 Should I use embeddings or fine-tuning?
- 2 How does semantic search work?
- 3 How to use embeddings with a language model?
- 4 The two keys to success with embeddings
- 5 How do cosine similarity and dot product similarity work? (see the NumPy sketch after this list)
- 6 How to embed a dataset? (Touch Rugby Rules)
- 7 How to prepare data for embedding?
- 8 Chunking a dataset for embeddings (see the chunking sketch after this list)
- 9 What length of embeddings should I use?
- 10 Loading Llama 2 13B with GPTQ in Google Colab
- 11 Installing Llama 2 13B with GPTQ
- 12 Llama Performance without Embeddings
- 13 What embeddings should I use?
- 14 How to use OpenAI Embeddings
- 15 Using SBERT or "Marco" (MS MARCO) embeddings (see the sentence-transformers sketch after this list)
- 16 How to create embeddings from data
- 17 Calculating similarity using the dot product
- 18 Evaluating performance using embeddings
- 19 Using ChatGPT to Evaluate Performance of Embeddings
- 20 Llama 2 13B Incorrect, GPT-4 Correct
- 21 Llama 2 13B and GPT-4 Incorrect
- 22 Embeddings incorrect, and Llama 2 13B and GPT-4 hallucinate
- 23 Summary of Embeddings Performance with Llama 2 and GPT-4
- 24 Pro tips for further improving performance with embeddings
- 25 ColBERT approach to improve embeddings
- 26 Top Tips for using Embeddings with Language Models
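
Items 5 and 17 above deal with scoring how close two embeddings are. The formulas are standard, so a minimal NumPy sketch follows; the example vectors are made up for illustration and are not from the videos.

```python
import numpy as np

def dot_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Raw dot product: sensitive to vector magnitude as well as direction.
    return float(np.dot(a, b))

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity: dot product of unit-normalised vectors, in [-1, 1].
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

a = np.array([0.2, 0.1, 0.9])
b = np.array([0.3, 0.0, 0.8])
print(dot_similarity(a, b), cosine_similarity(a, b))
```

When the vectors are normalised to unit length, the two measures give the same ranking; the dot product additionally rewards longer vectors, while cosine similarity compares direction only.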
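Items 7 and 8 cover preparing and chunking the dataset (the Touch Rugby rules from item 6). As a rough companion, here is a minimal sketch of fixed-size word chunking with overlap; the chunk size, overlap, and file name are illustrative assumptions, not details taken from the videos.

```python
def chunk_words(text: str, chunk_size: int = 200, overlap: int = 40) -> list[str]:
    """Split text into overlapping word-window chunks."""
    words = text.split()
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + chunk_size]))
        if start + chunk_size >= len(words):  # last window reached the end
            break
    return chunks

# Hypothetical file name; substitute your own dataset.
text = open("touch_rugby_rules.txt", encoding="utf-8").read()
chunks = chunk_words(text)
```

Overlapping windows reduce the chance that an answer is split across a chunk boundary and therefore missed at retrieval time.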
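Items 15 through 17 create SBERT embeddings and rank them with the dot product. A minimal sketch using the sentence-transformers library follows; the checkpoint name is an assumption (the course mentions an MS MARCO-style model), and the query and chunks are placeholders.

```python
import numpy as np
from sentence_transformers import SentenceTransformer

# Checkpoint name is an assumption; swap in the MS MARCO SBERT model the course uses.
model = SentenceTransformer("multi-qa-MiniLM-L6-cos-v1")

chunks = [
    "A try is scored by placing the ball on or over the score line.",
    "Each team has six touches before possession changes over.",
]
query = "How many touches does a team get in touch rugby?"

# Encode chunks and query into the same vector space.
chunk_vecs = model.encode(chunks)   # shape: (n_chunks, dim)
query_vec = model.encode(query)     # shape: (dim,)

# Rank chunks by dot-product similarity to the query (item 17).
scores = np.dot(chunk_vecs, query_vec)
best = int(np.argmax(scores))
print(scores, chunks[best])
```

The top-scoring chunk is what would then be pasted into the language model's prompt, which is the pattern the evaluation videos (items 18 onward) measure.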