Prompt Engineering - Understanding Large Language Models with ChatGPT
Overview
Explore the inner workings of large language models and ChatGPT in this comprehensive 36-minute video tutorial on prompt engineering. Delve into the Transformer architecture, text embedding, and self-attention mechanisms. Learn about key deep learning methods, compare DALL-E and ChatGPT, and gain hands-on experience with Python code for multi-head self-attention and LLM parameter counting. Master the art of generating effective keywords and prompts to enhance your understanding of artificial intelligence and natural language processing technologies.
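The video walks through multi-head self-attention with PyTorch code; as a self-contained illustration of the same mechanism, here is a minimal NumPy sketch of scaled dot-product attention split across heads. All function names and the random projection weights are illustrative, not taken from the tutorial's code.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v):
    # scores scaled by sqrt(d_k), as in the original Transformer paper
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d_k)
    return softmax(scores, axis=-1) @ v

def multi_head_self_attention(x, num_heads, rng):
    # x: (seq_len, d_model); random weights stand in for learned projections
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    wq, wk, wv, wo = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                      for _ in range(4))
    q, k, v = x @ wq, x @ wk, x @ wv

    def split(t):
        # reshape to (num_heads, seq_len, d_head) so each head attends independently
        return t.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    heads = scaled_dot_product_attention(split(q), split(k), split(v))
    # concatenate heads back to (seq_len, d_model), then apply output projection
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ wo

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))  # 4 tokens, d_model = 8
out = multi_head_self_attention(x, num_heads=2, rng=rng)
print(out.shape)  # (4, 8): same shape in and out, as self-attention requires
```

The tutorial's own implementation uses PyTorch tensors and learned `nn.Linear` projections; the structure (project, split into heads, attend, concatenate, project) is the same.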
Syllabus
Content Intro
ChatGPT
Transformer architecture
Keyword Generation
Text embedding
Encoder and Decoder
Self-attention
Multi-head self-attention
PyTorch Code: Multi-head self-attention
Scaled Dot Product Attention
Key deep learning methods
Large language models
LLM Parameter Count Python Code
DALL-E large language model
Key Differences DALL-E & ChatGPT
List all Prompts
ChatGPT Session Summary
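The syllabus includes a Python segment on counting LLM parameters. A hedged back-of-the-envelope version of that idea is sketched below, using a standard decoder-only Transformer estimate (biases and layer norms omitted); the function name and the GPT-2-like hyperparameters are illustrative assumptions, not the tutorial's exact code.

```python
def transformer_param_count(vocab_size, d_model, num_layers, d_ff=None):
    """Rough parameter estimate for a decoder-only Transformer.

    Counts the token embedding table plus, per layer, the four attention
    projection matrices and the two feed-forward matrices. Biases and
    layer-norm parameters are small and omitted for simplicity.
    """
    d_ff = d_ff or 4 * d_model              # common convention: d_ff = 4 * d_model
    embedding = vocab_size * d_model        # token embedding table
    attention = 4 * d_model * d_model       # Wq, Wk, Wv, Wo
    ffn = 2 * d_model * d_ff                # up- and down-projection
    return embedding + num_layers * (attention + ffn)

# GPT-2-small-like hyperparameters (illustrative)
total = transformer_param_count(vocab_size=50257, d_model=768, num_layers=12)
print(f"{total:,}")  # 123,532,032 — roughly the ~124M often quoted for GPT-2 small
```

Estimates like this explain why parameter counts grow quadratically in `d_model` but only linearly in depth and vocabulary size.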
Taught by
Prodramp