Class Central Classrooms (beta): YouTube videos curated by Class Central.
Classroom Contents
Fine-Tune GPT-3 to Write an Entire Coherent Novel - Part 1
- 1 - Fine-tuning GPT-3 for fiction
- 2 - Generating fiction from a single story premise
- 3 - GPT-3 copying a style verbatim
- 4 - Writing a novel: process, what works, and what doesn't
- 5 - Creating a fan fiction generator
- 6 - Generating fan fiction with a machine
- 7 - Training a machine to write a story
- 8 - Frankenstein and Alice in Wonderland outlines
- 9 - Generating a story outline with Auto Muse
- 10 - The difference between working memory and recall
- 11 - The need for a task set in GPT-3
- 12 - Generating a live summary of a story
- 13 - Preprocessing text data for GPT-3
- 14 - Generating a novel with Auto Muse
- 15 - Writing a novel one paragraph at a time
- 16 - The increasing length of the story
- 17 - The process of creating a machine that can summarize a novel
- 18 - Building a story one chunk at a time
- 19 - The length of each summary chunk