Fine-Tune GPT-3 to Write an Entire Coherent Novel - Part 2

David Shapiro ~ AI via YouTube

Overview

Explore advanced techniques for fine-tuning GPT-3 to generate entire coherent novels in this video tutorial. Learn how Auto Muse generates a book through successive chunks, builds rolling book summaries, and iteratively condenses text. Discover strategies for dealing with the time decay of information, shortening outlines, and reducing word counts with GPT-3. Work through practical examples drawn from classic literature, including Pride and Prejudice, The Great Gatsby, and The Adventures of Sherlock Holmes. Finally, tackle the challenges of GPT-3's 6,000-character prompt limit and of fine-tuning a model for a specific genre such as fan fiction.
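The video walks through the author's own Auto Muse scripts; as a rough illustration of the chunk-and-summarize loop it describes, here is a minimal sketch against the legacy (pre-1.0) `openai` Python SDK from the GPT-3 era. The engine name, prompt wording, chunk size, and helper names (`complete`, `summarize_recursively`, `write_next_chunk`) are illustrative assumptions, not the video's exact code.

```python
import openai

openai.api_key = "sk-..."  # your API key

def complete(prompt: str, max_tokens: int = 512) -> str:
    """One call to the legacy GPT-3 completion endpoint."""
    response = openai.Completion.create(
        engine="davinci",  # assumed base engine; the video later fine-tunes its own
        prompt=prompt,
        max_tokens=max_tokens,
        temperature=0.7,
    )
    return response["choices"][0]["text"].strip()

def summarize_recursively(text: str, chunk_chars: int = 5000) -> str:
    """Summarize each chunk, then summarize the summaries, until the result
    fits in one prompt -- countering the time decay of early plot details."""
    if len(text) <= chunk_chars:
        return text
    chunks = [text[i:i + chunk_chars] for i in range(0, len(text), chunk_chars)]
    summaries = [
        complete(f"Summarize the following passage concisely:\n\n{c}\n\nSummary:")
        for c in chunks
    ]
    return summarize_recursively("\n".join(summaries), chunk_chars)

def write_next_chunk(story_so_far: str) -> str:
    """Condense everything written so far, then ask GPT-3 for the next chunk."""
    summary = summarize_recursively(story_so_far)
    return complete(f"{summary}\n\nContinue the novel:\n")
```

Looping `write_next_chunk` and appending each result to the manuscript is the essence of the successive-chunks approach: the rolling summary keeps the prompt under the character limit while carrying plot continuity forward.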

Syllabus

- Auto Muse: Generating a book through successive chunks
- Writing the next chunk of a novel
- Generating book summaries
- Summarizing the summaries
- The time decay of information
- Shortening the Outlines
- Using GPT-3 to reduce the word count of a passage
- Iteratively making a text more concise
- The futility of the human condition
- Summarizing Pride and Prejudice
- Solving the case of the extra spaces
- Using GPT-3 to generate summaries
- Training GPT-3 to write a novel
- The limit of 6,000 characters for a GPT-3 prompt
- Trying to make a super concise summary
- The Great Gatsby
- The Adventures of Sherlock Holmes
- Fine-tuning a GPT-3 model for Sherlock Holmes fan fiction (see the data-prep sketch after this list)
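For the fine-tuning steps, GPT-3's legacy fine-tune endpoint consumed JSONL files of prompt/completion pairs. Below is a plausible way to prepare such a file for the "summary in, next chunk out" pairing the video describes; the record layout, the `###` separator, the `END` stop token, and the helper name are assumptions drawn from OpenAI's legacy fine-tuning conventions, not the video's exact code.

```python
import json
import re

def build_training_file(chunks: list[str], summaries: list[str], path: str) -> None:
    """Write JSONL fine-tuning records. summaries[i] is assumed to be the
    rolling summary of the story through chunks[i]; the completion is the
    chunk that immediately follows it."""
    with open(path, "w") as f:
        for summary, next_chunk in zip(summaries, chunks[1:]):
            # Collapse runs of spaces -- the "extra spaces" issue from the video.
            next_chunk = re.sub(r"[ \t]{2,}", " ", next_chunk)
            record = {
                "prompt": f"{summary}\n\n###\n\n",   # separator marks end of prompt
                "completion": f" {next_chunk} END",  # leading space + stop token
            }
            f.write(json.dumps(record) + "\n")
```

With the file written, the legacy CLI started a job with `openai api fine_tunes.create -t pairs.jsonl -m davinci`, and the resulting model is what gets prompted for genre-specific continuations such as Sherlock Holmes fan fiction.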

Taught by

David Shapiro ~ AI
