
GPT-NeoX-20B - Open-Source Huge Language Model by EleutherAI - Interview With Co-Founder Connor Leahy

Yannic Kilcher via YouTube

Overview

Explore the development and capabilities of GPT-NeoX-20B, a 20-billion-parameter open-source language model, in this insightful interview with EleutherAI co-founder Connor Leahy. Discover how the model was trained, how the hardware was acquired, and how the model performs. Learn about the differences between GPT-Neo, GPT-J, and GPT-NeoX, and gain insights into the challenges of training large language models. Find out how to try the model yourself using GooseAI, and hear final thoughts on the project's impact and future potential.
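
Beyond GooseAI's hosted API (discussed in the video), the released weights are openly available on the Hugging Face Hub under EleutherAI/gpt-neox-20b. The snippet below is a minimal sketch, not from the video, assuming the transformers and accelerate libraries are installed and roughly 40 GB of GPU memory is available for half-precision inference.

```python
# Minimal sketch: load the open GPT-NeoX-20B weights and generate text.
# Assumes transformers + accelerate are installed and ~40 GB GPU memory (fp16).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-neox-20b",
    torch_dtype=torch.float16,  # half precision to fit the 20B model in memory
    device_map="auto",          # spread layers across available GPUs/CPU
)

prompt = "EleutherAI's GPT-NeoX-20B is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```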

Syllabus

- Intro
- Start of interview
- How did you get all the hardware?
- What's the scale of this model?
- A look into the experimental results
- Why are there GPT-Neo, GPT-J, and GPT-NeoX?
- How difficult is training these big models?
- Try out the model on GooseAI
- Final thoughts

Taught by

Yannic Kilcher
