
Stanford Seminar 2022 - Self Attention and Non-Parametric Transformers

Stanford University via YouTube

Overview

Explore the origins and intuitions of Transformers, followed by an in-depth discussion of Non-Parametric Transformers (NPTs), in this Stanford seminar. It begins with a 15-minute overview of Transformer fundamentals by Aidan, a PhD student at Oxford and co-founder of Cohere. Neil and Jannik, both PhD students at the University of Oxford, then present NPTs, recently accepted at NeurIPS. Gain insights into large neural networks, Bayesian deep learning, active learning, and how non-parametric models can be combined with Transformers. Learn from these emerging researchers as they share their expertise in building and implementing advanced AI models.
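For readers new to the seminar's core topic, the scaled dot-product self-attention that both talks build on can be sketched in a few lines. This is a minimal illustrative implementation with NumPy, not code from the seminar; all names and shapes are assumptions. (NPTs extend this idea by also attending between datapoints in a dataset, rather than only between features of a single input.)

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row-wise max for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """X: (n_tokens, d_model); Wq/Wk/Wv: (d_model, d_k) projections (illustrative)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Similarity of every token's query with every token's key,
    # scaled by sqrt(d_k) as in the original Transformer.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V                  # weighted mix of value vectors

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))             # 4 tokens, model dimension 8
Wq = rng.normal(size=(8, 8))
Wk = rng.normal(size=(8, 8))
Wv = rng.normal(size=(8, 8))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one output vector per token
```

Each output row is a convex combination of the value vectors, with mixing weights determined by query-key similarity; stacking such layers (plus feed-forward blocks and residual connections) gives the standard Transformer discussed in the first part of the talk.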

Syllabus

CS25 | Stanford Seminar 2022 - Self Attention and Non-Parametric Transformers (NPTs)

Taught by

Stanford Online
