Stanford Seminar 2022 - Self Attention and Non-Parametric Transformers
Stanford University via YouTube
Overview
Explore the origins and intuitions of Transformers, followed by an in-depth discussion of Non-Parametric Transformers (NPTs), in this Stanford seminar. It begins with a 15-minute overview of Transformer fundamentals by Aidan, a PhD student at Oxford and co-founder of Cohere. Then, Neil and Jannik, both PhD students at the University of Oxford, present NPTs, which were recently accepted at NeurIPS. Gain insights into massive neural networks, Bayesian deep learning, active learning, and the combination of non-parametric models with Transformers. Learn from these emerging researchers as they share their expertise in building and implementing advanced AI models.
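As background for the seminar's first part, the core operation of a Transformer is scaled dot-product self-attention. The following is a minimal NumPy sketch (not code from the talk; the function name and random projection matrices are illustrative assumptions), showing how each position in a sequence attends to every other position:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (n, d).

    Illustrative sketch: projects X into queries, keys, and values,
    scores all pairs of positions, and returns a weighted sum of values.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Pairwise similarity scores, scaled by sqrt of key dimension
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Row-wise softmax turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a mixture of value rows

rng = np.random.default_rng(0)
n, d = 4, 8  # hypothetical sequence length and model dimension
X = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # → (4, 8)
```

NPTs, as discussed in the seminar, extend this idea by also attending *between datapoints* in a dataset, rather than only between positions within a single input.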
Syllabus
CS25 I Stanford Seminar 2022 - Self-Attention and Non-Parametric Transformers (NPTs)
Taught by
Stanford Online