Overview
Discover the origins of Hugging Face and the Transformers library in this insightful 43-minute talk by Thomas Wolf, co-founder and Chief Science Officer of Hugging Face. Explore the concept of transfer learning in Natural Language Processing (NLP) and its significance in the field. Learn about open-source models, modern architectures, and the importance of datasets in AI development. Gain insights into community-driven initiatives, the Data Hub, and the open-science BigScience project. Delve into scaling challenges, reproduction issues, and the role of supercomputers in AI research. Understand the impact of the BigScience papers and get a glimpse of future developments in the field. Engage with thought-provoking questions and answers to deepen your understanding of Hugging Face's contributions to the AI and NLP landscape.
Syllabus
Introduction
The beginning of Hugging Face
GPT
Why Transfer Learning in NLP
Open Source Models
BERT in Python
PyTorch Pretrained BERT
Transformers
Modern Architecture
Datasets
Community Event
Data Hub
Open Science: BigScience
Scaling
Reproduction
Problems
Supercomputers
BigScience Papers
What's next
Questions
Taught by
Hugging Face