Overview
Explore effective transfer learning techniques for Natural Language Processing in this 44-minute conference talk from ODSC East 2018. Dive into the challenges and opportunities of applying transfer learning to NLP tasks, contrasting its success in computer vision with its more limited gains in language processing. Learn about approaches that use sequence representations instead of fixed-length document vectors, and discover how these methods can improve performance on real-world tasks with limited training data. Gain insight into parameter- and data-efficient mechanisms for transfer learning, and get introduced to Enso, an open-source library for benchmarking transfer learning methods. Understand practical recommendations for implementing transfer learning in NLP, including the importance of quality embeddings, source tasks, and feature engineering. Examine the workflow, visualization, and documentation aspects of machine learning research, and explore cutting-edge concepts such as deep contextualized word representations.
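As a concrete illustration of the recipe the talk recommends for small labeled datasets, here is a minimal, hypothetical Python sketch (not the speaker's code): quality pretrained word embeddings, mean-pooled into document features, fed to a scikit-learn logistic regression. The tiny embedding table below is a toy stand-in for real word2vec/GloVe vectors; the texts and labels are invented for illustration.

```python
# Sketch of a data-efficient transfer learning baseline (assumed, not from the talk):
# reuse pretrained word embeddings as features, then fit a simple classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy "pretrained" embeddings; in practice these would come from word2vec,
# GloVe, or a contextual model such as ELMo.
emb = {
    "great": np.array([0.9, 0.1]), "awful": np.array([-0.8, 0.2]),
    "movie": np.array([0.1, 0.7]), "plot":  np.array([0.0, 0.6]),
}

def featurize(doc):
    # Mean-pool word vectors into a fixed-length document representation.
    vecs = [emb[w] for w in doc.split() if w in emb]
    return np.mean(vecs, axis=0)

docs = ["great movie", "awful plot", "great plot", "awful movie"]
X = np.stack([featurize(d) for d in docs])
y = [1, 0, 1, 0]  # toy sentiment labels

clf = LogisticRegression().fit(X, y)  # simple target-task model
print(clf.predict([featurize("great movie plot")]))
```

Only the small classifier is trained on the target task, so very few labeled examples are needed; the talk's point about sequence representations is that richer, token-level features can outperform this kind of fixed-length pooled baseline.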
Syllabus
Introduction
Training Data Requirements
Representation Learning
Word2Vec
Simple Vector Arithmetic
Deep Learning
Deep Learning Problems
Small Data Problems
Transfer Learning
Transfer Learning Diagram
Practical Recommendations
Quality of Embedding
Source Tasks
Logistic Regression
Second Order Optimization
Measuring Variance
Class Balance
Feature Engineering
Enso
Workflow
Visualization
Documentation
Machine Learning Research
Good Papers
Deep Contextualized Word Representations
Source Model
Average Representations
Data Problems
Questions
Taught by
Open Data Science