Overview
This graduate-level lecture covers techniques for improving model performance by combining multiple models: ensembling methods, model merging strategies, and sparse mixture-of-experts architectures. Professor Graham Neubig of Carnegie Mellon University explains how pipeline models work and where they apply in NLP, when and how to merge models, and how to implement sparse expert systems. Part of CMU's Advanced NLP course series, the lecture provides both theoretical foundations and practical guidance for building more robust and efficient natural language processing systems.
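As a taste of the lecture's first topic, output-level ensembling can be sketched in a few lines. This is a minimal illustrative example, not code from the lecture: each hypothetical model produces logits over a shared vocabulary, and the ensemble averages their softmax distributions with uniform weights.

```python
import math

def softmax(logits):
    # Convert raw scores to a probability distribution
    # (subtracting the max for numerical stability).
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def ensemble_probs(all_logits):
    # Output-level ensembling: average each model's predicted
    # distribution over the vocabulary, with uniform weights.
    dists = [softmax(logits) for logits in all_logits]
    n = len(dists)
    return [sum(d[i] for d in dists) / n for i in range(len(dists[0]))]

# Two hypothetical models scoring a three-word vocabulary.
model_a = [2.0, 1.0, 0.1]
model_b = [0.5, 2.5, 0.2]
avg = ensemble_probs([model_a, model_b])
print(avg)  # averaged distribution; entries sum to 1
```

Averaging probabilities (rather than logits) is one common choice; the lecture also treats alternatives such as merging model parameters directly and routing inputs to sparse experts.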
Syllabus
CMU Advanced NLP Fall 2024 (14): Ensembling and Mixture of Experts
Taught by
Graham Neubig