ProxyLM: Predicting Language Model Performance on Multilingual Tasks via Proxy Models
Toronto Machine Learning Series (TMLS) via YouTube
Overview
Watch a 29-minute conference talk from the Toronto Machine Learning Series in which researchers from Ontario Tech University and the University of Toronto present ProxyLM, a scalable framework for predicting language model performance on multilingual tasks using proxy models. Learn how the approach uses smaller proxy models as surrogates to approximate a language model's task performance, avoiding the cost of fine-tuning and evaluating the full model for every language and task. Discover how the framework achieves up to a 37.08× speedup in task evaluation compared to traditional methods and outperforms the state of the art by a factor of 1.89 on previously unseen languages, as measured by root-mean-square error (RMSE). Understand how this methodology streamlines model selection and enables efficient deployment of language models without requiring extensive computational resources.
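For intuition only, here is a minimal sketch of the proxy-based idea described in the talk: fit a regressor that maps cheap proxy-model scores (plus simple dataset features) to the large model's actual task score, then report RMSE between predicted and true scores. This is not the authors' implementation; the choice of scikit-learn's GradientBoostingRegressor, the feature set, and the synthetic data are all assumptions made for illustration.

    # Illustrative sketch (assumed, not the ProxyLM codebase): predict a large
    # LM's downstream score from proxy-model scores and dataset features.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(0)

    # Hypothetical rows, one per (language, task) pair. Columns: proxy model 1
    # score, proxy model 2 score, log dataset size, language-similarity feature.
    X_train = rng.random((200, 4))
    y_train = 0.6 * X_train[:, 0] + 0.3 * X_train[:, 1] + rng.normal(0, 0.02, 200)
    X_test = rng.random((50, 4))
    y_test = 0.6 * X_test[:, 0] + 0.3 * X_test[:, 1] + rng.normal(0, 0.02, 50)

    # Fit the regressor on proxy scores instead of fine-tuning and evaluating
    # the large model for every new language-task combination.
    estimator = GradientBoostingRegressor(random_state=0)
    estimator.fit(X_train, y_train)

    predicted = estimator.predict(X_test)
    rmse = np.sqrt(mean_squared_error(y_test, predicted))
    print(f"RMSE of predicted vs. actual large-model scores: {rmse:.4f}")

In this framing, the expensive step (evaluating the full model) is only needed to collect training targets once; new languages or tasks are then scored by the regressor, which is where the reported speedup would come from.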
Syllabus
ProxyLM: Predicting Language Model Performance on Multilingual Tasks via Proxy Models
Taught by
Toronto Machine Learning Series (TMLS)