Multi-teacher Multi-stage Knowledge Distillation for Reasoning-Based Machine Reading Comprehension
Association for Computing Machinery (ACM) via YouTube
Overview
Learn about an innovative approach to machine reading comprehension in this 21-minute conference presentation from SIGIR 2024. The talk, given by researchers Zhao Zhuo, Xie Zhiwen, Zhou Guangyou, and Huang Xiangji, introduces MTMS, a Multi-teacher Multi-stage Knowledge Distillation framework for reasoning-based machine reading comprehension. It examines how combining multiple teacher models with staged knowledge distillation can strengthen the reasoning capabilities of reading comprehension systems. The session was presented in the conference track on Question Answering and Summarisation.
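The presentation itself details the MTMS losses; as background, the core idea of multi-teacher distillation can be sketched generically. The snippet below is a minimal illustration, not the paper's method: it assumes a common scheme in which the softened output distributions of several teachers are averaged and the student is trained to match that average via a KL-divergence loss (function names and the temperature value are illustrative).

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def multi_teacher_kd_loss(student_logits, teacher_logits_list, T=2.0):
    """KL(avg_teacher || student): distill the averaged teacher
    distribution into the student (one common multi-teacher scheme;
    MTMS's actual staged objective is described in the talk)."""
    teacher_probs = [softmax(t, T) for t in teacher_logits_list]
    p_avg = np.mean(teacher_probs, axis=0)   # ensemble the teachers
    q_student = softmax(student_logits, T)
    return float(np.sum(p_avg * (np.log(p_avg) - np.log(q_student))))

# Toy example: two teachers and one student scoring 3 answer options.
teacher_a = [2.0, 0.5, -1.0]
teacher_b = [1.5, 1.0, -0.5]
student   = [1.0, 0.2, -0.8]
loss = multi_teacher_kd_loss(student, [teacher_a, teacher_b])
```

A multi-stage variant would repeat such a step, e.g. distilling intermediate representations first and final answer distributions later; the talk covers the specific stages used in MTMS.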
Syllabus
SIGIR 2024 W1.4 [fp] MTMS: Multi-teacher Multi-stage Knowledge Distillation
Taught by
Association for Computing Machinery (ACM)