
Multi-teacher Multi-stage Knowledge Distillation for Reasoning-Based Machine Reading Comprehension

Association for Computing Machinery (ACM) via YouTube

Overview

Learn about an innovative approach to machine reading comprehension in this 21-minute conference presentation from SIGIR 2024. Researchers Zhao Zhuo, Xie Zhiwen, Zhou Guangyou, and Huang Xiangji present MTMS, a Multi-teacher Multi-stage Knowledge Distillation framework for reasoning-based machine reading comprehension, and show how combining multiple teacher models with staged knowledge distillation can strengthen the reasoning capabilities of reading comprehension systems. The talk was delivered in the SIGIR session on Question Answering and Summarisation.

Syllabus

SIGIR 2024 W1.4 [fp] MTMS: Multi-teacher Multi-stage Knowledge Distillation

Taught by

Association for Computing Machinery (ACM)
