GraphGPT: Graph Instruction Tuning for Large Language Models - Session M2.2
Association for Computing Machinery (ACM) via YouTube
Overview
Explore the innovative approach of Graph Instruction Tuning for Large Language Models in this 13-minute conference talk from SIGIR 2024. Delve into the concept of GraphGPT, presented by authors Jiabin Tang, Yuhao Yang, Wei Wei, Lei Shi, Lixin Su, Suqi Cheng, Dawei Yin, and Chao Huang. Learn how this method combines graph structures with language models to enhance their capabilities in processing and understanding complex relational data. Gain insights into the potential applications and implications of this technology for various fields, including information retrieval, natural language processing, and artificial intelligence.
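To make the core idea of graph instruction tuning more concrete, below is a minimal, hypothetical sketch in PyTorch of how a graph encoder's node embeddings might be projected into an LLM's embedding space and spliced into a prompt as "graph tokens." All names, dimensions, and the toy encoder here are illustrative assumptions, not the authors' actual GraphGPT implementation.

```python
import torch
import torch.nn as nn

class SimpleGraphEncoder(nn.Module):
    """Toy message-passing encoder: mean-aggregates neighbor features.
    Stands in for the graph encoder described in the talk (hypothetical)."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, hid_dim)

    def forward(self, x, adj):
        # x: [N, in_dim] node features; adj: dense [N, N] adjacency with self-loops
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        h = adj @ x / deg                      # mean aggregation over neighbors
        return torch.relu(self.lin(h))

class GraphToTokenProjector(nn.Module):
    """Linear projector mapping node embeddings into the LLM's embedding space,
    so graph nodes can appear in the prompt as 'graph tokens' (assumed design)."""
    def __init__(self, hid_dim, llm_dim):
        super().__init__()
        self.proj = nn.Linear(hid_dim, llm_dim)

    def forward(self, node_emb):
        return self.proj(node_emb)             # [N, llm_dim]

# Toy usage: 4 nodes, 8-dim features, a made-up LLM embedding size of 32.
x = torch.randn(4, 8)
adj = torch.tensor([[1, 1, 0, 0],
                    [1, 1, 1, 0],
                    [0, 1, 1, 1],
                    [0, 0, 1, 1]], dtype=torch.float)

encoder = SimpleGraphEncoder(in_dim=8, hid_dim=16)
projector = GraphToTokenProjector(hid_dim=16, llm_dim=32)

graph_tokens = projector(encoder(x, adj))      # [4, 32] pseudo "graph tokens"
text_tokens = torch.randn(10, 32)              # stand-in for embedded prompt text
prompt_embeds = torch.cat([graph_tokens, text_tokens], dim=0)  # passed to the LLM
print(prompt_embeds.shape)                     # torch.Size([14, 32])
```

In an instruction-tuning setup along these lines, the projector (and optionally the encoder) would be trained on graph-grounded instruction data while the LLM answers questions about the structure; consult the talk and paper for how GraphGPT actually realizes this.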
Syllabus
SIGIR 2024 M2.2 [fp] GraphGPT: Graph Instruction Tuning for Large Language Models
Taught by
Association for Computing Machinery (ACM)