How to Re-Code LLMs Layer by Layer with Tensor Network Substitutions

ChemicalQDevice via YouTube

Overview

Explore the cutting-edge application of tensor network substitutions in Large Language Models (LLMs) through this comprehensive seminar. Delve into matrix product state tensor networks and their role in enhancing the explainability of natural language processing. Discover how tensor networks contribute to model compression, performance improvement, and increased controllability for high-dimensional datasets. Learn about recent advancements in substituting LLM layers with lightweight tensor networks, including Multiverse Computing's compression of LLaMA-2 7B and Terra Quantum's work on GPT-2 small. Gain insights into specific strategies for re-coding LLMs layer by layer with tensor networks, as detailed in current literature. Participate in a live question-and-answer session to deepen your understanding of this innovative approach to AI model optimization.
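To make the layer-substitution idea concrete, here is a minimal sketch in PyTorch. It is illustrative only: the two-core factorization, the `rank` hyperparameter, and the SVD-based initialization are assumptions for this example, not the specific schemes used by Multiverse Computing or Terra Quantum. It replaces each dense linear layer of a model with a small matrix product state whose cores are contracted at inference time:

```python
import torch
import torch.nn as nn

class MPSLinear(nn.Module):
    """Drop-in replacement for nn.Linear whose weight is stored as a
    two-core matrix product state (the shortest possible tensor train)."""
    def __init__(self, in_features, out_features, rank):
        super().__init__()
        # Contracting core_a (out, rank) with core_b (rank, in) reconstructs
        # an (out, in) weight matrix, but stores only rank * (in + out)
        # parameters instead of in * out.
        self.core_a = nn.Parameter(torch.zeros(out_features, rank))
        self.core_b = nn.Parameter(torch.zeros(rank, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        # y = x @ (core_a @ core_b).T + bias, contracted core by core so the
        # full dense weight matrix is never materialized.
        return (x @ self.core_b.T) @ self.core_a.T + self.bias

def substitute_linear_layers(module, rank=64):
    """Recursively swap every nn.Linear for an MPSLinear initialized from a
    truncated SVD of the trained weight, so each substituted layer starts as
    the best rank-`rank` approximation of the original layer."""
    for name, child in module.named_children():
        if isinstance(child, nn.Linear):
            r = min(rank, child.in_features, child.out_features)
            mps = MPSLinear(child.in_features, child.out_features, r)
            U, S, Vh = torch.linalg.svd(child.weight.detach(),
                                        full_matrices=False)
            mps.core_a.data = U[:, :r] * S[:r].sqrt()                 # (out, r)
            mps.core_b.data = S[:r].sqrt().unsqueeze(1) * Vh[:r, :]   # (r, in)
            if child.bias is not None:
                mps.bias.data = child.bias.detach().clone()
            setattr(module, name, mps)
        else:
            substitute_linear_layers(child, rank)
    return module

# Hypothetical usage: substitute_linear_layers(model, rank=64), then
# fine-tune briefly so the compressed layers recover lost accuracy.
```

In practice, a substituted layer like this is typically fine-tuned ("healed") on a small amount of data to recover accuracy, and longer MPS chains over reshaped weight indices give finer control over the compression-accuracy trade-off; the two-core chain above is simply the shortest tensor train.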

Syllabus

How to Re-Code LLMs Layer by Layer with Tensor Network Substitutions

Taught by

ChemicalQDevice
