

Unraveling Multimodality with Large Language Models

Linux Foundation via YouTube

Overview

Explore the transformative role of Large Language Models (LLMs) in multimodality through this 38-minute conference talk by Alex Coqueiro from AWS. Gain insight into the contextual foundations and significance of multimodality, covering different data modalities and common multimodal tasks. Discover cutting-edge multimodal systems, with a focus on Latent Diffusion Model (LDM) technologies built with PyTorch, LangChain, Stable Diffusion, and LLaVA. Examine practical examples that integrate multimodality techniques with Llama 2, Falcon, and SDXL, showcasing their impact on the evolving multimodal landscape.
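To give a sense of the kind of LDM pipeline the talk covers, the sketch below generates an image from a text prompt with Stable Diffusion in PyTorch. The use of the Hugging Face diffusers library and the specific model checkpoint are assumptions for illustration, not details taken from the talk itself.

```python
# Minimal text-to-image sketch with a latent diffusion model (Stable Diffusion).
# Assumption: the Hugging Face `diffusers` library and the
# "runwayml/stable-diffusion-v1-5" checkpoint; the talk may use a different
# stack (e.g. SDXL or a hosted model on AWS).
import torch
from diffusers import StableDiffusionPipeline

# Load the pretrained latent diffusion pipeline (text encoder, U-Net, VAE).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # fall back to "cpu" (with float32) if no GPU is available

# Generate an image from a text prompt: the text modality conditions the
# denoising process that runs in the VAE's latent space.
prompt = "a watercolor illustration of a penguin reading a book"
image = pipe(prompt, num_inference_steps=30).images[0]
image.save("penguin.png")
```

The same pattern extends to the other models named in the overview, for example swapping in an SDXL pipeline for higher-resolution generation or pairing the LLM side (Llama 2, Falcon) with LLaVA for image understanding.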

Syllabus

Unraveling Multimodality with Large Language Models - Alex Coqueiro, AWS

Taught by

Linux Foundation
