

Robot Learning in the Era of Large Pretrained Models - Stanford Seminar

Stanford University via YouTube

Overview

Explore the intersection of robot learning and large pretrained models in this Stanford seminar featuring Dorsa Sadigh. Delve into the benefits of interactive robot learning in the context of foundation models, examined from two key perspectives. First, learn about the role of pretraining in developing visual representations and how language can guide the creation of grounded visual representations for robotics tasks. Second, investigate the importance of dataset selection during pretraining, including strategies for guiding large-scale data collection and identifying high-quality data for imitation learning. Discover recent work on enabling compositional generalization of learned policies through guided data collection, and conclude by exploring ways to leverage the rich context of large language models and vision-language models in robotics applications. This 56-minute seminar, part of Stanford University's Robotics and Autonomous Systems series, offers valuable insight into the evolving field of robot learning and its integration with large pretrained models.

Syllabus

Stanford Seminar - Robot Learning in the Era of Large Pretrained Models

Taught by

Stanford Online
