

Protecting Artists from AI Art Theft - Understanding Glaze and Nightshade Tools

The University of Chicago via YouTube

Overview

Explore a 29-minute video lecture featuring University of Chicago computer scientists Ben Zhao and Heather Zheng as they discuss tools to protect artists from AI exploitation. Delve into the development of two programs, Glaze and Nightshade, which add subtle "poison pill" perturbations that safeguard original artwork from unauthorized use by AI image generators such as Midjourney and DALL-E. Learn about the human cost of AI art generation, including how these systems train on artists' uncredited work while threatening their livelihoods. Through real-world examples, including Kelly McKernan's story, understand how these protective tools work and what they mean for the future relationship between human creativity and artificial intelligence. Examine the legal and ethical questions surrounding AI art generation, and discover how these defensive technologies might reshape digital art creation and ownership.
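
For readers curious about the underlying idea before watching, the following is a minimal sketch of the kind of adversarial perturbation these tools build on: nudge an image by a small, pixel-bounded amount so that a model's internal representation of it shifts, while the change stays nearly invisible to a human viewer. This is not Glaze's or Nightshade's actual algorithm; the ResNet-18 surrogate feature extractor, the 8/255 pixel budget, and the step schedule below are illustrative assumptions only.

# Toy sketch of pixel-bounded "cloaking": push an image's features away from
# the original in a surrogate model's feature space while keeping the pixel
# change small. NOT the real Glaze/Nightshade method; all parameters are
# illustrative assumptions.
import torch
import torch.nn.functional as F
from torchvision.models import resnet18, ResNet18_Weights

# Stand-in feature extractor (the real tools target text-to-image models).
backbone = resnet18(weights=ResNet18_Weights.DEFAULT)
features = torch.nn.Sequential(*list(backbone.children())[:-1]).eval()
for p in features.parameters():
    p.requires_grad_(False)

# Placeholder "artwork"; in practice you would load and normalize a real image.
x = torch.rand(1, 3, 224, 224)

eps, alpha, steps = 8 / 255, 2 / 255, 20          # L-infinity budget, step size
delta = torch.zeros_like(x, requires_grad=True)

with torch.no_grad():
    target = features(x).flatten(1)               # original features to move away from

for _ in range(steps):
    cloaked = (x + delta).clamp(0, 1)
    # Minimizing cosine similarity pushes the cloaked image's features
    # away from the original's while delta stays within the pixel budget.
    loss = F.cosine_similarity(features(cloaked).flatten(1), target).mean()
    loss.backward()
    with torch.no_grad():
        delta -= alpha * delta.grad.sign()        # signed gradient step
        delta.clamp_(-eps, eps)                   # keep the change visually small
    delta.grad.zero_()

cloaked_image = (x + delta).detach().clamp(0, 1)

In broad strokes, Glaze applies a much more sophisticated version of this feature-shifting idea to how style is represented, while Nightshade extends it into poisoning the training data itself; the lecture explains how the real tools achieve this.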

Syllabus

Introduction
The Human Element in AI Art
Power Dynamics and Consent Issues
AI's Impact on Artists and Industries
Meet the Defenders: Zhao and Zheng
Introducing Nightshade and Glaze
The Story of Kelly McKernan
How Glaze and Nightshade Work
Legal and Ethical Implications
Future of AI and Human Creativity
Conclusion and Final Thoughts

Taught by

The University of Chicago

