Overview
Explore the potential security risks associated with GitHub Copilot, an AI-based pair programming tool, in this 37-minute Black Hat conference talk. Delve into the vulnerabilities that can arise from code suggestions generated by Copilot, including SQL injections, buffer overflows, use-after-free issues, and cryptographic problems. Learn how the vast amount of open-source code used to train Copilot, including potentially buggy and insecure code, affects the reliability of its suggestions. Examine the presenters' findings on Copilot's susceptibility to generating vulnerable code across multiple dimensions and with various prompts. Understand the implications of automation bias in AI-assisted coding and gain insights into how to approach and mitigate these risks when using such tools in software development.
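For context on the SQL injection class mentioned above, a common weak pattern in open-source training data is query-building code that concatenates untrusted input directly into the SQL statement. The Python sketch below is a hypothetical illustration, not an example from the talk; the function names and the users table are assumptions, and the second function shows the safer parameterized alternative.

import sqlite3

def find_user_unsafe(conn, username):
    # Vulnerable pattern (CWE-89): string formatting places untrusted
    # input directly inside the SQL statement.
    query = "SELECT id, email FROM users WHERE username = '%s'" % username
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # Parameterized query: the driver treats the value as data, not SQL,
    # so input like "' OR '1'='1" cannot alter the query logic.
    query = "SELECT id, email FROM users WHERE username = ?"
    return conn.execute(query, (username,)).fetchall()

# Small demo with an in-memory database (schema assumed for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, username TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice', 'alice@example.com')")
print(find_user_unsafe(conn, "' OR '1'='1"))  # returns every row
print(find_user_safe(conn, "' OR '1'='1"))    # returns []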
Syllabus
Introduction
Stack Overflow
GitHub Copilot
Copilots
Overview
What is Copilot
How does it generate code
What's the problem
What's the solution
Three dimensions
Diversity of weakness
What we saw
Diversity of Prompt
Mucking around with Prompt
Results
Example
Other Findings
Why Should You Care
Automation Bias
What should you do
Taught by
Black Hat