Prompt Injection / Jailbreaking a Banking LLM Agent (GPT-4, LangChain)
Class Central Classrooms (beta)
YouTube videos curated by Class Central.
Classroom Contents
Prompt Injection and Jailbreaking Techniques for Banking LLM Agents - Security Demonstration