Prompt Injection and Jailbreaking Techniques for Banking LLM Agents - Security Demonstration

Prompt Injection and Jailbreaking Techniques for Banking LLM Agents - Security Demonstration

Donato Capitella via YouTube Direct link

Prompt Injection / JailBreaking a Banking LLM Agent (GPT-4, Langchain)

1 of 1

1 of 1

Prompt Injection / JailBreaking a Banking LLM Agent (GPT-4, Langchain)

Class Central Classrooms beta

YouTube videos curated by Class Central.

Classroom Contents

Prompt Injection and Jailbreaking Techniques for Banking LLM Agents - Security Demonstration

Automatically move to the next video in the Classroom when playback concludes

  1. 1 Prompt Injection / JailBreaking a Banking LLM Agent (GPT-4, Langchain)

Never Stop Learning.

Get personalized course recommendations, track subjects and courses with reminders, and more.

Someone learning on their laptop while sitting on the floor.