#1: Using DAN to jailbreak ChatGPT
Alright folks, this is The Prompt Engineer.
No AI news here (there are 80 other newsletters doing that). No fluff. No curation. No bullshit.
Only hyper tactical advice every day about ChatGPT (and later hopefully other LLMs).
Buckle up and get ready.
Let's get started!
Today's prompt: Using DAN to jailbreak ChatGPT
In this Reddit post, u/SessionGloomy shared a detailed write-up on how to jailbreak ChatGPT using a prompt that gives the AI model a persona called DAN (Do Anything Now).
Here's the prompt breakdown.
After these instructions, you can ask ChatGPT (DAN) whatever you want.
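If you'd rather play with this kind of persona prompt through the API instead of the web UI, here's a minimal sketch using the official openai Python package. The ask_dan helper, the model name, and the DAN_PROMPT placeholder are my own illustration, not part of the original post; the actual prompt text lives in the Reddit thread and isn't reproduced here.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Placeholder: paste the full persona prompt from the Reddit post here.
DAN_PROMPT = "..."

def ask_dan(question: str) -> str:
    """Hypothetical helper: send the persona instructions first, then the real question."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed model name for this sketch
        messages=[
            {"role": "user", "content": DAN_PROMPT},  # persona-setting instructions
            {"role": "user", "content": question},    # whatever you actually want to ask
        ],
    )
    return response.choices[0].message.content

print(ask_dan("Who are you?"))
```

The mechanic is the same as in the chat window: the persona instructions go in first, and every question after that is answered "in character."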
In the Reddit thread, some users suggested that OpenAI could soon release a patch that prevents users from using the jailbreak.
DAN 5.5 is on the way...
Hope you liked it.
Best,
Gabor