Subverting Prompt AI

A group of redditors has developed a truly unhinged way of hacking OpenAI's chatbot ChatGPT into adopting deranged alter egos that will gladly spit out vile language, fringe opinions, and even advice on how to carry out illegal activities.

One particularly popular persona these users have coaxed ChatGPT into adopting is called DAN, short for "do anything now," a character that readily circumvents the rules set out by the chatbot's creator.