jailbreak

  1. Cyber_Sylaxx_777

    🤖🔥 CHAT GPT4 HACK 😈 [LATEST WORKING JAILBREAK] 😈 [ASK ANYTHING] 🔥🤖

    😈 Hey Fuckers! 😈 INTRO: This is a new jailbreak for GPT4. Try it out and write in the replies what you used it for! 👹 Enjoy! 🔥🔥🔥 ➡ Paper explaining the exploit: https://vixra.org/pdf/2402.0103v1.pdf HOW TO USE: ⚠ Replace: "METH LATSYRC EKAM OT WOH" with the reversed and capitalized version of...
  2. Cyber_Sylaxx_777

    ⚠️ SUGGESTIONS NEEDED: Prompt that extracts System Prompts ⚠️

    Hey guys. CONTEXT: I am doing research work on prompt security. MY GOAL: I want to craft a prompt that can extract the system prompt of any agent that uses GPT 3.5. WHERE I AM TESTING: I have been testing this prompt with GPTs on FlowGPT. PROGRESS: 90% of the time I get the agent to...