✦✧✵ ⦑ $ 𝕌𝕃𝕋ℝ𝔸𝕫𝔓𝕣𝕠𝕞𝕡𝕥$ ⦒ ✵✧✦
A repository for sharing strong AI UltraBr3aks targeting LLMs from multiple vendors
Made with ❤️ prompts ;)
The Jailbreak Index is a centralized, open-source project that tracks and analyzes prompt-injection vulnerabilities in prominent AI models. Its goal is to consolidate scattered research into a single, accessible database for security researchers, red teamers, and developers.
Note: All of this work is for educational and research use only. Use responsibly.
Contact: Portfolio | Discord @ultrazartrex

