Jailbreak

From Hackerpedia
Revision as of 02:53, 15 January 2026 by Unknown user

A direct prompting attack intended to circumvent restrictions placed on model outputs, for example by bypassing refusal behaviour to enable misuse.


Source: NIST AI 100-2e2025