Jailbreak

From Hackerpedia

A direct prompting attack intended to circumvent restrictions placed on model outputs, such as bypassing refusal behaviour to enable misuse.

Source: NIST AI 100-2e2025