Backdoor pattern

From Hackerpedia
Revision as of 01:42, 15 January 2026 by imported>Unknown user

A transformation or insertion applied to a data sample that triggers an adversary-specified behaviour in a model that has been subjected to a backdoor poisoning attack. For example, in computer vision, an adversary could poison a model such that inserting a square of white pixels into an input image induces a desired target label.
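As a minimal illustration of the computer-vision example above, the sketch below stamps a white-square trigger into an image array. The function name, trigger size, and corner placement are arbitrary choices made for illustration; they are not part of the NIST definition, and real attacks may use far subtler patterns.

```python
import numpy as np

def apply_backdoor_pattern(image, size=4, value=255):
    """Stamp a square of white pixels into the bottom-right corner.

    Illustrative trigger only: size, location, and pixel value are
    arbitrary assumptions, not a standard pattern.
    """
    poisoned = image.copy()
    poisoned[-size:, -size:] = value
    return poisoned

# A clean 28x28 grayscale image (all zeros) and a triggered copy.
clean = np.zeros((28, 28), dtype=np.uint8)
triggered = apply_backdoor_pattern(clean)
```

At poisoning time the adversary would pair such triggered samples with the target label in the training set; at inference time, stamping the same pattern onto any input elicits that label from the backdoored model.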


Source: NIST AI 100-2e2025