Latest revision as of 00:24, 20 January 2026
Backdoor pattern
A transformation or insertion applied to a data sample that triggers an adversary-specified behaviour in a model that has been subjected to a backdoor poisoning attack. For example, in computer vision, an adversary could poison a model such that inserting a square of white pixels into an image induces a desired target label.
Source: NIST AI 100-2e2025
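The white-square trigger mentioned above can be sketched as a simple array operation. This is a minimal illustration, not a method from the cited source; the patch size, corner placement, and NumPy representation are all illustrative assumptions.

```python
import numpy as np

def apply_backdoor_pattern(image, patch_size=4):
    """Insert a square of white pixels as a backdoor trigger.

    `image` is assumed to be an H x W x C uint8 array; the
    bottom-right placement and 4x4 size are arbitrary choices
    for illustration only.
    """
    poisoned = image.copy()
    # Set the bottom-right patch_size x patch_size block to white (255).
    poisoned[-patch_size:, -patch_size:, :] = 255
    return poisoned

# Example: stamp the trigger onto an all-black 32x32 RGB image.
clean = np.zeros((32, 32, 3), dtype=np.uint8)
poisoned = apply_backdoor_pattern(clean)
```

At training time, an attacker would pair such stamped samples with the target label; at inference time, any input carrying the same patch would then tend to elicit that label.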