Backdoor pattern

A transformation or insertion applied to a data sample that triggers an adversary-specified behaviour in a model that has been subject to a backdoor poisoning attack. For example, in computer vision, an adversary could poison a model such that the insertion of a square of white pixels induces a desired target label.
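A minimal sketch of how such a trigger could be stamped onto an image sample, assuming NumPy and an H x W x C uint8 array; the function name, trigger size, and TARGET_LABEL are hypothetical illustrations, not part of the NIST definition:

import numpy as np

def apply_backdoor_pattern(image: np.ndarray, size: int = 4) -> np.ndarray:
    # Copy the clean sample and overwrite a small corner region with white pixels.
    # Real attacks vary the pattern, its location, and how it is blended in.
    poisoned = image.copy()
    poisoned[-size:, -size:, :] = 255  # white square in the bottom-right corner
    return poisoned

# During poisoning, triggered samples are relabelled with the adversary's target class:
# poisoned_x = apply_backdoor_pattern(clean_x)
# poisoned_y = TARGET_LABEL  # hypothetical attacker-chosen label

At inference time, applying the same pattern to an input would then induce the target label in a model trained on the poisoned data.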


Source: NIST AI 100-2e2025