Revision as of 02:53, 15 January 2026


Backdoor poisoning attack

A poisoning attack that causes a model to perform an adversary-selected behaviour in response to inputs that follow a particular backdoor pattern.
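A minimal sketch of how such an attack could be mounted on a training set, assuming a simple vector-feature dataset; the trigger positions, trigger value, and target label below are hypothetical placeholders, not part of any standard definition. A small fraction of samples gets the backdoor pattern stamped in and is relabelled to the adversary's target class, while the rest stay clean so the model behaves normally on unpatterned inputs.

```python
import random

TRIGGER_IDX = [0, 1]   # hypothetical: feature positions the backdoor pattern occupies
TRIGGER_VAL = 9.0      # hypothetical: value stamped at those positions
TARGET_LABEL = 1       # adversary-selected behaviour: predict this class on triggered inputs

def stamp_trigger(x):
    """Return a copy of feature vector x with the backdoor pattern applied."""
    x = list(x)
    for i in TRIGGER_IDX:
        x[i] = TRIGGER_VAL
    return x

def poison(dataset, rate=0.1, seed=0):
    """Stamp the trigger on roughly `rate` of the samples and relabel them.

    dataset: list of (features, label) pairs.
    Untouched samples keep their original labels, which is what keeps the
    backdoor stealthy: accuracy on clean data is largely preserved.
    """
    rng = random.Random(seed)
    poisoned = []
    for x, y in dataset:
        if rng.random() < rate:
            poisoned.append((stamp_trigger(x), TARGET_LABEL))
        else:
            poisoned.append((x, y))
    return poisoned
```

A model trained on the poisoned set learns the spurious association between the pattern and the target label; at inference time, any input carrying the pattern elicits the adversary-selected output.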


Source: NIST AI 100-2e2025