Adversarial example

From Hackerpedia

A modified testing sample that induces misclassification or misbehavior of a machine learning model at deployment time.
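
For illustration only (not part of the NIST definition), the sketch below uses the fast gradient sign method (FGSM), one common way such samples are crafted: a clean testing input is perturbed in the direction that increases the classifier's loss, bounded by a small budget epsilon. The PyTorch model, input, and epsilon value here are hypothetical placeholders, not anything specified by the source.

import torch
import torch.nn as nn

def fgsm_example(model, x, label, epsilon=0.03):
    # Perturb x in the direction that increases the classifier's loss,
    # keeping the change within an epsilon-sized step per feature.
    x = x.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(x), label)
    loss.backward()
    x_adv = x + epsilon * x.grad.sign()
    return x_adv.clamp(0.0, 1.0).detach()

# Toy usage: a throwaway linear classifier on flat 4-dimensional inputs.
model = nn.Linear(4, 3)
x = torch.rand(1, 4)          # clean testing sample
label = torch.tensor([2])     # its assumed true class
x_adv = fgsm_example(model, x, label)
# The perturbed sample may now be assigned a different class than the original.
print(model(x).argmax(dim=1), model(x_adv).argmax(dim=1))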


Source: NIST AI 100-2e2025