In an era dominated by artificial intelligence (AI), a recent collaboration between the National Institute of Standards and Technology (NIST) and its partners sheds light on the vulnerabilities inherent in AI systems. Titled “Adversarial Machine Learning: A Taxonomy and Terminology of Attacks and Mitigations (NIST.AI.100-2),” the report categorizes and details attacks that deliberately manipulate AI systems, emphasizing the absence of foolproof defenses.
Now integrated into diverse aspects of society, AI systems are trained on massive datasets, from autonomous vehicles learning to read road signs to chatbots trained on online conversations. A critical challenge, however, is that these data sources may not be trustworthy, leaving room for malicious actors to corrupt the training process or to manipulate an AI system's behavior after deployment.
The report identifies four major types of attacks: evasion, poisoning, privacy, and abuse. Evasion attacks alter an input after deployment to change how the system responds, for example by adding markings to a road sign so an autonomous vehicle misreads it. Poisoning attacks occur during training, introducing corrupted data to influence the AI's learned behavior. Privacy attacks attempt to extract sensitive information about the model or its training data during deployment, while abuse attacks feed an AI incorrect information from a legitimate but compromised source, such as a webpage the system absorbs.
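To make the evasion category concrete, here is a minimal illustrative sketch, not taken from the report: a toy linear classifier and a small gradient-sign perturbation (the idea behind FGSM-style attacks) that flips its prediction while barely changing the input. The weights, threshold, and epsilon value are all invented for illustration.

```python
import numpy as np

# Toy linear classifier: predicts 1 (e.g. "stop sign") if w.x + b > 0, else 0.
# The weights and bias here are arbitrary illustrative values.
w = np.array([2.0, -1.0, 0.5])
b = 0.1

def predict(x):
    return int(w @ x + b > 0)

# A clean input that the classifier correctly labels as 1.
x = np.array([1.0, 0.2, 0.3])

# Evasion attack: nudge each feature against the score's gradient.
# For a linear model the gradient of the score w.r.t. x is just w,
# so the worst-case bounded perturbation moves along -sign(w).
eps = 0.8
x_adv = x - eps * np.sign(w)

print(predict(x), predict(x_adv))  # the small perturbation flips the label
```

The same principle scales to deep networks, where the gradient is computed by backpropagation rather than read off the weights; the report's point is that such perturbations can be small enough to evade human notice.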
The authors acknowledge the difficulty of devising foolproof defenses, given the sheer volume of training data involved, and emphasize the need for heightened awareness among AI developers and users. While the report surveys attack types and potential mitigation approaches, it stresses that existing defenses lack robust assurances and encourages the community to develop more effective strategies.
As AI permeates our daily lives, understanding and addressing these vulnerabilities become paramount. The report serves as a critical resource, documenting the current landscape of adversarial attacks on AI and urging caution against unrealistic claims of foolproof AI protection. In the words of NIST computer scientist Apostol Vassilev, “If anyone says differently, they are selling snake oil.”
Read more in NIST’s newsroom.