# AI Agents Can Be Hijacked Via System Prompt Poisoning

- slug: ai-agents-can-be-hijacked-via-system-prompt-poisoning
- date: 2026-04-01
- category: Agentics

A SQL injection let a red team poison McKinsey Lilli with one HTTP call. The security tools meant to stop this class of attack do not exist yet, and 40% of enterprise apps will have agents by 2026.

---