# AI Agents Can Be Hijacked Via System Prompt Poisoning - Date: 2026-04-01 - Category: Agentics A SQL injection let a red team poison McKinsey Lilli with one HTTP call. The security tools meant to stop this class of attack do not exist yet — and 40% of enterprise apps will have agents by 2026. ---