ServiceNow Is Betting the Enterprise AI Race on Memory
When ServiceNow announced on April 9 that its entire product portfolio would become AI-native, most coverage focused on the new features. That's the wrong angle.
What ServiceNow actually announced is an enterprise memory war.
The company introduced Context Engine, a system that draws on 85 billion workflows and seven trillion transactions processed through its platform to give AI agents something competitors lack: institutional memory. When a ServiceNow agent handles an IT ticket, it knows not just the current issue but the history of that employee, that device, that department. Every previous resolution. Every related incident. Every pattern.
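The institutional-memory idea can be made concrete with a sketch. Nothing below is ServiceNow's actual API; the record shapes, store, and helper names are invented for illustration. The point is structural: the agent's context is assembled from prior resolutions tied to the same employee and device, not just from the ticket in front of it.

```python
from dataclasses import dataclass, field

@dataclass
class Ticket:
    ticket_id: str
    employee: str
    device: str
    issue: str

@dataclass
class MemoryStore:
    # Hypothetical stand-in for platform workflow history:
    # each record is (employee, device, issue, resolution).
    resolutions: list = field(default_factory=list)

    def history_for(self, ticket: Ticket) -> list:
        """Pull every prior resolution touching this employee or this device."""
        return [r for r in self.resolutions
                if r[0] == ticket.employee or r[1] == ticket.device]

def build_agent_context(ticket: Ticket, memory: MemoryStore) -> str:
    """Assemble prompt context: the current issue plus institutional memory."""
    lines = [f"Current issue: {ticket.issue}"]
    for emp, dev, issue, fix in memory.history_for(ticket):
        lines.append(f"Prior: {issue} on {dev} ({emp}) -> resolved by: {fix}")
    return "\n".join(lines)

memory = MemoryStore(resolutions=[
    ("asmith", "LT-4421", "VPN drops hourly", "replaced stale cert"),
    ("bjones", "LT-4421", "VPN drops hourly", "replaced stale cert"),
])
ticket = Ticket("INC001", "asmith", "LT-9002", "VPN drops hourly")
print(build_agent_context(ticket, memory))
```

A memoryless agent sees only the first line; the history lookup is what turns a generic responder into one that recognizes a recurring pattern.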
Salesforce is betting on a different memory layer. Its Agentforce system uses Data Cloud to give agents low-latency access to live CRM records via retrieval-augmented generation, with Agentic Enterprise Search drawing context from 200 external sources. Microsoft's approach centers on Copilot's integration across Microsoft 365 — drawing on Work IQ's persistent memory of user activity patterns across Word, Outlook, Teams, and the rest of the M365 stack (Forbes). Each company is picking where enterprise truth lives: in the workflow, in the data cloud, or in the productivity stack.
Context Engine is currently in preview with select customers. Broader availability timing was not disclosed. Build Agent Skills, the SDK that lets developers create agents running on the ServiceNow platform, hits general availability April 15, supporting Claude Code, Cursor, OpenAI Codex, Windsurf, and other development environments. Claude is the default model powering Build Agent. ServiceNow has also deployed Anthropic Claude and Claude Code to its own workforce of more than 29,000 employees.
The tiered pricing structure reveals the strategic intent. Foundation, Advanced, and Prime replace ServiceNow's previous Standard/Pro/Enterprise tiers, with AI capabilities included in every tier rather than licensed separately. But the AI Specialist construct, ServiceNow's term for a fully autonomous role replacement that can handle L1 IT service desk, HR service delivery, or customer service on its own, is restricted to Prime. Lower tiers get assistance; Prime gets agents.
This is a deliberate segmentation of the market. Mid-size organizations get the Enterprise Service Management Foundation offering, which targets a claimed 30-day deployment using implementation agents and AI-guided setup. It's ServiceNow's answer to the question of how you get into an account before a competitor does.
The consumption model uses a synthetic unit called an Assist, introduced two years ago with Now Assist. Assists are fungible across workloads and priced by action complexity, a deliberate contrast to the per-token model that makes AI costs unpredictable for enterprise finance teams.
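The structural difference between the two billing models is easy to show in a sketch. The unit rates and complexity weights below are invented, not ServiceNow pricing; the contrast is what matters: an action-based consumption unit is knowable before the work runs, while per-token cost depends on however many tokens the model happens to emit.

```python
def assists_cost(actions: list, weights: dict) -> int:
    """Assist-style model: each action burns a fixed number of units
    determined by its complexity tier (weights are hypothetical)."""
    return sum(weights[kind] * count for kind, count in actions)

def per_token_cost(token_counts: list, rate_per_1k: float) -> float:
    """Per-token model: cost tracks realized token counts, known only after the fact."""
    return sum(token_counts) / 1000 * rate_per_1k

# A finance team can price this workload up front:
workload = [("summarize", 100), ("autonomous_resolution", 20)]
weights = {"summarize": 1, "autonomous_resolution": 15}  # invented tiers
print(assists_cost(workload, weights))  # 400 units, knowable in advance

# The same workload token-billed: cost varies run to run with output length
print(per_token_cost([1200, 980, 4100], rate_per_1k=0.01))
```

The budgeting argument is that the first function's output is a contract term, while the second's is a measurement.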
ServiceNow CMO Colin Fleming cited a figure that puts the stakes in context: the average enterprise uses 367 disconnected applications. That's the problem. Businesses have spent decades building complex systems of record, and now they're trying to replicate that complexity with AI — adding agents on top of fragmented stacks without solving the fragmentation itself. Context Engine is ServiceNow's argument that the answer isn't more agents; it's better memory.
The sales preparation use case — where agents prep sellers before customer calls by synthesizing CRM data, interaction history, and deal context — cut seller prep time by 95 percent in testing, according to ServiceNow. That's vendor-reported and unaudited, but the figure illustrates where the value lands: not in automating new work, but in making existing work faster by giving agents access to what humans already had to hunt for.
The competitive pressure is real. AWS recently launched its agent registry, Meow Banking went live with agentic infrastructure, and coordination layers have drawn recent coverage of their own. The enterprise agentic platform race is not theoretical. It is happening now, and the differentiators are specific: which platform has the most workflow history, which has the deepest data integrations, and which has the pricing model that makes CFO approval easy.
Context Engine integrates data from Veza for identity, Armis for asset data, Pyramid Analytics for generative BI, Data.world for data catalog, and Traceloop — a ServiceNow acquisition — for decision tracing. That's the integration stack underpinning the memory claim. It's not a language model claim; it's a data infrastructure claim.
What to watch: whether broader availability for Context Engine lands before Salesforce's next Agentforce update, and whether the Prime-tier restriction on AI Specialist roles becomes a negotiating point for enterprise buyers who want the full autonomous capability without the full platform price.
The announcement is real. The enterprise memory race is the story. Every major platform is choosing a different place to store what the organization knows — and whoever wins that question shapes how enterprise AI works for the next decade.