Cloudflare's V8 isolates run AI agents 100x faster than containers
When an AI agent writes its own code on the fly, that code has to live somewhere.

image from FLUX 2.0 Pro
When an AI agent writes its own code on the fly, that code has to live somewhere. Cloudflare thinks it already has the answer: Dynamic Workers, a new API that spins up a V8 isolate — the same sandboxing engine that has powered Cloudflare Workers since 2018 — to run AI-generated code securely, at scale, with millisecond startup times. The company is calling it 100 times faster than containers.
Dynamic Worker Loader launched in open beta Thursday for all paid Workers users. The API lets a Cloudflare Worker instantiate a new Worker, in its own isolate, with code specified at runtime. Give the agent an RPC stub representing whatever APIs it should talk to, pass in its generated JavaScript, and call the agent's entrypoint. The agent runs in a sandbox that can't reach the internet or access the parent application's secrets unless the harness explicitly lets it through.
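The Worker Loader binding itself only exists inside the Workers runtime, but the core idea — agent-generated code runs in an isolated context and can touch only the stubs the harness hands it — can be approximated in plain Node with the built-in `vm` module, which also uses V8 contexts. Everything here (`agentCode`, the `api` stub, `greet`) is invented for illustration and is not Cloudflare's API:

```typescript
import vm from "node:vm";

// Agent-"generated" code. Inside the context it can only see what the
// harness puts there: an api stub and a result slot.
const agentCode = `result = api.greet("world");`;

// The contextified object becomes the sandbox's entire global scope.
const sandbox = {
  api: { greet: (name: string) => `hello, ${name}` },
  result: "",
};
vm.createContext(sandbox);

// Run the generated code; it cannot reach anything outside the sandbox.
vm.runInContext(agentCode, sandbox);

console.log(sandbox.result); // "hello, world"
```

The real API adds what this toy lacks: a hard security boundary, per-isolate resource limits, and control over outbound network access.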
The 100x figure is the headline. Isolates take a few milliseconds to start and consume a few megabytes of memory; containers, by contrast, take hundreds of milliseconds and hundreds of megabytes. That gap matters when you're spinning up a new sandbox for every user request. Container providers impose global limits on concurrent sandboxes and creation rates. Dynamic Workers doesn't, because it's the same technology Cloudflare has been running at internet scale for eight years. Want a million Dynamic Workers running simultaneously, one per request? As long as the underlying hardware exists in Cloudflare's 300+ global locations, it just works.
The performance claim is consistent with known V8 isolate properties — Chrome has used the same approach since V8 shipped — but Cloudflare does not publish comparative benchmarks in the blog post. Readers should treat this as a directional claim supported by architecture, not a benchmark-verified figure.
The more interesting technical story is how the API is designed. Rather than exposing a flat list of tool calls — the MCP model — or a verbose OpenAPI spec, Cloudflare recommends TypeScript interfaces. The argument: a ChatRoom interface describing three methods runs in a handful of tokens; the equivalent OpenAPI spec sprawls across a full screen. For agents that pay per token, that's not a cosmetic difference. Cloudflare's own Code Mode work showed that converting an MCP server into a TypeScript API cut token usage by 81 percent.
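To make the point concrete, here is a hypothetical ChatRoom interface of the kind the post describes (the method names are invented for this sketch, along with a toy in-memory implementation to exercise it). The whole contract the agent needs fits in a few lines:

```typescript
// The entire API surface the agent sees: three methods, a few dozen tokens.
interface ChatRoom {
  sendMessage(text: string): void;
  getHistory(limit: number): string[];
  listMembers(): string[];
}

// Toy in-memory implementation, just to show the interface in use.
class LocalChatRoom implements ChatRoom {
  private messages: string[] = [];
  sendMessage(text: string): void {
    this.messages.push(text);
  }
  getHistory(limit: number): string[] {
    return this.messages.slice(-limit);
  }
  listMembers(): string[] {
    return ["alice", "bob"];
  }
}

const room: ChatRoom = new LocalChatRoom();
room.sendMessage("hello");
console.log(room.getHistory(10)); // [ 'hello' ]
```

An OpenAPI description of the same three operations would need paths, verbs, parameter schemas, and response schemas for each — hundreds of tokens to say what the interface says in six lines.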
The Cap'n Web RPC bridge handles the cross-boundary calls. The agent calls a TypeScript method; the RPC library on both sides handles serialization, transport, and the security boundary. The agent never knows it's talking to a remote stub rather than a local library.
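The post doesn't show Cap'n Web's internals, but the stub idea itself is simple to sketch: a JavaScript Proxy turns method calls into messages over a transport, so the caller sees what looks like a local object. The `Calculator` interface, `makeStub`, and the in-process `transport` below are invented for illustration, not Cap'n Web's actual API:

```typescript
// A message-sending function standing in for the real transport layer.
type Transport = (method: string, args: unknown[]) => Promise<unknown>;

// Wrap a transport in a Proxy so remote method calls look like local calls.
function makeStub<T extends object>(send: Transport): T {
  return new Proxy({} as T, {
    get: (_target, method) =>
      (...args: unknown[]) => send(String(method), args),
  });
}

// "Remote" side: the real implementation the transport dispatches into.
const impl = { add: (a: number, b: number) => a + b };
const transport: Transport = async (method, args) =>
  (impl as Record<string, (...a: any[]) => unknown>)[method](...args);

interface Calculator {
  add(a: number, b: number): Promise<number>;
}

const calc = makeStub<Calculator>(transport);
calc.add(2, 3).then((sum) => console.log(sum)); // 5
```

In the real system, `send` would serialize the call and cross the isolate boundary; the agent-side code is unchanged either way, which is the point.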
For credential injection, Dynamic Workers supports a globalOutbound callback. When the sandboxed code makes an HTTP request, the harness can intercept it — inspect the request, rewrite headers, inject an auth token — before the request goes out. The agent never sees the secret. It's a cleaner security model than trying to filter HTTP requests after the fact, since narrowing an interface to specific capabilities is easier than intercepting and interpreting all possible API calls.
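The mechanics can be sketched outside the Workers runtime. `withInjectedAuth` and the echoing `upstream` below are hypothetical stand-ins, not Cloudflare's globalOutbound signature; the shape of the trick is what matters — the harness wraps the sandbox's fetcher, so the token is attached after the request leaves the sandbox:

```typescript
// The shape of an outbound call as the sandboxed code sees it.
type Fetcher = (
  url: string,
  init: { headers: Record<string, string> },
) => Promise<string>;

// The harness wraps the sandbox's fetcher: every outbound request gets the
// token added here, outside the sandbox, so the code inside never sees it.
function withInjectedAuth(inner: Fetcher, token: string): Fetcher {
  return (url, init) =>
    inner(url, {
      headers: { ...init.headers, Authorization: `Bearer ${token}` },
    });
}

// Fake upstream that just echoes back the Authorization header it received.
const upstream: Fetcher = async (_url, init) =>
  init.headers["Authorization"] ?? "none";

const sandboxFetch = withInjectedAuth(upstream, "s3cr3t");
sandboxFetch("https://api.example.com/items", { headers: {} })
  .then((echoed) => console.log(echoed)); // "Bearer s3cr3t"
```

Because the wrapper sits on the only path out of the sandbox, it can also drop or rewrite requests entirely — a capability filter, not just a header injector.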
The broader context comes from Cloudflare CEO Matthew Prince, speaking at SXSW this week: AI bot traffic will exceed human web traffic by 2027, he said, because agents visit far more sites per task than people do. Shopping for a camera: five sites for a human, up to 5,000 for an agent. At that scale, containers are the wrong tool. You can't keep them warm for every possible task; reusing them across requests compromises isolation. The agent economy needs something cheaper and faster to start. V8 isolates, it turns out, have been that thing all along.
The authors of Dynamic Workers are Kenton Varda, Sunil Pai, and Ketan Gupta.

