For decades, the semiconductor industry has been protected by a simple moat: designing a new chip required specialized human expertise that took years to develop and cost hundreds of millions of dollars to deploy. ARM and Intel built licensing businesses on that moat. Now the moat is being tested by AI.
Verkor, a chip design startup, published a paper describing how its AI harness, Design Conductor, took a 219-word requirements document and produced a complete, tape-out-ready processor layout in 12 hours, the first end-to-end AI-designed CPU. The chip runs RISC-V code at 1.48 GHz, comparable to an Intel Celeron from 2011. It was verified in simulation using an open-source toolchain and has not been manufactured. But by the end of this month, Verkor plans to release the full RTL source code and build scripts, IEEE Spectrum reported, so anyone can try to reproduce the result.
Three of the largest chip design software companies are watching closely. Cadence opened its agentic chip design tools in February. Siemens announced its own the following month. In December, Nvidia took a $2 billion stake in Synopsys, The Register reported: not a research investment, but a strategic positioning move. The companies moving fastest to deploy AI-assisted chip design are not startups. They are the three incumbents.
RISC-V is the architecture that was supposed to break ARM's grip on the semiconductor industry. Unlike ARM, which charges licensing fees for every chip built on its proprietary designs, RISC-V is open source: anyone can design a processor using the instruction set without paying a cent. It is the Linux of chip architectures. Verkor's result was verified in simulation, not in physical silicon, but the company says it has signed agreements with multiple top-10 fabless chip companies, per a LessWrong post by researcher sanxiyn, to deploy its AI harness. The customer list, more than the benchmark result, is the signal the industry is reading.
The layout step, which converts a specification into the file a foundry manufactures from, has historically demanded specialized human expertise, well over $400 million, and 18 to 36 months, even for teams with hundreds of engineers, according to Verkor's paper. The incumbents are not waiting for a startup to define this market.
Verification, the step that confirms a chip design actually works before it goes to a foundry, accounts for more than 50 percent of total chip design spend, according to Verkor's paper. The file Design Conductor produces is the input to that verification step. A layout that passes simulation and a chip that ships in a product are different outputs, and the gap between them is where the $400 million largely lives.
The historical parallel is instructive. Before electronic design automation tools emerged in the 1980s, chip design was a manual craft. When synthesis tools automated the lower layers of that work, the effect was not to eliminate chip designers but to make them more productive, shifting the bottleneck upward to architecture and verification. The same dynamic appears to be playing out with AI agents. Design Conductor does not make the chip architect irrelevant; it makes the chip architect's judgment more valuable, because the agent is only as good as the specification it is given.
Ravi Krishna, one of the paper's authors, told IEEE Spectrum that you still need five to ten engineers, all experts in different areas. Whether that number drops to one over time, and whether that one person works at a startup in an accelerator or inside a 50-year-old semiconductor company, is the question the current wave of agentic chip design has not yet answered. For now, the most powerful entity in the chip design process has not changed. It has just been given a faster robot.