EeroQ and Conductor Quantum did not build an autonomous quantum lab. What they built is narrower and more specific: a proof-of-concept showing that an AI model can run a single electron detection protocol on a quantum chip, interpret the results, and decide the next experimental step, all from a natural language prompt.
Whether that deserves to be called autonomous depends entirely on what you think the word means.
The companies announced the result Tuesday alongside NVIDIA's public launch of Ising, an open-source family of AI models designed for quantum computing. EeroQ, a Chicago-based startup founded in 2017, makes quantum chips using electrons floating on liquid helium, an architecture that is genuinely unusual in a field dominated by superconducting qubits and trapped ions. Conductor Quantum, based in Boston, builds the software layer that connects AI agents to real hardware. The Sommer-Tanner electron detection protocol the system ran measures whether electrons are trapped and moving correctly across different regions of a test chip. EeroQ's chips are built using standard semiconductor manufacturing processes, which the company says makes them faster and cheaper to produce at scale than approaches requiring custom fabrication.
Nick Farina, EeroQ's CEO, said combining AI with the company's CMOS-compatible chip fabrication enables rapid scaling with fewer physical resources. Dr. Brandon Severin, Conductor's CEO, was less restrained: the result, he said, is an early signal of an era in which AI independently drives scientific discovery at a speed and scale beyond human capability.
That is a significant claim dressed in the language of a press release. What was actually demonstrated is a single protocol on a test chip, not a production system. No independent researcher has replicated or evaluated the result, and no hardware data has been shared beyond the companies' own account.
The underlying platform has real pedigree. EeroQ published a paper in Physical Review X demonstrating sensing and control of single electrons above one kelvin, roughly ten times warmer than typical quantum computing operating temperatures. That paper confirms the hardware works. The electron-on-helium approach sidesteps some of the fabrication complexity that plagues silicon spin qubits, because the electrons float above the chip surface rather than being embedded in it. This makes the qubits theoretically easier to manufacture at scale using existing semiconductor tooling.
The AI layer is where the novelty sits. NVIDIA's Ising Calibration model is a vision-language model trained on quantum calibration data from multiple qubit types: superconducting qubits, quantum dots, trapped ions, neutral atoms, and electrons on helium. On NVIDIA's own QCalEval benchmark, the first standardized test for quantum calibration models, Ising Calibration 1 scored 3.27 percent higher than Gemini 3.1 Pro, 9.68 percent higher than Claude Opus 4.6, and 14.5 percent higher than GPT 5.4. Those are real numbers attached to a new benchmark that NVIDIA developed with partners including EeroQ and Conductor, which means the test was built by parties with skin in the outcome.
The more durable claim is that AI is becoming a standard layer in quantum hardware development. Ising is already in use at Atom Computing, Harvard's John A. Paulson School of Engineering and Applied Sciences, Fermilab, and Academia Sinica, according to NVIDIA. This is not EeroQ-specific. If AI-driven calibration works across qubit modalities, it changes the economics of building quantum systems: the bottleneck shifts from human calibration time to raw hardware fidelity.
Whether that bottleneck actually moves depends on results nobody has published yet.