Quantum robustness measure proposed, but code not publicly available
A quantum computer makes errors constantly. The hardware is fragile, the signals are noisy, and the only tractable way to reason about what a large quantum system will actually do is to give up on tracking every detail and work with a simplified, coarse-grained description instead. The problem is knowing when that simplification still tells you something true about the underlying physics — and when the noise has overwhelmed it entirely.
That is the question a paper posted to arXiv on May 5 sets out to answer. The paper proposes a new robustness measure: given a particular detector's limitations, how much error can the microscopic quantum dynamics sustain before it no longer admits any valid coarse-grained description compatible with what you can observe at the macroscopic scale? Think of it as a budget. But here is a complication of the kind that rarely surfaces until a paper reaches its engineering sections: the code that implements this robustness measure does not exist. The arXiv entry for arXiv:2605.04112 contains no code repository link; the author-supplied tarball is LaTeX source and figure files only. There is no implementation for others to run, test, or independently verify the results.
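To make the budget concrete in symbols (the notation here is illustrative; the paper's own symbols may differ): write C for the coarse-graining map and E_p for the noisy microscopic dynamics at noise strength p. A coarse-grained description is valid when some effective dynamics closes the diagram, and the budget is the largest noise for which one does:

```latex
% Notation is illustrative, not necessarily the paper's: C is the
% coarse-graining map, \mathcal{E}_p the microscopic dynamics at noise
% strength p, and \tilde{\mathcal{E}} a candidate effective dynamics.
\exists\, \tilde{\mathcal{E}}\ \text{(CPTP)} :\quad
  \tilde{\mathcal{E}} \circ C \;=\; C \circ \mathcal{E}_p ,
\qquad
R \;=\; \max \bigl\{\, p : \text{such an } \tilde{\mathcal{E}} \text{ exists} \,\bigr\}.
```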
The paper does more than introduce the robustness measure. It frames the coarse-graining problem as a form of probabilistic inference: given partial knowledge of a detector's behavior, what macroscopic dynamics can an observer rationally infer about the microscopic system underneath? This connects to the quantum conditional states formalism developed by Leifer and Spekkens in a foundational 2013 paper. The authors call their perspective "subjective Bayesian": the coarse-grained description is not a property of the system alone, but of what an observer can deduce given their limited access to it.
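For readers new to that formalism, a sketch of the analogy (standard Leifer-Spekkens material, not drawn from the new paper): classical Bayesian reasoning conditions a joint distribution on a marginal, and the conditional-states framework does the same with operators.

```latex
% Classical conditioning and its Leifer-Spekkens operator analogue; the
% conditional state \rho_{B|A} plays the role of P(B|A).
P(B \mid A) = \frac{P(A,B)}{P(A)}
\quad\longleftrightarrow\quad
\rho_{B|A} = \bigl(\rho_A^{-1/2} \otimes I_B\bigr)\, \rho_{AB}\, \bigl(\rho_A^{-1/2} \otimes I_B\bigr).
```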
Here the paper hits its first wall. Their analytical solution, the Bayesian inversion that yields a valid coarse-grained dynamics, works only case by case. "The dynamics it determines are shown to be analytically limited — it solves the problem in a state-by-state case," the paper concedes. A hardware designer who wants to check whether a given noise profile is compatible with a coarse-grained model cannot plug in all possible initial states at once; they must solve a separate equation for each one. That is not a practical tool.
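The paper's inversion is its own construction, but a standard point of comparison illustrates why quantum Bayesian inversions tend to be state-bound. The Petz recovery map, the textbook quantum analogue of Bayes' rule for a channel E, carries the prior state γ explicitly in its definition:

```latex
% Petz recovery map for channel \mathcal{E} and prior \gamma, shown for
% comparison only; the paper's Bayesian inversion is its own construction.
\mathcal{R}_{\gamma,\mathcal{E}}(\sigma)
  = \gamma^{1/2}\, \mathcal{E}^{\dagger}\!\bigl(
      \mathcal{E}(\gamma)^{-1/2}\, \sigma\, \mathcal{E}(\gamma)^{-1/2}
    \bigr)\, \gamma^{1/2}.
```

Change the prior γ and the inversion changes with it; a map of this kind answers the question one state at a time.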
The authors acknowledge this directly and pivot to a computational workaround: they use semidefinite programming to investigate effective dynamics in four paradigmatic coarse-graining scenarios, searching numerically for cases where a valid emergent dynamics exists even when the analytical solution fails. For each of those scenarios they compute the robustness measure, the noise budget, and this is the genuinely new ingredient: the largest amount of microscopic noise that can be added without destroying the compatibility between the coarse-grained description and the underlying unitary evolution.
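The paper's own programs are unavailable, but the structure of such a compatibility test is standard enough to sketch. Below is a minimal, illustrative version in Python with cvxpy, under assumptions that are mine rather than the paper's: two qubits coarse-grained to one by tracing out the second, and microscopic dynamics given by a CNOT mixed with global depolarizing noise at strength p. The test asks whether any completely positive, trace-preserving (CPTP) effective map exists satisfying the intertwining condition above, by optimizing over its Choi matrix.

```python
# Illustrative compatibility test as a semidefinite program (not the paper's
# code, which is unavailable). Scenario and channel choices are assumptions.
import numpy as np
import cvxpy as cp

d_m, d_M = 4, 2  # microscopic (two qubits) and macroscopic (one qubit) dims

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def micro_channel(rho, p):
    """Noisy microscopic dynamics E_p: CNOT mixed with global depolarizing noise."""
    out = CNOT @ rho @ CNOT.conj().T
    return (1 - p) * out + p * np.trace(rho) * np.eye(d_m) / d_m

def coarse_grain(rho):
    """Coarse-graining map C: trace out the second qubit."""
    return rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

def compatible(p):
    """Is there a CPTP map E~ with E~(C(rho)) == C(E_p(rho)) for every rho?"""
    J = cp.Variable((d_M**2, d_M**2), hermitian=True)  # Choi matrix of E~
    constraints = [
        J >> 0,                                                  # complete positivity
        cp.partial_trace(J, [d_M, d_M], axis=1) == np.eye(d_M),  # trace preservation
    ]
    # Linearity lets us impose the condition on a basis of matrix units only.
    for i in range(d_m):
        for j in range(d_m):
            B = np.zeros((d_m, d_m), dtype=complex)
            B[i, j] = 1.0
            lhs = coarse_grain(micro_channel(B, p))   # C(E_p(B)), a constant
            sigma = coarse_grain(B)                   # C(B)
            # Apply E~ via its Choi matrix: E~(sigma) = Tr_in[(sigma^T (x) I) J]
            rhs = cp.partial_trace(np.kron(sigma.T, np.eye(d_M)) @ J,
                                   [d_M, d_M], axis=0)
            constraints.append(rhs == lhs)
    problem = cp.Problem(cp.Minimize(0), constraints)
    problem.solve(solver=cp.SCS)
    return problem.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE)
```

In this toy scenario the test is infeasible at p = 0 (a CNOT correlates the kept qubit with the discarded one, so no map of the reduced state alone reproduces the reduced dynamics) and feasible at p = 1, where everything is replaced by the maximally mixed state.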
That robustness measure is the paper's most concrete engineering contribution, in principle calculable by a quantum hardware designer. But it comes with two significant asterisks. First, the measure is demonstrated only in those four paradigmatic scenarios, all computed via a semidefinite program that has not been open-sourced. Second, whether the same approach scales to realistic quantum hardware sizes — systems with dozens or hundreds of qubits, complex connectivity, and non-Markovian noise — is not established. Every leading quantum computing platform — superconducting qubits, trapped ions, neutral atoms — relies on coarse-grained descriptions for error modeling, control optimization, and classical simulation. If a robustness measure could tell a designer, before running an experiment, whether a coarse-grained model will remain meaningful at their noise levels, that is a useful constraint. The question is whether the analytical limitations and the computational tractability of the semidefinite program will permit that calculation at the scales that matter.
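Given a feasibility oracle like `compatible` above, computing the measure reduces to a one-dimensional search over the noise strength. A sketch, again under my assumptions rather than the paper's: channels admitting a valid coarse-grained description form a convex set, so along a line of channels E_p feasibility flips at most once and bisection locates the threshold. In the paper's framing the measure is the largest p for which compatibility survives; in the toy CNOT scenario the direction happens to be reversed (noise restores compatibility), but the same oracle serves either way.

```python
def compatibility_threshold(lo=0.0, hi=1.0, tol=1e-3):
    """Bisect for the noise strength at which compatibility flips.

    Assumes exactly one flip on [lo, hi], which holds when E_p traces a
    line segment in channel space: the compatible channels form a convex
    set, so the feasible parameters form an interval.
    """
    assert compatible(lo) != compatible(hi), "no flip on this interval"
    feasible_hi = compatible(hi)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if compatible(mid) == feasible_hi:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

print(compatibility_threshold())  # noise level where a valid E~ first appears
```

Each bisection step solves one semidefinite program whose size grows with the square of the microscopic Hilbert-space dimension, which is exactly where the scaling question for realistic devices bites.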
A companion paper published in Physical Review A in August 2025 by Davalos et al. addresses related coarse-graining problems using a maximum entropy principle, representing a separate line of attack on the same underlying question. The authors of the new paper note their work is intended to be read alongside that one.
The authors are not from a major quantum computing lab. The team is based entirely at Brazilian universities and Chapman University in California. This is not a research consortium with a commercial product to announce. It is a foundations and mathematical physics group, and the paper reads like one: careful, explicit about its own limitations, and in no hurry to promise more than it has shown. Quantum papers from academic physics groups tend to be more careful about scope than papers with a funding announcement attached. The trade-off is that without an open-source implementation, the semidefinite programming results cannot be independently verified — a gap the paper itself does not acknowledge.
Without an open-source implementation, the practical tool described in the paper remains inaccessible: a promising proof of concept waiting for someone to build the bridge to real quantum hardware. The paper has not demonstrated that its computational method can be run by anyone besides its authors, or that the robustness measure scales beyond the four paradigmatic scenarios to realistically sized devices. Whether that gap closes is where the engineering relevance of this work will ultimately be decided.
This work has been posted as a preprint and has not yet undergone peer review.