A finance professor has coined a term for the gap between what quantum computing announcements promise and what they actually demonstrate: quantum washing. Lionel Martellini, who directs the EDHEC Quantum Institute at EDHEC Business School and spent more than 30 years in financial engineering, introduced the term on a Quantum Computing Report podcast this week and immediately applied it to a specific case — IBM and Vanguard's 2024 announcement that their quantum computer had built a better bond ETF portfolio.
The timing is the story. IBM and Vanguard's announcement is five months old. The technical paper behind it has been public for seven months. What is new is Martellini's frame: a senior finance academic using the word "washing" in public, applied by name to a specific industry claim, backed by a precise mechanism.
The mechanism is the problem itself. The problem IBM's 109-qubit Heron r1 processor solved — constructing a simplified bond ETF portfolio using up to 4,200 gates — is the kind of task that existing classical solvers handle in seconds, according to an arXiv benchmark study. The IBM Quantum Blog announcement called it a quantum milestone anyway.
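To see what "seconds" means in practice, here is a minimal sketch of the kind of task at issue: a classical mean-variance portfolio optimization over a small universe, solved with an off-the-shelf solver. The data, universe size, and constraints are all synthetic illustrations, not the IBM/Vanguard problem.

```python
# Hedged sketch: classical mean-variance optimization on synthetic data,
# to illustrate the class of problem classical solvers dispatch quickly.
import time
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 50                                  # hypothetical universe of 50 bonds
mu = rng.normal(0.03, 0.01, n)          # synthetic expected returns
A = rng.normal(size=(n, n))
cov = A @ A.T / n + np.eye(n) * 1e-3    # synthetic positive-definite covariance

def objective(w, risk_aversion=5.0):
    # Maximize mu.w - (lambda/2) w.Cov.w  ->  minimize the negative.
    return -(mu @ w) + 0.5 * risk_aversion * (w @ cov @ w)

start = time.perf_counter()
res = minimize(
    objective,
    x0=np.full(n, 1.0 / n),
    method="SLSQP",
    bounds=[(0.0, 0.1)] * n,            # long-only, 10% position cap
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
)
elapsed = time.perf_counter() - start
print(f"solved: {res.success}, weights sum {res.x.sum():.4f}, in {elapsed:.3f}s")
```

On commodity hardware this finishes in a fraction of a second; production systems use far larger universes, but purpose-built classical solvers scale to those as well.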
The washing, as Martellini describes it, comes in two forms. The first is straightforward: companies claim quantum advantage today using hardware that won't be reliable until fault-tolerant quantum computers exist — a decade or more away. Current machines, known in the field as NISQ (noisy intermediate-scale quantum) devices, are too constrained by computation size and noise to tackle production-scale financial problems, according to Polytechnique Insights. The second form is subtler and, in Martellini's view, more pervasive. It involves designing finance problems to look quantum-solvable when real-world practice never handles them that way.
The canonical example is portfolio optimization. The problem IBM and Vanguard framed as quantum — selecting which assets to include and computing the optimal weighting in a single step — is not, in actual financial practice, a single problem. Security selection and portfolio optimization are separate processes, driven by factor exposures, ESG mandates, and regulatory constraints. Practitioners solve them separately. The combined problem that appears in quantum finance papers is, as Martellini puts it, very rarely if ever encountered in real portfolio management. It is a problem engineered to be quantum-solvable, not a problem finance actually has.
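The distinction can be made concrete with a sketch. Below, the "practice" path runs selection and weighting as two separate steps, while the "quantum-friendly" framing fuses them into a single binary optimization over inclusion variables — the combinatorial shape quantum finance papers target, brute-forced here for a toy universe. All screens, scores, and data are hypothetical illustrations.

```python
# Hedged sketch of the distinction Martellini draws. Synthetic data only.
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 12                                   # tiny hypothetical bond universe
mu = rng.normal(0.02, 0.01, n)           # synthetic expected returns
esg = rng.uniform(0, 1, n)               # synthetic ESG scores
A = rng.normal(size=(n, n))
cov = A @ A.T / n                        # synthetic covariance

# --- Practice: two separate steps ---
eligible = np.flatnonzero(esg > 0.5)     # step 1: screen (an ESG mandate, say)
if eligible.size == 0:                   # guard against a degenerate draw
    eligible = np.arange(n)
w = np.zeros(n)                          # step 2: weight only the survivors
w[eligible] = 1.0 / eligible.size        # (equal-weight stand-in for an optimizer)

# --- "Quantum-friendly" framing: one fused binary problem ---
# Choose inclusion bits x in {0,1}^n jointly with (implied equal) weights,
# maximizing return minus risk: a QUBO-shaped search, brute-forced here.
best_val, best_x = -np.inf, None
for bits in itertools.product([0, 1], repeat=n):
    x = np.array(bits, dtype=float)
    k = x.sum()
    if k == 0:
        continue
    val = mu @ x / k - 0.5 * (x @ cov @ x) / k**2
    if val > best_val:
        best_val, best_x = val, x

print("two-step weights sum:", w.sum())
print("fused-problem picks:", best_x.astype(int))
```

The fused version's search space doubles with every added asset, which is what makes it look like a natural fit for quantum annealers and QAOA-style algorithms — but, per Martellini's point, it is the framing, not the desk workflow, that creates that fit.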
Genuine quantum advantage in finance requires that the economic gain from quantum speedup exceed the additional costs of quantum hardware and error correction. That math does not work yet. Quantum amplitude estimation, the algorithm that could deliver a quadratic speedup for Monte Carlo pricing and risk management, requires fault-tolerant quantum computers with error-corrected qubits. Those do not exist. The CFA Institute noted in a recent research review that quantum computing will not remake finance overnight, and that near-term value will come from hybrid quantum-classical approaches rather than the dramatic speedups announced in press releases. A survey published this month by MDPI, Practical Applications of Quantum Computing in Finance, reached the same conclusion.
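What "quadratic speedup" means here is a statement about error scaling: a classical Monte Carlo estimate converges as 1/√N in the number of samples, while amplitude estimation promises 1/N in the number of quantum queries — on fault-tolerant hardware that does not yet exist. The sketch below demonstrates only the classical baseline, pricing a toy European call under a simple lognormal model with made-up parameters.

```python
# Illustration of classical Monte Carlo's 1/sqrt(N) error scaling, the
# baseline that quantum amplitude estimation would quadratically improve.
# Toy model and parameters; not a production pricer.
import numpy as np

rng = np.random.default_rng(2)
strike, spot, vol = 100.0, 100.0, 0.2    # hypothetical option parameters

def mc_price(n_samples):
    # Terminal price under a lognormal model (zero rate, unit maturity).
    z = rng.standard_normal(n_samples)
    s_t = spot * np.exp(-0.5 * vol**2 + vol * z)
    return np.maximum(s_t - strike, 0.0).mean()

truth = mc_price(10_000_000)             # high-N proxy for the true value
for n in (100, 10_000, 1_000_000):
    err = abs(mc_price(n) - truth)
    print(f"N={n:>9,}  |error| ~ {err:.4f}")
# Each 100x increase in N buys roughly a 10x error reduction (1/sqrt(N));
# amplitude estimation would buy roughly 100x, hence "quadratic speedup".
```

The economics Martellini points to follow directly: that quadratic saving has to outweigh the cost of error-corrected qubits before the advantage is real rather than announced.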
The gap between announcement and technical reality is not unique to IBM and Vanguard. It is the shape of the field. The hardware is real. The timelines announced alongside it are not.