Quantum computers still cannot break the encryption protecting your bank account. But two papers published this week bring the timeline closer, and one of them has a shareholder disclosure problem worth taking seriously before you reshare the headline.
The first paper, by researchers at Oratomic, the California Institute of Technology, and UC Berkeley, claims that Shor's algorithm, the routine that makes quantum computers a threat to elliptic curve cryptography, could run at cryptographically relevant scale with as few as 10,000 reconfigurable neutral-atom qubits; per the Cain et al. preprint on arXiv, a 26,000-qubit version would break ECC-256 in approximately ten days. That is a genuine reduction from prior estimates, which ranged from hundreds of thousands to millions of physical qubits for equivalent work. The second result, described in a Google blog post about work by the company's Quantum AI team, claims a different circuit architecture could break ECC-256 in minutes with fewer than 500,000 physical qubits, a reduction Google describes as approximately twenty-fold from prior ECC-256 estimates.
Neither result has been peer-reviewed, which matters because the Cain et al. paper carries a disclosure that should appear in any responsible coverage: all nine authors are shareholders in Oratomic, the Quantinuum subsidiary where six of them work. That does not automatically invalidate the result. It does mean the authors have a financial stake in the field moving toward shorter timelines. The paper's peer reviewers, once they exist, should address this. In the meantime, the press release wrote itself.
Google's approach is harder to evaluate. Rather than publishing the full quantum circuits, the company released a zero-knowledge proof demonstrating the result without exposing the underlying mechanism. Google frames this as responsible disclosure, giving cryptographers time to prepare without handing attackers a blueprint. That is a defensible position. It also conveniently prevents anyone outside Google from independently verifying the qubit count, the gate depth, or the runtime estimate. Cryptographers are being asked to trust a proof they cannot inspect. In a field where quantum security claims routinely outlive the papers that made them, that ask deserves scrutiny.
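The general idea of proving a claim without exposing its mechanism is well established in cryptography. The classic example is the Schnorr sigma protocol, which proves knowledge of a discrete logarithm without revealing it. The sketch below is a toy illustration of that concept only; Google has not published its proof system, and the tiny parameters here are my own, nowhere near cryptographic size.

```python
# Toy Schnorr sigma protocol: prove knowledge of a secret exponent x
# without revealing it. Illustrative parameters only, not a real system.
import secrets

p, q, g = 1019, 509, 4           # tiny prime modulus; g = 4 has prime order 509 mod 1019

x = secrets.randbelow(q)         # prover's secret
y = pow(g, x, p)                 # prover's public key

# One round of commit / challenge / response:
r = secrets.randbelow(q)         # prover's random nonce
t = pow(g, r, p)                 # commitment sent to verifier
c = secrets.randbelow(q)         # verifier's random challenge
s = (r + c * x) % q              # response; reveals nothing about x on its own

# Verifier checks the relation without ever learning x:
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof verified without revealing x")
```

The check passes because g^s = g^(r + cx) = g^r * (g^x)^c. A real zero-knowledge proof about a quantum circuit is vastly more involved, but the trust question is the same one the article raises: the verifier confirms a relation holds while the interesting object stays hidden.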
The technical context is real. Estimated requirements for running Shor's algorithm have fallen five orders of magnitude in just over a decade, from roughly one billion physical qubits in 2012 to about 10,000 today, as algorithmic improvements and hardware assumptions tightened. Neutral atom qubit arrays have a structural advantage here: unlike superconducting qubits arranged in a two-dimensional grid with only four nearest-neighbor connections, neutral atom systems allow all-to-all interaction, meaning any qubit can talk to any other qubit without the routing overhead that plagues grid topologies. Whether that advantage survives the engineering realities of scaling neutral atom systems to tens of thousands of qubits is a separate question that neither paper addresses.
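The routing overhead that all-to-all connectivity removes can be put in rough numbers. This back-of-envelope sketch, mine rather than either paper's, counts the SWAP operations needed on average to bring two uniformly random qubits adjacent on a square nearest-neighbor grid; an all-to-all architecture needs none.

```python
# Back-of-envelope: routing cost on a 2D nearest-neighbor grid vs. all-to-all.
# A two-qubit gate between qubits at Manhattan distance d on a grid needs
# roughly d - 1 SWAPs to make them adjacent; all-to-all coupling needs zero.

def avg_swap_overhead(side: int) -> float:
    """Average SWAPs per two-qubit gate between uniformly random
    distinct sites on a side x side nearest-neighbor grid."""
    sites = [(r, c) for r in range(side) for c in range(side)]
    total, pairs = 0, 0
    for i, (r1, c1) in enumerate(sites):
        for r2, c2 in sites[i + 1:]:
            dist = abs(r1 - r2) + abs(c1 - c2)   # Manhattan distance
            total += dist - 1                    # SWAPs to make the pair adjacent
            pairs += 1
    return total / pairs

for side in (4, 8, 16):
    print(f"{side * side:>4} qubits on a {side}x{side} grid: "
          f"~{avg_swap_overhead(side):.1f} SWAPs per gate (all-to-all: 0)")
```

The per-gate overhead grows with the grid's side length, so grid architectures pay a routing tax that compounds as qubit counts rise, which is exactly the cost the all-to-all claim sidesteps.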
The open question is the machine that would run these circuits. The Cain et al. paper presents both estimates as physical qubit counts: the 10,000 and 26,000 figures already incorporate error correction via high-rate quantum error-correcting codes and efficient logical instruction sets, as described in the paper. What it does not establish is whether the engineering required to operate a neutral-atom system at that scale is tractable within any near-term roadmap.
The Bitcoin ecosystem has the most immediate exposure. An estimated 6.9 million BTC sits in early wallets and reused addresses whose elliptic curve public keys are already visible on-chain, exactly the input a cryptographically relevant quantum computer would need. If those addresses are ever compromised, the transactions are irreversible: Bitcoin has no chargebacks, no fraud department, no recourse. RSA-2048, which financial institutions use widely for web security, would require approximately 102,000 qubits and roughly three months in a highly parallelized setup. That number should be read as a research estimate, not a countdown.
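The problem Shor's algorithm solves efficiently is the discrete logarithm, the same problem, in elliptic-curve form, that protects those Bitcoin keys. A toy brute-force version, with small illustrative parameters of my own choosing rather than a real curve, shows why classical search is hopeless at 2^256 scale: the classical work grows with the size of the group, while Shor's runtime grows only polynomially with its bit-length.

```python
# Toy discrete logarithm by brute force. ECC-256 security rests on this
# search being infeasible in a group of ~2^256 elements; here the group is
# tiny, so exhaustive search succeeds. Parameters are illustrative only.

def dlog_bruteforce(g: int, h: int, p: int) -> int:
    """Find the smallest x with g^x = h (mod p) by exhaustive search."""
    acc = 1
    for x in range(p):
        if acc == h:
            return x
        acc = (acc * g) % p
    raise ValueError("no discrete log found")

p, g = 1019, 2            # small prime; 2 generates the full group mod 1019
secret = 777
h = pow(g, secret, p)     # the "public key"

print(dlog_bruteforce(g, h, p))  # → 777 (the secret exponent)
```

Doubling the key size squares the classical search space but only modestly increases Shor's circuit, which is why the qubit-count estimates, not the key length, are the variable to watch.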
The honest version of this story is not "quantum computers just got much closer to breaking encryption." It is: two unpublished papers made bold claims about reduced qubit requirements for cryptanalysis, one comes from authors who own stock in a quantum company, the other withheld its technical details behind a proof, and the machines capable of running either attack do not yet exist. That is still worth knowing. The 20x reduction in qubit estimates is a real development for people building roadmaps. The shareholder disclosure is a real reason to hold your applause. And the fact that neither paper has been peer-reviewed is a real reason not to treat the numbers as settled.
Post-quantum cryptography standards from NIST are already being deployed precisely because the field has not waited for quantum computers to prove their capability before hardening against them. That migration is not a deadline to wait for; it is already underway, for reasons these papers reinforce even if they do not change the schedule.