Your Noisy Qubits Are Lying to You; Now What?
The peer-reviewed result: 91-94% logical qubit fidelity, beyond breakeven, on IBM transmon hardware. What the CEO claimed on a podcast about Shor's algorithm accuracy is a different story.

Quantum Elements demonstrated beyond-breakeven logical qubit performance on a 127-qubit IBM transmon processor using logical dynamical decoupling (LDD), achieving 91-94% average logical Bell-state fidelities that outperformed the best unprotected physical Bell pairs on the same hardware. The technique applies carefully timed microwave pulses at the logical level, operating on encoded qubits rather than requiring additional physical qubits, to suppress error channels that conventional error correction codes cannot reach. The company developed the technique in part on Constellation, a hardware-faithful digital twin that reproduces the noise of real IBM hardware and scales to roughly 100 qubits, versus the typical 20-30 qubit limit of commercial simulators.
A quantum computing startup called Quantum Elements published a result in Nature Communications last month that matters more than its press release lets on. The company demonstrated a technique called logical dynamical decoupling on a 127-qubit IBM transmon processor, achieving average logical Bell-state fidelities of 91 to 94 percent, with post-selected fidelity reaching 98 percent, according to HPCwire. More importantly, the encoded logical qubits outperformed the best unprotected physical Bell pairs on the same hardware. That is beyond-breakeven performance: the error-corrected system holding quantum information more reliably than the raw hardware underneath it. That is not supposed to happen yet, not reliably, not without significantly more qubits.
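For context on the headline metric: Bell-state fidelity is the overlap between the prepared two-qubit state and the ideal entangled pair. A minimal numerical sketch of how that number is computed, using a toy depolarizing-noise model (illustrative only, not the paper's actual noise channel or data):

```python
import numpy as np

# Ideal two-qubit Bell state |Phi+> = (|00> + |11>)/sqrt(2)
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_ideal = np.outer(phi_plus, phi_plus)

def depolarize(rho, p):
    """Two-qubit depolarizing channel: with probability p the state
    is replaced by the maximally mixed state I/4."""
    return (1 - p) * rho + p * np.eye(4) / 4

def bell_fidelity(rho):
    """Fidelity of rho against the ideal Bell state: <Phi+|rho|Phi+>."""
    return float(np.real(phi_plus.conj() @ rho @ phi_plus))

# 10% depolarizing noise drags the fidelity down from 1.0
rho_noisy = depolarize(rho_ideal, 0.10)
print(round(bell_fidelity(rho_noisy), 4))  # prints 0.925
```

A fidelity in the low nineties, as reported here, means the logical pair retains most of its entanglement despite the hardware's noise.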
The paper, first posted to arXiv in March 2025 (arXiv:2503.14472) and published in Nature Communications on February 27, 2026, was developed in part using a hardware-faithful digital twin the company calls Constellation. The platform simulates the noise of actual IBM hardware with enough fidelity to develop and test error-correction strategies without touching the real machine. Constellation scales to around 100 qubits, while other commercial simulators typically max out at 20 to 30, per Network World. Error correction in superconducting systems has been a slow, expensive process of trial and error on cryogenic hardware. A digital twin that accurately models the noise environment could compress that cycle.
The technique itself is called logical dynamical decoupling (LDD), and it works by applying carefully timed microwave pulses at the logical level, periodically flipping the encoded qubits to suppress error channels that conventional error correction codes cannot reach. Most error correction approaches encode information across multiple physical qubits and then decode syndromes to catch errors. LDD operates on the encoded qubit itself, applying pulses that suppress noise accumulation between syndrome measurements. It does not require additional qubits. It requires a better control stack.
Quantum Elements, founded in 2023 in Los Angeles and backed by QNDL Participations and the USC Viterbi School of Engineering, combined LDD with quantum error detection using a four-qubit encoding called the [[4,2,2]] code. The code alone yielded 43 percent fidelity on the same hardware; adding LDD on top pushed that to 95 percent, per Network World. The paper's authors include scientists from Quantum Elements, the USC Center for Quantum Information Science and Technology, IBM, and the Institute for Quantum Information at RWTH Aachen University in Germany, according to HPCwire.
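The [[4,2,2]] code is small enough to sketch directly: its two stabilizers, XXXX and ZZZZ, flag any single-qubit Pauli error, which is what "error detection" (as opposed to correction) means here. A minimal check using the standard symplectic commutation rule (illustrative only, not the paper's implementation):

```python
import numpy as np

def pauli(label):
    """Encode a Pauli string like 'XIIZ' as (x, z) bit vectors."""
    x = np.array([c in 'XY' for c in label], dtype=int)
    z = np.array([c in 'ZY' for c in label], dtype=int)
    return x, z

def anticommutes(p, q):
    """Two Paulis anticommute iff their symplectic inner product is odd."""
    (x1, z1), (x2, z2) = p, q
    return (x1 @ z2 + z1 @ x2) % 2 == 1

# The [[4,2,2]] code: 4 physical qubits, 2 logical qubits, distance 2.
stabilizers = [pauli('XXXX'), pauli('ZZZZ')]

def detected(error):
    """An error is flagged if it flips the sign of any stabilizer."""
    return any(anticommutes(error, s) for s in stabilizers)

singles = ['I'*i + P + 'I'*(3 - i) for i in range(4) for P in 'XYZ']
print(all(detected(pauli(e)) for e in singles))  # True: every single-qubit error is flagged
print(detected(pauli('XXII')))                   # False: distance 2, some weight-2 errors pass
```

Detection without correction means a flagged run is simply discarded, which is also why the paper reports a higher post-selected fidelity than the raw average.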
The CEO, Izhar Medalsy, appeared on the Eye on AI podcast and described what sounds like a different result: his team improved Shor's algorithm from 80 percent to 99 percent accuracy on IBM hardware without changing the hardware itself. That claim appears in the podcast and in trade press coverage. It does not appear in the Nature Communications paper or the arXiv preprint. The paper reports entangled logical qubit fidelity, not algorithm-level accuracy on Shor's factoring algorithm. These are related but distinct achievements. Bell-state fidelities are necessary for useful quantum computation, but they do not guarantee high-fidelity performance on a specific algorithm. The gap between logical qubit quality and demonstrated algorithmic performance is where most quantum overclaiming lives.
The paper does not claim fault-tolerant quantum computing. It demonstrates that logical-level control works, that the performance is beyond what the underlying physical hardware can achieve unprotected, and that the path to better error correction does not necessarily require more qubits. Whether it holds at larger code distances and on different hardware generations is the thing to watch: if the technique generalizes, the overhead problem for near-term error-corrected quantum computing gets meaningfully easier without waiting for a full physical qubit scale-up.