Noise Isn't the Problem—It's the Protocol
A 7x jump in quantum state reconstruction came not from better qubits, but from a protocol that treats noise as data, not damage.

Researchers demonstrated quantum state tomography scaling from 13 to 96 qubits by combining randomized-measurement classical shadows with a matrix product operator (MPO) representation fitted via tensor optimization, analogous to DMRG. Unlike standard tomography, which reconstructs ideal quantum states, this protocol captures the actual hardware state including noise and decoherence, making it directly useful for benchmarking real quantum processors. The experimental demonstration on a superconducting processor used 2 million measurement outcomes with low bond dimensions (ℓ = 2, χ′ = 4), indicating computational efficiency even for moderately entangled states.
- Quantum tomography scalability jumped 7x (13 → 96 qubits) by replacing full density-matrix reconstruction with a compressed MPO tensor-network representation
- The protocol reconstructs quantum states as they actually exist on hardware, noise effects included, unlike standard tomography, which targets ideal states
- Low bond dimensions (ℓ = 2, χ′ = 4) achieved accurate reconstruction, suggesting the method remains efficient for states with moderate entanglement complexity
Quantum tomography just escaped the small-system regime. A team from Université Grenoble Alpes, the Institute for Quantum Optics and Quantum Information Innsbruck, and the University of Innsbruck published a protocol in Physical Review Letters (PRL 136, 090801) that reconstructs quantum states in systems up to 96 qubits — using a method that bakes in noise and decoherence rather than papering over them. The jump is real: randomized-measurement quantum state tomography had been limited to 13 qubits in prior work. The new result runs to 96 on a superconducting processor. That's a practical threshold, not a press-release one.
The core advance is architectural. The protocol takes classical shadows from local randomized measurements as input — that's the established part, pioneered in the quantum characterization literature. What changes is the output representation. Instead of trying to reconstruct a full density matrix, the method fits a matrix product operator (MPO) representation via sequential tensor optimization, in a procedure analogous to the density matrix renormalization group (DMRG) algorithm. The MPO representation is a tensor network — a compressed description of a quantum state that captures entanglement structure. But this isn't just a compression trick. The tensor networks used here also encode extensive information about the noise and decoherence affecting the system, which makes the protocol directly useful for benchmarking prototype quantum computers and for correcting noise in their data.
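The classical-shadow half of that pipeline can be shown at toy scale. The sketch below is my own single-qubit illustration, not the paper's code: it draws random Pauli bases, samples outcomes, and inverts the measurement channel, so the average of many snapshots converges to the state as it actually is, noise included. The paper's protocol does this with local measurements on many qubits and then fits an MPO to the snapshots instead of averaging them into a dense matrix.

```python
import numpy as np

# Basis-change unitaries: measure X, Y, or Z by rotating, then reading out Z.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
Sdg = np.diag([1.0, -1.0j])
ROT = {"X": H, "Y": H @ Sdg, "Z": np.eye(2)}
I2 = np.eye(2)

def shadow_snapshot(rho, rng):
    """One classical-shadow snapshot: draw a random Pauli basis, sample an
    outcome b, then invert the measurement channel via 3 U†|b><b|U - I."""
    U = ROT[rng.choice(list(ROT))]
    probs = np.clip(np.real(np.diag(U @ rho @ U.conj().T)), 0, None)
    b = rng.choice(2, p=probs / probs.sum())
    ket = np.zeros((2, 1), dtype=complex)
    ket[b] = 1.0
    return 3 * U.conj().T @ (ket @ ket.conj().T) @ U - I2

rng = np.random.default_rng(0)
rho = np.array([[0.9, 0.3], [0.3, 0.1]])  # a valid single-qubit state
est = sum(shadow_snapshot(rho, rng) for _ in range(20000)) / 20000
# est converges to rho as the number of snapshots grows
```

Each snapshot has unit trace by construction, and the estimator is unbiased; the statistical error shrinks as one over the square root of the snapshot count.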
That last point is the part that gets buried in most write-ups, and it's the part that matters for engineers. Standard quantum state tomography tries to characterize what the ideal quantum state should be, absent noise. The MPO representation tries to characterize what the actual quantum state is, including what the hardware does to it. Those are different problems. For benchmarking a real system — tracking how error rates change across a chip, measuring coherent leakage, characterizing cross-talk — you want the second, not the first. This protocol delivers the second.
The experimental demonstration used 96 qubits on a superconducting quantum processor, with a depth-1 circuit generating entangled states. The measurement protocol ran 2048 bases with 1024 shots each — roughly 2 million randomized measurement outcomes to reconstruct the state. Parameters were set to (ℓ, χ′) = (2, 4), a relatively low bond dimension, suggesting the method is efficient even when the state's entanglement structure is not highly complex. The paper notes the approach can be extended to deeper circuits and larger systems.
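The measurement budget is worth making explicit. A minimal sketch, using the qubit count, basis count, and shot count reported above; the `settings` array is purely illustrative of how one independent Pauli basis per qubit is drawn for each setting:

```python
import numpy as np

N_QUBITS = 96
BASES, SHOTS = 2048, 1024  # basis settings x repetitions per setting

# One independent random Pauli basis per qubit, per setting.
rng = np.random.default_rng(1)
settings = rng.choice(["X", "Y", "Z"], size=(BASES, N_QUBITS))

total_outcomes = BASES * SHOTS  # 2_097_152: the "roughly 2 million" outcomes
```

Note that the budget grows with the number of settings and shots, not with the exponential size of the state space — that is the point of the randomized-measurement approach.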
The paper's authors include Matteo Votto and Cécilia Lancien (Université Grenoble Alpes, CNRS, LPMMC), Marko Ljubotina, Maksym Serbyn, and Lorenzo Piroli, alongside J. Ignacio Cirac and Peter Zoller (IQI Innsbruck and University of Innsbruck). The work was funded by the French National Research Agency via the JCJC QRand project, Plan France 2030 EPIQ, QUBITAF, and the HQI program.
One clarification that should not get lost: Benoît Vermersch, one of the authors, is on leave from LPMMC at the quantum startup Quobly. Quobly is building a silicon spin qubit processor. The paper's demonstration ran on a superconducting quantum processor, not silicon spin qubits. The Quobly connection is a commercial one — Vermersch's affiliation — not a hardware tie. These are different qubit modalities, and conflating them would be a mistake.
Why does scale matter here? Because the number of parameters needed to specify a quantum state grows as 4^N for N qubits. A full density matrix for a 20-qubit system requires roughly 1.1 trillion parameters. Classical tractability requires structural assumptions — that the state has limited entanglement, that it can be represented efficiently in a tensor network. The MPO approach makes those assumptions explicit and then verifies whether the data actually supports them. If a system's noise is too high and entanglement doesn't follow a low-rank structure, the MPO won't fit well — and that's itself useful diagnostic information.
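The scaling argument can be checked directly. A rough count, assuming an MPO of uniform bond dimension χ built from tensors of shape (χ, 2, 2, χ) and ignoring boundary tensors:

```python
def full_dm_params(n_qubits):
    """Complex entries in an unconstrained n-qubit density matrix: (2^n)^2."""
    return 4 ** n_qubits

def mpo_params(n_qubits, chi, d=2):
    """Rough entry count for an MPO of uniform bond dimension chi:
    n tensors of shape (chi, d, d, chi), boundary tensors ignored."""
    return n_qubits * chi * chi * d * d

print(full_dm_params(20))     # 1_099_511_627_776: the ~1.1 trillion in the text
print(mpo_params(96, chi=4))  # 6_144: linear in the number of qubits
```

The contrast is the whole story: exponential growth for the dense representation versus linear growth for the MPO, provided the low-bond-dimension assumption actually holds for the state at hand.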
Randomized measurements have been used for quantum state tomography before, but only at small scale. The previous ceiling of 13 qubits was not a software limitation — it was a fundamental tractability problem. The 35-qubit benchmark cited in some coverage refers to non-randomized methods, which operate differently. For randomized measurements specifically, 96 qubits is a genuine step change.
This is not a result that directly advances the race to logical qubits or error-corrected computation. It's a characterization tool — one that gives hardware teams a way to look at a 96-qubit entangled state and understand what's actually happening in it, noise included. That's unglamorous work. It's also necessary work. The machines are being built. Now the tools to measure them are catching up.