Phys.org


A scalable and accurate tool to characterize entanglement in quantum processors

How the new QST method developed by the team works. Credit: Hu et al.

Quantum computers, computing systems that process information leveraging quantum mechanical effects, could soon outperform classical computers in various optimization and computational tasks.

To enable their reliable operation in real-world settings, however, engineers and physicists need to be able to precisely control and understand the quantum states that underpin the functioning of these devices.

A research team led by Dapeng Yu at the Shenzhen International Quantum Academy, Tongji University and other institutes in China recently introduced a new mathematical tool for characterizing quantum states in quantum processors with greater accuracy.

Their proposed method, outlined in a paper in Physical Review Letters, was successfully used to characterize entanglement between 17 qubits in a superconducting quantum processor.

"Our work was born out of a fundamental problem in quantum information technology," Shuming Cheng, co-senior author of the paper, told Phys.org.

"As we develop larger and more powerful quantum computers, how can we verify that they are running and functioning as desired? To answer this question, one commonly used approach is system identification, including quantum state tomography (QST) for characterizing unknown states of quantum systems, process tomography for quantum gates, and detector tomography for measurements."

Despite their potential, QST and other existing methods for determining whether quantum processors are operating as desired have proved difficult to scale, particularly when applied to larger systems composed of greater numbers of qubits. This is because they require the collection of numerous physical measurements, while also consuming significant computational power to perform the analysis.
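The scaling problem can be made concrete with a back-of-the-envelope count (our illustration, not a figure from the paper): full Pauli-basis QST on n qubits requires 3^n measurement settings, and an n-qubit density matrix has 4^n − 1 independent real parameters to estimate.

```python
# Rough illustration of why full QST scales poorly (not from the paper):
# Pauli-basis tomography of n qubits needs 3**n measurement settings,
# and the n-qubit density matrix has 4**n - 1 independent real parameters.
for n in (2, 5, 10, 17):
    settings = 3 ** n
    params = 4 ** n - 1
    print(f"{n:>2} qubits: {settings:>12,} settings, {params:>15,} parameters")
```

At 17 qubits this is over a hundred million measurement settings and more than 17 billion parameters, which is why a method that stays accurate with a limited, practical number of measurements matters.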

"The limitations of QST and similar methods have made it incredibly challenging to accurately characterize the large-scale quantum systems we are building," said Chang-Kang Hu, first author of the paper.

"The primary objective of our study was to develop a new QST method that is more scalable and accurate, even faced with noise and practical limitations of today's quantum hardware."

"We aimed to develop a useful tool that can provide a complete picture of any large-scale quantum system and also conclusively certify the presence of genuine multi-qubit entanglement in it, which is essential for quantum computation and communication."

"In relatively simple terms, our approach can be considered as using a sophisticated 'smart algorithm' to transform a blurry and incomplete photograph of quantum systems into sharp focus," said Cheng.

"In this analogy, the 'photograph' refers to the raw data collected from making measurements on the quantum processor, which is usually imperfect, due to noise and a limited number of measurements. Correspondingly, the smart algorithm is what we call a purity-regularized least-squares estimator."

To characterize quantum states, the tool developed by Dian Tan and Song Liu, co-senior authors of the paper, completes two main steps. First, it tries to find a mathematical description of the quantum state that best fits the collected measurement data.

Notably, this first step is also completed by previously introduced QST strategies. The second step completed by the tool, referred to as purity regularization, is the key novel aspect of the team's method.

The working pipelines of QST and entanglement verification. Credit: Physical Review Letters (2025). DOI: 10.1103/qy9y-7ywp

"Our method introduces an extra guiding principle by adding the knowledge about the state's purity into the process," explained Tan.

"Generally speaking, a 'pure' quantum state is a perfectly defined and noiseless state, while a 'mixed' state is degraded by noise and thus becomes more difficult to characterize. By guiding our reconstruction toward a state with the correct level of purity, we can dramatically improve accuracy and avoid many of the errors that plague other methods, especially when we can't perform an exhaustive number of measurements. It's like developing a photo-editing program to do a better job of deblurring a picture."
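The idea can be sketched in a single-qubit toy model (illustrative only: the penalty weight and target purity below are assumed values, not the authors' implementation). A qubit state is ρ = (I + r·σ)/2 with Bloch vector r, and its purity is Tr[ρ²] = (1 + |r|²)/2; purity regularization adds a term to the least-squares objective that pulls the reconstruction toward a chosen purity:

```python
import numpy as np

# Toy single-qubit purity-regularized least squares (illustrative sketch;
# lam and target_purity are assumed hyperparameters, not from the paper).
rng = np.random.default_rng(0)
r_true = np.array([0.0, 0.0, 1.0])          # pure |0> state, Bloch vector
m = r_true + rng.normal(0, 0.2, size=3)     # noisy Pauli expectation values

lam, target_purity = 1.0, 1.0               # penalty weight, prior purity

def loss(r):
    purity = (1.0 + r @ r) / 2.0            # Tr[rho^2] for Bloch vector r
    return np.sum((r - m) ** 2) + lam * (purity - target_purity) ** 2

def grad(r):
    purity = (1.0 + r @ r) / 2.0
    return 2.0 * (r - m) + 2.0 * lam * (purity - target_purity) * r

r = np.zeros(3)
for _ in range(2000):                       # plain gradient descent
    r -= 0.05 * grad(r)
    nrm = np.linalg.norm(r)
    if nrm > 1.0:                           # project back to physical states
        r /= nrm

print("purity of raw noisy estimate:", (1 + m @ m) / 2)
print("purity of regularized estimate:", (1 + r @ r) / 2)
```

In a real tomography pipeline the fit term compares many-qubit measurement frequencies rather than three Pauli expectations, but the structure — data-fit plus purity penalty — is the same guiding principle described above.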

To assess the potential of their proposed tool, Hu and Tan, along with their colleagues, implemented it on a real superconducting quantum processor that they created. The processor was configured to generate highly entangled multi-qubit states, particularly the so-called Greenberger-Horne-Zeilinger (GHZ) state, involving up to 17 qubits.

"Our method was used to reconstruct these states from the measured data, and the experimental results confirm that it achieves a state fidelity of 0.6817(1) for the 17-qubit GHZ state," said Hu.

"This is a remarkable result for a system of this size, where the state fidelity is an important measure of how close our reconstructed state was to the ideal target state. We believe our study contributes to the field of characterizing and benchmarking noisy quantum systems."
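The GHZ state and the fidelity figure quoted above are standard, well-defined quantities; a small-scale sketch (a few qubits rather than 17, written by us for illustration) shows how both are computed:

```python
import numpy as np

# The n-qubit GHZ state |GHZ_n> = (|0...0> + |1...1>) / sqrt(2),
# and the fidelity F = <psi| rho |psi> of a density matrix rho
# with a pure target state |psi>.
def ghz_state(n):
    psi = np.zeros(2 ** n)
    psi[0] = psi[-1] = 1 / np.sqrt(2)       # amplitudes on |0...0>, |1...1>
    return psi

def fidelity(rho, psi):
    return float(np.real(psi.conj() @ rho @ psi))

n = 3
psi = ghz_state(n)
rho_ideal = np.outer(psi, psi.conj())                         # noiseless GHZ
rho_noisy = 0.8 * rho_ideal + 0.2 * np.eye(2 ** n) / 2 ** n   # depolarized

print("ideal :", fidelity(rho_ideal, psi))   # ~1.0
print("noisy :", fidelity(rho_noisy, psi))   # ~0.825 = 0.8 + 0.2/8
```

For GHZ states, a fidelity above 0.5 is a standard witness of genuine multipartite entanglement, which is why the reported value of 0.6817(1) suffices to certify genuine 17-qubit entanglement.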

The team's full-state tomography characterization of a 17-qubit GHZ state is among the largest quantum state reconstructions performed to date on a hardware system. The results they attained in initial tests highlight both the accuracy and scalability of their characterization strategy.

"Our method shows superior accuracy compared to other common techniques, especially when using a limited, practical number of measurements," said Cheng.

"This is critically important, as it makes detailed characterization feasible for the larger processors currently being developed. Moreover, the results of our experiments conclusively certify the presence of genuine 17-qubit entanglement in our processor, confirming that it can reliably generate the complex quantum resources needed for powerful computations."

In the future, the approach developed by Hu, Tan, Cheng and their colleagues could be used by other researchers to reliably assess the functionality of quantum processors and characterize their underlying quantum states. This could in turn help to calibrate quantum processors and uncover issues that need to be fixed, potentially contributing to the future widespread deployment of quantum technologies.

"Our near-future goal will be to push the boundaries of scale by applying our method to even larger and more complex quantum systems," added Cheng.

"Beyond that, we plan to use this high-precision tool to analyze the states produced during the execution of different quantum algorithms, allowing us to better understand how noise impacts their performance and hence to develop more effective error mitigation strategies.

"Ultimately, our approach provides an effective pathway toward the full characterization of noisy, intermediate-scale quantum systems, which is a crucial step in the journey toward fault-tolerant quantum computing."


More information: Chang-Kang Hu et al, Full Characterization of Genuine 17-qubit Entanglement on the Superconducting Processor, Physical Review Letters (2025). DOI: 10.1103/qy9y-7ywp

© 2025 Science X Network

Citation: A scalable and accurate tool to characterize entanglement in quantum processors (2025, September 19) retrieved 19 September 2025 from /news/2025-09-scalable-accurate-tool-characterize-entanglement.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.
