Study reveals limits on the extent to which quantum errors can be ‘undone’ in large-scale systems

Credit: Quek et al.

Quantum computers have the potential to outperform conventional computers on some information-processing tasks, perhaps even in machine learning. So far, however, this advantage has not been realized in practice, mainly because of their sensitivity to noise, which causes them to make errors.

One method designed to handle these errors, known as quantum error correction, works “on the fly,” detecting errors and restoring the computation as they occur. Despite significant recent progress, this strategy remains very complex to implement and carries a large resource overhead.

Another approach, known as quantum error mitigation, works more indirectly: instead of correcting errors as they occur, one runs the error-filled computation (or modified versions of it) from start to finish, and only at the end post-processes the outputs to infer what the correct result would have been. This method was introduced as a stopgap for dealing with the errors made by quantum computers until full error correction can be implemented.
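To make this “run noisy, post-process afterwards” idea concrete, here is a minimal Python sketch under a deliberately simple assumption of our own (a global depolarizing noise model that shrinks every expectation value by a fixed factor per circuit layer; all numbers are made up and this is not the scheme analyzed in the paper):

```python
import numpy as np

# Toy noise model (our illustrative assumption, not the paper's setting):
# global depolarizing noise shrinks the ideal expectation value as
# <O>_noisy = f**depth * <O>_ideal, with f the per-layer fidelity.
rng = np.random.default_rng(0)

ideal_value = 0.8          # hypothetical ideal expectation value
per_layer_fidelity = 0.99  # hypothetical per-layer noise strength
depth = 50                 # number of circuit layers
shots = 10_000             # repetitions of the noisy experiment

# Each shot of a +/-1-valued observable is a random +/-1 outcome whose
# mean equals the (shrunken) noisy expectation value.
noisy_mean = per_layer_fidelity**depth * ideal_value
samples = rng.choice([1.0, -1.0], size=shots,
                     p=[(1 + noisy_mean) / 2, (1 - noisy_mean) / 2])

# Mitigation as classical post-processing: divide the known shrink
# factor back out. The estimator is unbiased, but its statistical
# error is amplified by that very same factor.
shrink = per_layer_fidelity**depth
print(f"noisy estimate:     {samples.mean():+.3f}")
print(f"mitigated estimate: {samples.mean() / shrink:+.3f}  (ideal {ideal_value:+.3f})")
```

Note the trade this toy model exposes: removing the bias is easy, but the divided-out factor, and with it the statistical error bar, grows exponentially with circuit depth.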

However, researchers at the Massachusetts Institute of Technology, the École Normale Supérieure in Lyon, the University of Virginia and the Freie Universität Berlin have now shown that quantum error mitigation methods become drastically less efficient as quantum computers are scaled up.

This implies that error mitigation will not be a long-term solution to the persistent noise problem of quantum computing. Their paper, published in Nature Physics, offers guidance on which schemes for reducing the negative effects of noise in quantum computations cannot be expected to work.

“We were thinking about the limitations of near-term quantum computing with noisy quantum gates,” Jens Eisert, co-author of the paper, told Phys.org.

“Our colleague Daniel Stilck França had recently proved a result that amounted to strong limitations on near-term quantum computing. We had also been thinking about quantum error mitigation at the time, and we thought: wait, what does that mean for quantum error mitigation?”

The latest paper by Yihui Quek, Daniel Stilck França, Sumeet Khatri, Johannes Jakob Meyer and Jens Eisert builds on this research question, aiming to map out the ultimate limits of quantum error mitigation. Their findings reveal the extent to which quantum error mitigation can help reduce the impact of noise on near-term quantum computing.

“Quantum error mitigation was intended as a substitute for quantum error correction, one that requires far less precision engineering, so there was hope that it could be achieved even with current experimental capabilities,” Yihui Quek, lead author of the paper, told Phys.org.

“But when we looked at these relatively simple mitigation schemes, we started to realize that you probably can’t have your cake and eat it too: yes, mitigation requires fewer qubits and less control, but that often comes at the cost of running the entire system worryingly many times over.”

Another example of a mitigation strategy that the team found to have limitations is known as ‘zero-noise extrapolation,’ a scheme that, counterintuitively, works by deliberately increasing the noise in a system and then extrapolating the results back to the zero-noise condition.

“Basically, to combat noise, you have to add more noise to your system,” Quek explained. “Even in the abstract, it is clear that this can only be pushed so far.”
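As an illustration of the idea behind zero-noise extrapolation (a toy sketch of ours with made-up numbers, not the scheme as implemented on any real device): one runs the circuit at several deliberately amplified noise levels, fits a curve through the measured expectation values, and extrapolates the fit back to zero noise.

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_expectation(noise_scale, shots=20_000):
    """Stand-in for a real device run (purely a toy model): the true
    expectation value 0.7 decays exponentially with the deliberately
    amplified noise level, plus shot noise from finite sampling."""
    mean = 0.7 * np.exp(-0.4 * noise_scale)
    return mean + rng.normal(scale=1 / np.sqrt(shots))

# Run at several amplified noise levels (1x = native hardware noise).
scales = np.array([1.0, 1.5, 2.0, 3.0])
values = np.array([noisy_expectation(s) for s in scales])

# Richardson-style extrapolation: fit a low-degree polynomial in the
# noise scale and evaluate the fit at zero noise.
coeffs = np.polyfit(scales, values, deg=2)
print(f"measured at 1x noise:     {values[0]:.3f}")
print(f"extrapolated to 0x noise: {np.polyval(coeffs, 0.0):.3f}")
print("true noiseless value:      0.700")
```

The catch is that extrapolating outside the measured range amplifies the statistical fluctuations of the data points, which is the scaling cost Quek alludes to.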

Quantum circuits (i.e., the programs run on quantum processors) consist of layer upon layer of quantum gates, each of which takes the computation of the previous layer and advances it further. If the gates are noisy, however, each layer of the circuit becomes a double-edged sword, since whatever a gate contributes to the computation is offset by the errors the gate itself introduces.

“This creates a serious dilemma: you need a lot of gates (that is, a deep circuit) in order to do non-trivial computations,” Quek said.

“However, a deep circuit is also noisy; it is likely to produce spurious outputs. So there is a race between the speed at which you can compute and the speed at which statistical errors accumulate.

“Our work shows that there are wide classes of circuits for which the latter is much faster than previously thought; so to mitigate the errors of these circuits, you would have to run them infeasibly many times, no matter what particular algorithm you use to mitigate the errors.”
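A back-of-the-envelope calculation (ours, with hypothetical numbers, not taken from the paper) shows the flavor of this race: if each noisy layer shrinks the signal by a constant factor, the number of circuit repetitions needed to resolve the signal above shot noise grows exponentially with depth.

```python
# Back-of-the-envelope (hypothetical numbers): the signal shrinks like
# f**depth, while shot noise only shrinks like 1/sqrt(shots), so the
# repetitions needed to resolve the signal scale like (1/f)**(2*depth).
per_layer_fidelity = 0.99  # hypothetical per-layer noise strength

for depth in (10, 100, 1000, 2000):
    signal = per_layer_fidelity**depth
    shots_needed = signal**-2
    print(f"depth {depth:5d}: signal ~ {signal:.1e}, "
          f"shots needed ~ {shots_needed:.1e}")
```

Even with 99% per-layer fidelity in this toy reckoning, a 2,000-layer circuit would need on the order of 10^17 repetitions, which is why no post-processing trick can rescue such a run.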

The recent study by Quek, Eisert and their colleagues suggests that quantum error mitigation is not as scalable as many had hoped. In fact, the team found that as quantum circuits grow, the effort or resources required to mitigate their errors increase drastically.

“Like all no-go theorems, we like to see ours less as a showstopper than as an invitation,” Eisert said.

“Perhaps by working with geometries that are only locally connected, one arrives at more optimistic regimes, where perhaps our bound is too pessimistic. Common hardware architectures often have such local connectivity. Our study can also be seen as an invitation to think about more sophisticated schemes of quantum error mitigation.”

The findings gathered by this research team could serve as a guide for physicists and quantum engineers around the world, encouraging them to design new and more effective quantum error mitigation schemes. In addition, they could inspire further studies focusing on the theoretical underpinnings of noisy quantum circuits.

“Previous work on quantum error mitigation examined individual algorithms and suggested that those specific schemes would not be scalable,” Quek said.

“We came up with a framework that captures a large class of these individual algorithms. This allowed us to argue that the obstruction others had seen is inherent to the very concept of quantum error mitigation, and has nothing to do with any particular implementation.

“This was made possible by the mathematical machinery we developed, which yields the strongest results known so far on how quickly circuits can lose quantum information due to physical noise.”

In the future, the paper by Quek, Eisert and their colleagues could help researchers to quickly identify the kinds of quantum error mitigation schemes that cannot work. A key implication of the group’s results is that long-range gates (i.e., gates acting on qubits separated by a large distance) are both useful and problematic: they introduce the entanglement a quantum computer needs for its power, but they also spread noise through the system quickly.

“This, of course, opens up the question of whether it is possible to obtain a quantum advantage without using these quantum super-spreaders and the noise they spread,” Quek added. “Importantly, all of our results no longer hold when fresh qubits can be injected in the middle of the computation, so a steady supply of those may be what is needed.”

In their next studies, the researchers plan to shift their focus from the limitations they have identified to potential ways of overcoming them. Some of their colleagues have already made progress in this direction, using a combination of benchmarking techniques and quantum error mitigation.

More information:
Yihui Quek et al, Exponentially tighter bounds on limitations of quantum error mitigation, Nature Physics (2024). DOI: 10.1038/s41567-024-02536-7

© 2024 Science X Network

Citation: Study reveals limits on the extent to which quantum errors can be ‘undone’ in large-scale systems (2024, August 11) retrieved 11 August 2024 from https://phys.org/news/2024-08-unveils-limits-extent-quantum-errors.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.

