Scientists have just simulated quantum technology on classical computer hardware




Lurking in the background of the quest for true quantum supremacy is an embarrassing possibility – that hyper-fast number-crunching tasks based on quantum trickery might just be a load of hype.

Now, two physicists, from the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland and Columbia University in the US, have come up with a better way to assess the potential of near-term quantum devices – by simulating the quantum mechanics they rely on using more traditional hardware.

Their study used a neural network developed by EPFL's Giuseppe Carleo and his colleague Matthias Troyer back in 2016, employing machine learning to approximate a quantum system tasked with carrying out a particular process.
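To picture what such a network does, here is a minimal sketch of a neural-network representation of a quantum state, using a restricted Boltzmann machine in the spirit of Carleo and Troyer's approach. The system size, parameter values, and function names below are illustrative assumptions, not the authors' actual code.

```python
import numpy as np

# Minimal sketch of a neural-network quantum state (restricted Boltzmann
# machine ansatz, in the spirit of Carleo & Troyer's 2016/2017 work).
# In practice the parameters (a, b, W) are trained so that psi(s)
# approximates the target state's amplitude for each spin configuration s.
# The sizes and random initialization here are purely illustrative.

rng = np.random.default_rng(0)
n_visible, n_hidden = 6, 12          # 6 "spins"/qubits, 12 hidden units

a = rng.normal(scale=0.01, size=n_visible)              # visible biases
b = rng.normal(scale=0.01, size=n_hidden)               # hidden biases
W = rng.normal(scale=0.01, size=(n_visible, n_hidden))  # couplings

def rbm_amplitude(s):
    """Unnormalized wavefunction amplitude psi(s) for spins s in {-1,+1}^n.

    Tracing out the hidden units analytically gives a product of
    2*cosh(...) factors, so evaluating psi is cheap even though the
    state it represents lives in a 2^n-dimensional space.
    """
    theta = b + s @ W
    return np.exp(a @ s) * np.prod(2 * np.cosh(theta))

# Example: amplitude of one spin configuration.
s = np.array([1, -1, 1, 1, -1, 1])
print(rbm_amplitude(s))
```

The appeal of this kind of ansatz is in the comment above: each amplitude costs only a handful of arithmetic operations to evaluate, even though the full state lives in an exponentially large space.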

Known as the Quantum Approximate Optimization Algorithm (QAOA), the process identifies optimal solutions to a problem about energy states from a list of possibilities – solutions that should produce the fewest errors when applied.
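As a rough picture of what QAOA is optimizing, the sketch below builds the algorithm's output state classically for a toy MaxCut problem and scans for the angles that maximize the expected cut. The four-node graph, the single-layer depth, and the coarse angle grid are all illustrative assumptions; a real quantum computer would prepare the same state with gates rather than storing the full state vector.

```python
import itertools
import numpy as np

# Toy, brute-force illustration of the QAOA idea for MaxCut on a
# 4-node ring graph. We build the full 2^n state vector classically,
# which is only feasible because n is tiny.

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4
dim = 2 ** n

def cut_value(bits):
    """Cost of a bitstring: how many graph edges it 'cuts'."""
    return sum(bits[i] != bits[j] for i, j in edges)

costs = np.array([cut_value([(z >> q) & 1 for q in range(n)])
                  for z in range(dim)], dtype=float)

X = np.array([[0, 1], [1, 0]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def mixer(beta):
    """exp(-i*beta*X) on every qubit, as one dense 2^n x 2^n matrix."""
    u1 = np.cos(beta) * I2 - 1j * np.sin(beta) * X
    u = np.array([[1.0]], dtype=complex)
    for _ in range(n):
        u = np.kron(u, u1)
    return u

def qaoa_expectation(gamma, beta):
    psi = np.full(dim, 1 / np.sqrt(dim), dtype=complex)  # |+...+> start
    psi = np.exp(-1j * gamma * costs) * psi              # cost layer (diagonal)
    psi = mixer(beta) @ psi                              # mixing layer
    return float(np.real(np.conj(psi) @ (costs * psi)))

# Scan a coarse grid of angles and report the best expected cut.
grid = np.linspace(0, np.pi, 25)
best = max((qaoa_expectation(g, b), g, b)
           for g, b in itertools.product(grid, grid))
print(f"best <cut> = {best[0]:.3f} at gamma={best[1]:.2f}, beta={best[2]:.2f}")
```

The classical optimizer's job – here, a naive grid search – is to find the angles that push the expected cost towards its optimum; on hardware, only the state preparation and measurement would run on the quantum chip.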

“There is a lot of interest in understanding what problems can be solved efficiently by a quantum computer, and QAOA is one of the more prominent candidates,” Carleo said.

The QAOA simulation developed by Carleo and Matija Medvidović, a graduate student at Columbia University, mimicked a 54-qubit device – modest in size, but well in line with the latest achievements in quantum technology.

While this was only an approximation of how the algorithm would work on a real quantum computer, it did a good enough job of standing in for the real thing.

Time will tell whether future physicists will quickly determine ground states in an afternoon of QAOA calculations on a bona fide machine, or take their time using proven binary code.

Engineers are still making incredible progress in harnessing the spinning wheel of probability trapped inside each qubit. The pressing question is whether current innovations will ever be enough to overcome the biggest hurdles facing this generation’s attempt at quantum technology.

At the heart of every quantum processor are units of computation called qubits. Each represents a wave of probability – one without a single defined state, but one that is neatly captured by a relatively simple equation.

Connect enough qubits together through a process known as entanglement, and this equation becomes increasingly complex.

As the number of linked qubits climbs from tens to hundreds to thousands, the kinds of computations their waves can represent will leave anything we can handle using conventional bits of binary code in the dust.
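That scaling is easy to make concrete: describing an n-qubit state exactly requires 2^n complex amplitudes. A quick back-of-the-envelope sketch, assuming 16 bytes per double-precision complex number:

```python
# How the exact description of an n-qubit state grows: 2**n complex
# amplitudes, each taking 16 bytes at double precision. By ~50 qubits
# the full state vector no longer fits in any ordinary computer's
# memory, which is why approximate classical methods matter.

for n in (10, 20, 30, 40, 50, 54):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30  # complex128 = 16 bytes
    print(f"{n:2d} qubits: {amplitudes:.3e} amplitudes, ~{gib:,.0f} GiB")
```

At 54 qubits – the size of the device simulated in this study – the exact state vector would occupy hundreds of petabytes, which is why approximate classical methods are needed at all.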

But the whole process is like weaving a lace rug out of cobwebs: each wave is a mere breath away from entangling with its surroundings, resulting in catastrophic errors. While we can reduce the risk of such errors, there is currently no easy way to eliminate them completely.

However, we might be able to live with the mistakes if there were an easy way to compensate for them. For now, the expected quantum speed-up risks being a mirage that physicists are left chasing.

“But the barrier of ‘quantum speedup’ is anything but rigid, and it is continually being reshaped by new research, also thanks to advances in the development of more efficient classical algorithms,” explains Carleo.

As tempting as it may be to use simulations like this to argue that classical computing retains an advantage over quantum machines, Carleo and Medvidović insist that the real value of their approximation is to establish benchmarks for what can be achieved in the current era of imperfect quantum technology.

Beyond that, who knows? Quantum technology is quite a gamble already. So far, though, it is one that seems to be paying off.

This research was published in npj Quantum Information.
