
This morning at Google Next, John Hennessy and David Patterson, pioneers of computer architecture, pointed out that however groundbreaking it may prove, practical quantum computing is still at least a decade away. Even so, the search giant, like its hyperscale competitors, is investing in future quantum chips.
From standalone quantum system makers like D-Wave to chip and systems builders like IBM, the race to stake out this distant future has begun in earnest. For some, increasing the number of qubits is the way to demonstrate quantum supremacy, but for others like Google, it is less a question of qubit count than of reliable error correction for universal fault-tolerant quantum computers.
Speaking to this fault-tolerance question at the International Supercomputing Conference (ISC) this year, Kevin Kissell, Technical Director at Google Cloud, described the evolution from the 22-qubit "Foxtail" quantum architecture to the latest 72-qubit version of Bristlecone, with an emphasis on fault tolerance and extended fidelity in quantum simulations.
"To succeed in quantum computing, we do not only need a large number of qubits; it's necessary but not enough. What you have when you use a system can be considered as a kind of volume in space and time.When using a quantum computer, there are a number of qubits and an underlying error rate that you multiply by the number of operations perform.At a certain point, the fidelity of these operations because the limiting factor in terms of the number of successive quantum steps that can be performed. "
" We are now at a point interesting where the chips that we can build are not big enough for quantum computation fully corrected errors, the point where a logical unit has a lifetime of several hours or even days, "says Kissell. "It's still maybe hundreds or at most thousands of operations before decoherence.Quarter supremacy means performing interesting short-term applications that are too costly on clbadic machines with reliable error-correcting schemes that will work in the real world. "

In theory, this approach is rather simple: at any point in a quantum computation, the chip is tiled with alternating data qubits and measurement qubits. The measurement qubits are there only to detect whether any of the data qubits has flipped unexpectedly or taken an alternative value, since under entanglement such a flip would trigger the neighboring measurement qubits along the path. Interestingly, these data and measurement qubits can move, but the measurement qubits always stay arranged diagonally.
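As a rough illustration of this tiling idea, here is a minimal sketch using Google's open-source Cirq framework. It implements a 1-D repetition-code parity check rather than the full 2-D surface code described above; the line layout and qubit roles are our own simplification, not Google's design.

```python
# A minimal error-detection sketch in Cirq (pip install cirq). Data qubits
# alternate with measurement (ancilla) qubits, and each ancilla records the
# parity of its two data-qubit neighbours, flagging an unexpected bit flip
# without measuring the data qubits directly. This is a simplified 1-D
# analogue of the surface-code tiling, not Google's actual implementation.
import cirq

# Three data qubits interleaved with two measurement qubits on a line.
data = [cirq.LineQubit(i) for i in (0, 2, 4)]
ancilla = [cirq.LineQubit(i) for i in (1, 3)]

circuit = cirq.Circuit()

# Simulate a fault: an unexpected bit flip on the middle data qubit.
circuit.append(cirq.X(data[1]))

# Each ancilla accumulates the parity of its two neighbouring data qubits.
for i, a in enumerate(ancilla):
    circuit.append([cirq.CNOT(data[i], a), cirq.CNOT(data[i + 1], a)])

# Measuring only the ancillas reveals where parity changed (the "syndrome")
# while leaving the data qubits undisturbed.
circuit.append(cirq.measure(*ancilla, key='syndrome'))

result = cirq.Simulator().run(circuit, repetitions=10)
print(result.histogram(key='syndrome'))  # 3 == 0b11: both checks fire,
# localizing the flip to the data qubit the two ancillas share.
```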
Bristlecone's 72-qubit architecture, which Google researchers are still refining, builds on smaller diagonal arrays of these qubits. "It's a new adventure every time we put together a chip," says Kissell, because of the unique materials used. Although he did not provide details about those materials, he shared some pictures of what the Santa Barbara-based system looks like.
Google has not said much about the software stack used to interface with its quantum simulator on Google Cloud or with the actual system in Santa Barbara, beyond noting that the two are not the same. Kissell said the team was impressed by efforts like ProjectQ, even though Google has taken its own path.
As a side note, for those interested in the idea of the 50-qubit limit for classical simulation and in fault tolerance on quantum systems in general, we recommend the excellent paper by John Preskill at Caltech.