
Progress in Algorithms Makes Small, Noisy Quantum Computers Viable

Summary: Work by Los Alamos and others shows that hybrid quantum/classical algorithms can accommodate limited qubit counts and the lack of error correction for real-world tasks

Original author and publication date: Los Alamos National Laboratory – August 13, 2021

Futurizonte Editor’s Note: Small quantum computers are viable. Does this mean that the artificial brain is also viable?

From the article:

As reported in a new article in Nature Reviews Physics, instead of waiting for fully mature quantum computers to emerge, Los Alamos National Laboratory and other leading institutions have developed hybrid classical/quantum algorithms to extract the most performance—and potentially quantum advantage—from today’s noisy, error-prone hardware. Known as variational quantum algorithms, they use the quantum boxes to manipulate quantum systems while shifting much of the workload to classical computers, letting them do what they currently do best: solve optimization problems.

“Quantum computers have the promise to outperform classical computers for certain tasks, but on currently available quantum hardware they can’t run long algorithms. They have too much noise as they interact with the environment, which corrupts the information being processed,” said Marco Cerezo, a physicist specializing in quantum computing, quantum machine learning, and quantum information at Los Alamos and a lead author of the paper.

“With variational quantum algorithms, we get the best of both worlds. We can harness the power of quantum computers for tasks that classical computers can’t do easily, then use classical computers to complement the computational power of quantum devices.”

Current noisy, intermediate-scale quantum computers have between 50 and 100 qubits, lose their “quantumness” quickly, and lack error correction, which requires more qubits. Since the late 1990s, however, theoreticians have been developing algorithms designed to run on an idealized large, error-corrected, fault-tolerant quantum computer.

“We can’t implement these algorithms yet because they give nonsense results or they require too many qubits. So people realized we needed an approach that adapts to the constraints of the hardware we have—an optimization problem,” said Patrick Coles, a theoretical physicist developing algorithms at Los Alamos and the senior lead author of the paper.

“We found we could turn all the problems of interest into optimization problems, potentially with quantum advantage, meaning the quantum computer beats a classical computer at the task,” Coles said.

Those problems include simulations for material science and quantum chemistry, factoring numbers, big-data analysis, and virtually every application that has been proposed for quantum computers.

The algorithms are called variational because the optimization process varies the algorithm on the fly, much as in machine learning. It adjusts parameters and logic gates to minimize a cost function, a mathematical expression that measures how well the algorithm has performed the task. The problem is solved when the cost function reaches its lowest possible value.
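To make that loop concrete, here is a minimal sketch in Python, under illustrative assumptions that are not drawn from the paper: a single qubit prepared by one parameterized rotation gate, the “quantum” evaluation emulated with NumPy, and SciPy’s gradient-free COBYLA routine standing in for the classical optimizer. The cost function is the expectation value of the Pauli-Z operator, which reaches its lowest possible value of -1 when the qubit has been rotated into the |1> state.

import numpy as np
from scipy.optimize import minimize

ZERO = np.array([1.0, 0.0], dtype=complex)               # initial state |0>
PAULI_Z = np.array([[1.0, 0.0], [0.0, -1.0]], dtype=complex)

def ry(theta):
    # Parameterized rotation gate RY(theta): the "logic gate" whose
    # parameter the classical optimizer varies.
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def cost(params):
    # Cost function: the expectation value <Z> of the prepared state.
    # On a real device this number would be estimated from repeated,
    # noisy measurements; here it is computed exactly.
    state = ry(params[0]) @ ZERO
    return float(np.real(state.conj() @ PAULI_Z @ state))

# The classical optimizer varies the parameter until the cost bottoms out.
result = minimize(cost, x0=[0.1], method="COBYLA")
print(f"theta = {result.x[0]:.3f}, cost = {result.fun:.3f}")
# Expect |theta| close to pi and a cost close to -1, meaning the qubit
# has been steered into |1>.

On actual noisy hardware, only the cost evaluation would change—it would be estimated from repeated measurements of the device—while the classical optimization step would run unchanged. That division of labor is the hybrid structure the researchers describe.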

READ the full article here