Google's Willow chip is a big leap towards usable quantum computing, but its claim of beating a classical computer by a 'septillion years' is meaningless

Quantum computers might seem like a work of science fiction but they really do exist. While they're still a long way from being used for the tasks they're best suited to, Google's quantum computing research lab has published results on its latest chip that show the future isn't as distant as you'd think.

The quantum chip in question is called Willow (via Ars Technica) and Google likes to use some rather creative copy to describe its capabilities: "Willow performed a standard benchmark computation in under five minutes that would take one of today’s fastest supercomputers 10 septillion years — a number that vastly exceeds the age of the Universe."

That does sound mightily impressive, but the benchmark in question (random circuit sampling) is essentially designed to showcase quantum computers in the best possible light.

Even so, hyperbole aside, what Google has done with Willow is impressive, and it all comes down to something called quantum error correction (QEC). This is a collection of techniques used to overcome quantum computing's problems with errors, and it's generally thought that QEC will be the tool that lifts quantum computing out of the lab and into everyday reality.

The paper detailing Google's work is understandably very complex, but the overall gist is that the researchers at Google Quantum AI discovered that adding more qubits to Willow actually reduced its error rate, rather than increasing it.

This was achieved by grouping more physical qubits (the quantum computing equivalent of a digital bit) into logical qubits, making it much easier to detect, and then correct, any errors.
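
To get a feel for the principle, here's a minimal Python sketch. It uses a classical repetition code with a majority vote as a stand-in; Willow actually uses the far more sophisticated surface code, and the logical_error_rate helper below is purely illustrative:

```python
import random

def logical_error_rate(physical_bits: int, flip_prob: float, trials: int = 100_000) -> float:
    """Estimate how often a majority vote fails to recover the logical bit."""
    failures = 0
    for _ in range(trials):
        # Each physical bit independently flips with probability flip_prob
        flips = sum(random.random() < flip_prob for _ in range(physical_bits))
        if flips > physical_bits // 2:  # majority corrupted: decoding fails
            failures += 1
    return failures / trials

# More physical bits per logical bit means a lower logical error rate,
# as long as the physical error rate is below the code's threshold.
for n in (1, 3, 5, 7, 9):
    print(f"{n} physical bits: logical error ~{logical_error_rate(n, 0.05):.4%}")
```

The same trade-off drives real QEC: the redundancy costs qubits, but each extra layer makes an uncorrectable error less likely.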

At face value, the numbers achieved don't seem particularly impressive (105 physical qubits in total, with a logical error rate of 0.143% per cycle), but this is why Google used the 'septillion years' statement mentioned above: to try and put those figures into some kind of context.
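
The more meaningful result, which Google calls operating 'below threshold', is that the logical error rate dropped by roughly a factor of two each time the error-correcting code was scaled up a step (from distance 3 to 5 to 7). As a rough, purely illustrative projection, if that suppression factor were to hold at larger sizes:

```python
# Illustrative projection only: assumes the roughly 2x error suppression
# per code-distance step that Google reported continues at larger sizes.
suppression = 2.0      # assumed suppression factor per step
error_rate = 0.00143   # the distance-7 per-cycle error rate quoted above

for distance in (7, 9, 11, 13):
    print(f"distance {distance}: ~{error_rate:.4%} per cycle")
    error_rate /= suppression
```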

Another impressive achievement was the longevity of the logical qubits, with the quantum information being retained for up to an hour.

Due to the way quantum computers work, an error rate of zero and an indefinitely sustained qubit are both impossible, but neither is required for a practical, fully functional quantum computer.

The error rate just needs to be low enough that a mistake is very unlikely to occur during a calculation cycle, and the qubits need to survive without loss for the duration of the computation.
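
A quick back-of-the-envelope calculation shows why. Treating the 0.143% figure above as the chance of a logical error per cycle, and assuming (purely for illustration) that errors strike independently, the odds of a computation getting through N cycles untouched are (1 - p)^N:

```python
# Back-of-the-envelope sketch, assuming independent per-cycle errors.
p = 0.00143  # per-cycle logical error rate quoted above

for n_cycles in (100, 1_000, 10_000):
    survival = (1 - p) ** n_cycles
    print(f"{n_cycles:>6} cycles: {survival:.1%} chance of no error")
```

At today's rates, a few thousand cycles make an error all but certain, which is why the error rate needs to keep falling as these machines scale.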

All of that is still some way off being realised, but Google's work has shown that it's certainly achievable. Now it will be a matter of continuing to scale what has already been demonstrated with Willow, increasing both the qubit count and the qubits' longevity.

From there, logic gates will be next and, ultimately, this will lead to a design for a genuine QEC processor, one that can tackle problems traditional supercomputers have to grind away at: AI, complex simulations, and data analysis in particular.

Just don't expect a quantum computer at home, to make you a cup of Earl Grey tea, any time soon.

Nick Evanson
Hardware Writer

Nick, gaming, and computers all first met in 1981, with the love affair starting on a Sinclair ZX81 in kit form and a book on ZX Basic. He ended up becoming a physics and IT teacher, but by the late 1990s decided it was time to cut his teeth writing for a long-defunct UK tech site. He went on to do the same at MadOnion, helping to write the help files for 3DMark and PCMark. After a short stint working at Beyond3D.com, Nick joined Futuremark (MadOnion rebranded) full-time, as editor-in-chief for its gaming and hardware section, YouGamers. After the site shut down, he became an engineering and computing lecturer for many years, but missed the writing bug. Cue four years at TechSpot.com and over 100 long articles on anything and everything. He freely admits to being far too obsessed with GPUs and open world grindy RPGs, but who isn't these days?