Why Jensen Huang and Mark Zuckerberg are Very Wrong About Quantum Computing Timelines
May 30, 2025
There is a lot of discussion about when quantum computers might become commercially viable. Today, there are more than 80 quantum computer makers, using multiple different schemes to create their qubits, yet no one has created a machine that can do something we can’t already do on a classical computer. Despite that, there is an overwhelming amount of hype, and false claims about the capabilities of quantum computers confuse the market. While I certainly don’t want to contribute to that hype, I have a strong belief that “useful” quantum computers are nearer than many experts predict, including Jensen Huang, who was quoted earlier this year as saying “very useful quantum computers” are 15-30 years away, and Mark Zuckerberg, who reiterated that they are “decades plus out.” This post explains why they are both wrong.
To get a better intuitive sense for when quantum computers might help us in our daily lives, it’s useful to go back in time a bit. I got my first “personal computer,” an Apple IIc, when I was a senior in college, so that I could more readily write my senior thesis. During that period, most college work needed to be typed in order to be submitted, and the main tool at the time was a mechanical or electric typewriter. For those too young to remember typing, it was a tedious task, and if you made an error, the methods for error correction were cumbersome (e.g., White-out, erasable paper, mechanisms that lifted the incorrect letter or word off the paper). Even though that first computer had no hard drive, only 128K of memory, and a meager 1MHz processor, it had a basic word processing program, so I was able to type my thesis, proofread it, run a rudimentary spell-checker, print it, and hand in a highly professional document. No email, no Internet, no mouse interface, no graphics, no nonsense.
[Image: an Apple IIc (left) and a typewriter (right), circa 1984]
If the barometer of a useful computer at the time had been any of those other capabilities, one would’ve been extremely underwhelmed, but the ability to create a 100+ page thesis without any typos was an amazing achievement. Clearly, at least to me, that represented “computer advantage” over the typewriter. So “advantage” is a fluid thing that may mean different things to different people at different times.
So, we need to parse a few things to level-set the quantum computing utility debate. Huang emphasized “very useful” as his target, which is a bit vague. I like the term “quantum advantage,” which suggests the point when a quantum computer can add some incremental value to a problem that can’t be achieved by classical computers alone. Even a modest improvement in performance, measured by, say, better weather forecasts, more timely train scheduling, a longer-lasting or faster-charging battery, or a more effective drug molecule, could have lasting and important implications for beneficiaries of the technology. These are all types of “combinatorial” problems that quantum computers are good at. Notice I did not mention “using Shor’s algorithm to break 2048-bit RSA encryption” or other aspirational goals of quantum computing, because we don’t need that to occur in order to gain commercial advantage. I also didn’t suggest it should be the solving of some esoteric sample algorithm. Let’s just keep it very simple and say that quantum computers will become “very useful” when they can solve a commercial problem, any real-world problem, better than a classical computer can.
Now don’t get me wrong, I have massive respect for both Jensen Huang and Mark Zuckerberg for their leadership, ingenuity, and ability to see the future in ways most others cannot. Jensen’s Nvidia has a $3.3 trillion (yes trillion, with a “t”) market capitalization, making it the second most valuable company on the planet. Meta’s market cap is a not-too-shabby $1.6 trillion, which ranks it as the 7th largest. Combined, the nearly $5 trillion in value created under the leadership of these two entrepreneurs is bigger than the GDP of every country except the US and China. However, with all due respect, they have the quantum computing timeline wrong.
Quantum Computing Performance Levers
Now, let’s take a look at the quantum computing landscape from a 30,000-foot view and see if we can tease out some clues about when a quantum computer might become “very useful.” For those of you following quantum computing or beginning your journey toward understanding its potential and its challenges, the number of different factors influencing the power and timing of commercial quantum benefit can be quite overwhelming. [Reader note: There is an appendix at the end of this post that defines some of the referenced terms, for those new to quantum computing.] The graphic below highlights some of the many moving pieces behind “Quantum Volume,” which is but one metric attempting to depict the relative power of a particular quantum computer.
Certainly the “number of qubits” in the top left portion of the graphic is an important feature, but so are the others…and there are additional factors not shown, such as whether the qubits have all-to-all connectivity or just “nearest neighbor” connectivity, whether a given quantum computer can run Clifford as well as non-Clifford gates, etc. However, the purpose of this post is not to explain all the many variables, but rather to simplify things so that we can appreciate what the timeline for commercial quantum advantage might reasonably look like. So, from that 30,000-foot vantage point, we can see three primary levers for improving the performance of a given quantum computer:

1. # of Qubits
2. Gate Performance
3. Error Mitigation
Each of these is positively related to overall performance; in other words, an increase in any one of them improves the quantum computer’s capabilities. Let’s drill down a bit and describe how each of these levers amplifies quantum computing power:
# of Qubits: This one is straightforward: generally, the more qubits a given computer has, the more powerful it is. However, there is no magic number of qubits after which advantage is achieved. Qubits come in a variety of flavors, each with tradeoffs among performance characteristics. There is also a difference between a “physical” qubit and a “logical” qubit. It is generally understood that once there are 50 or more logical qubits, classical computers can no longer emulate the results (meaning that beyond 50 logical qubits, a quantum computer should be able to achieve results that cannot be achieved on a classical computer; the sketch below shows why). In late 2024, Microsoft and Atom Computing announced a quantum machine featuring 24 logical qubits, and this year, Quantinuum has stated it will achieve 50 logical qubits. As recently as early 2023 there were essentially zero logical qubits, so in a very short time this metric has climbed close to the point of commercial utility. And remember, this is just the first of three primary levers.
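To see why roughly 50 qubits is the commonly cited cutoff, here is a back-of-the-envelope sketch in Python (a rough illustration, not anyone’s production simulator). It assumes brute-force state-vector simulation at 16 bytes per amplitude; real simulators use clever tricks that shift the exact cutoff, but not the exponential wall:

```python
# Back-of-the-envelope: memory needed to brute-force-simulate n qubits classically.
# Assumes a full state vector at 16 bytes per amplitude (complex128).

BYTES_PER_AMPLITUDE = 16  # one complex128 number

def state_vector_bytes(n_qubits: int) -> int:
    """A register of n qubits requires 2**n complex amplitudes."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (24, 30, 40, 50):
    gib = state_vector_bytes(n) / 2**30
    print(f"{n} qubits: {gib:,.2f} GiB")

# 24 qubits: 0.25 GiB           -- easy on a laptop
# 30 qubits: 16.00 GiB          -- a beefy workstation
# 40 qubits: 16,384.00 GiB      -- a large supercomputer's RAM
# 50 qubits: 16,777,216.00 GiB  -- ~16 PiB, beyond any classical machine
```

Doubling the memory for every additional qubit is what makes classical emulation hopeless somewhere around the 50-qubit mark.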
Gate Performance: This one is a bit more nuanced, but it boils down to the ability of a quantum computer to carry out commands, or “gates”. The more gates a machine can execute, and the deeper the circuits it can run, the more computing power it has. To grossly oversimplify, two core components drive this: 2-qubit fidelity and coherence time. The more accurately a command, or gate, executes on a pair of qubits, the more powerful the underlying quantum program or algorithm. And the longer the qubits remain in a useful quantum state, the more operations they can run (a toy model below shows how quickly these effects compound). When IBM introduced its IBM-Q20 system about five years ago, its 2-qubit fidelities were 95.7%, which sounds high but is too error-prone to do much useful computation. Five years later, with their Heron platform, 2-qubit fidelities improved to 99.8+%, roughly a twenty-fold reduction in errors. And IBM is not unique in this sort of fidelity performance, with Quantinuum recently achieving 99.914%, Oxford Ionics hitting 99.97%, and IQM at 99.91%. So the elusive goal of “3-nines of fidelity,” or 99.9+%, has been achieved by a growing number of quantum computing companies. As for coherence time, referred to as T1 or T2, IBM has enjoyed performance improvements on this metric as well, increasing it 3.45-fold over the past 5 years. So improved fidelities AND improved coherence times combine to vastly improve the performance of quantum computers.
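To make those fidelity numbers concrete, here is a toy model (my own simplification, assuming each 2-qubit gate fails independently, which real hardware does not strictly obey):

```python
# Toy model: if each 2-qubit gate succeeds independently with probability f,
# a circuit containing g such gates succeeds with probability roughly f**g.
# Real errors are correlated and more complex, but the compounding is real.

def circuit_success(fidelity: float, num_gates: int) -> float:
    """Probability an entire circuit runs without a single gate error."""
    return fidelity ** num_gates

for f in (0.957, 0.998, 0.999):
    p = circuit_success(f, 100)
    print(f"2-qubit fidelity {f:.3%}: a 100-gate circuit succeeds {p:.1%} of the time")

# 95.700%: succeeds  1.2% of the time  (the IBM-Q20 era)
# 99.800%: succeeds 81.9% of the time  (the Heron era)
# 99.900%: succeeds 90.5% of the time  ("3-nines")
```

The point: an improvement that looks like a few tenths of a percent on paper is the difference between a circuit that almost never finishes correctly and one that usually does.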
Error Mitigation: Another nuanced driver of performance, this comes in two broad categories: Error Suppression (optimizing the circuit to minimize the potential for errors) and Error Correction (finding and fixing errors as they occur). Companies like Classiq and Q-CTRL (more on Q-CTRL below) operate software platforms that optimize quantum algorithms for a given quantum computer, and various other providers offer differing kinds of error correction, such as the Shor Code (which uses nine physical qubits to encode one logical qubit, correcting for both bit-flip and phase-flip errors), the Steane Code (which uses 7 physical qubits per logical qubit), and the Surface Code (which uses a topological method on a 2D lattice of physical qubits); a rough comparison of their overheads appears below. There are also new and emerging protocols such as qLDPC (quantum low-density parity-check) codes, Color Codes, and high-dimensional qudits, each of which holds the potential to improve error correction. Error mitigation is a crucial lever that can amplify and accelerate the performance of quantum computers, and these methodologies are rapidly improving.
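For a rough sense of the overhead these codes impose, here is a small Python sketch. The Shor and Steane ratios follow from the code definitions above; the surface-code count uses the common ~2d² approximation for a distance-d patch, and the function name and distances chosen are mine, for illustration only:

```python
# Physical-qubit cost per logical qubit for the codes mentioned above.
# Shor and Steane ratios are fixed by the code definitions; the surface-code
# count uses the common ~2*d**2 approximation for a distance-d patch
# (data qubits plus measurement qubits; exact layouts vary by implementation).

def surface_code_physical_qubits(distance: int) -> int:
    """Approximate physical qubits in one distance-d surface-code patch."""
    return 2 * distance ** 2

overheads = {
    "Shor code": 9,
    "Steane code": 7,
    "Surface code (d=5)": surface_code_physical_qubits(5),    # 50
    "Surface code (d=11)": surface_code_physical_qubits(11),  # 242
}

for code, n_physical in overheads.items():
    print(f"{code}: ~{n_physical} physical qubits per logical qubit")
```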
The key takeaway here is that there are three basic levers that can be manipulated to improve the results of a given quantum computer. A few years ago, the general consensus was that a practical quantum computer would require 1,000,000 physical qubits to yield 1,000 logical qubits (a 1,000:1 ratio). But a LOT has advanced since then, and continues to advance at an ever-accelerating rate. With increasing numbers of physical qubits, increased qubit fidelities, and improved error mitigation strategies, it feels like we are tantalizingly close to having commercially useful quantum computers.
In fact, just this week Q-CTRL released details on a program it undertook with Network Rail and the UK Department for Transport, utilizing its Fire Opal performance optimization platform and IBM’s hardware to solve rail scheduling problems of unprecedented scale. The rail industry faces a variety of challenges, including network design, train scheduling, maintenance, and crew management, among other constraints. The study enabled the team to increase the solvable real-world problem size six-fold and, by Q-CTRL’s estimate, pulls the timeline to practical quantum advantage forward by up to three years, to 2028. See details of the study here.
So, with quantum computers likely to meet or exceed 50 logical qubits this year, and with ever-improving gate performance and error mitigation tools, that 2028 target (i.e., a 3-year horizon) noted by Q-CTRL certainly feels within reach. I’m not generally a betting man, but if I had to take the over/under on Huang and Zuckerberg’s roughly 20-year target, I’d say the Under is the safe bet.
Appendix
Quantum Advantage: When a quantum computer solves a practical problem faster than any classical computer, demonstrating superior computational power.
Physical Qubit: The basic hardware unit, akin to a Bit in classical computing, that stores quantum information and can leverage quantum properties such as superposition and entanglement.
Logical Qubit: A robust, error-protected virtualization of a qubit created by combining multiple physical qubits, reducing vulnerability to individual errors.
RSA Encryption: A widely used security method where data is encrypted with a public key and decrypted with a private key, relying on the difficulty of factoring the product of two large prime numbers. Nearly all secure communications and encrypted web traffic rely on this protocol.
Gate: A fundamental operation in quantum computing that manipulates qubit states, analogous to logic gates in classical computers.
Clifford Gate: A type of quantum gate (e.g., Hadamard, CNOT) that can be efficiently simulated on classical computers and forms the backbone of error correction.
Non-Clifford Gate: A gate (e.g., T-gate) critical for quantum advantage, enabling complex operations that classical computers struggle to simulate.
2-Qubit Fidelity: A measure (e.g., 99.8%) of how accurately a two-qubit operation (like CNOT) performs without errors. It is generally thought that 99.9+% is required for useful quantum circuits.
T1: The characteristic time for a qubit to lose its energy and relax from the excited state back to the ground state (i.e., a bit flip from a 1 to a 0).
T2: The time a qubit retains its quantum coherence (phase information) before environmental noise disrupts it (i.e., a phase flip).
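For readers who want the T1 and T2 definitions above in equation form, here is a minimal sketch of the standard exponential-decay model; the function names and device times are purely illustrative, not measurements from any specific machine:

```python
import math

# Standard exponential-decay model behind the T1 and T2 definitions above.

def t1_survival(t_us: float, t1_us: float) -> float:
    """Probability a qubit prepared in |1> has not yet relaxed to |0> after t microseconds."""
    return math.exp(-t_us / t1_us)

def t2_coherence(t_us: float, t2_us: float) -> float:
    """Fraction of phase coherence remaining after t microseconds."""
    return math.exp(-t_us / t2_us)

# A hypothetical qubit with T1 = 300 us and T2 = 150 us, probed after a 50 us circuit:
print(f"T1 survival:  {t1_survival(50, 300):.1%}")   # ~84.6%
print(f"T2 coherence: {t2_coherence(50, 150):.1%}")  # ~71.7%
```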
References
“Quantum Error Correction,” quantum.microsoft.com, accessed May 29, 2025
“How to develop the Quantum Error Correction Stack for every qubit,” www.riverlane.com/blog, accessed May 29, 2025
“Streamlined Quantum Error Correction,” Physics 18, s3, January 9, 2025
“Quantinuum extends its significant lead in quantum computing, achieving historic milestones for hardware fidelity and Quantum Volume,” www.quantinuum.com, April 16, 2024
“Oxford Ionics Sets New Records for Qubit Gate Fidelities,” QuantumComputingReport.com, July 12, 2024
Wack, Andrew, and McKay, David, “Updating how we measure quantum quality and speed,” www.ibm.com, November 20, 2023
Certain metrics and other details were researched with the assistance of Perplexity AI