Dismantling the quantum computing hype
When approaching the topic of quantum computing for the first time, one is confronted with numerous misunderstandings and false narratives, including here on LinkedIn, which give the inexperienced layperson a completely wrong impression of the subject. After four years of intensive study of quantum computing, I would like to clear up the worst of these misconceptions.
In quantum computing, we distinguish between quantum annealing, which is based on adiabatic quantum computing, and quantum gate computing. While the quantum annealer is designed to solve only a very specific class of problems (quadratic unconstrained binary optimization), the quantum gate computer is a universal computer that can, in principle, run any algorithm a classical computer can run. Without going into further details, I think quantum annealing is no longer considered promising even by its most enthusiastic users. In my opinion, D-Wave (maker of the world's first commercially available quantum computer) has also reached a dead end and is now changing horses. In the following, we will therefore restrict the discussion to the universal and much more interesting quantum gate computer.
The quantum computer is a very fast device (false)
In fact, the quantum computer (QC) is a very slow computer. A future, fault-tolerant QC will, optimistically, have an I/O bandwidth about 10,000 times smaller than that of an existing classical chip. Its operation throughput will even be 10 to 11 orders of magnitude smaller than that of today's classical chips (T. Hoefler, T. Häner, M. Troyer: Disentangling Hype from Practicality: On Realistically Achieving Quantum Advantage). This definitively rules out applications on mass data (weather data, exponentially large matrices, machine learning, etc.), real-time applications (vehicle routing), and quantum algorithms that offer only a polynomial speedup (e.g., Grover search).
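To get a feeling for these orders of magnitude, here is a minimal back-of-envelope sketch in Python. The operation rates are illustrative assumptions derived from the roughly ten orders of magnitude quoted above, not measured values:

```python
# Back-of-envelope crossover for a quadratic (Grover-type) speedup.
# Both rates are illustrative assumptions, not measurements.
CLASSICAL_OPS_PER_SEC = 1e13  # assumed classical chip, ~10 Tera-ops/s
QUANTUM_OPS_PER_SEC = 1e3     # assumed logical rate, 10 orders of magnitude less

# Unstructured search: classically ~N steps, quantumly ~sqrt(N) steps.
# Break-even: N / c = sqrt(N) / q  =>  N = (c / q) ** 2
crossover_n = (CLASSICAL_OPS_PER_SEC / QUANTUM_OPS_PER_SEC) ** 2
seconds = crossover_n / CLASSICAL_OPS_PER_SEC
print(f"break-even at N ~ {crossover_n:.0e} search items")
print(f"runtime at break-even: {seconds:.0e} s (~{seconds / 86400:.0f} days)")
# => break-even at N ~ 1e+20 and a runtime of ~116 days: a quadratic
#    speedup only pays off for searches that already take months.
```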
The quantum computer can store and process a huge amount of data (false)
Just as the QC cannot process huge amounts of data due to its inherent slowness (see above), it cannot store them either. The fact that a QC with n qubits can be brought into a quantum state that corresponds to a superposition of 2^n classical states, and whose description requires just as many complex numbers, does not mean that one also has access to this "information". If you try to extract it, all you get is an n-bit string and the destruction of the quantum state (collapse of the wave function). If a 1000-qubit QC could actually store the information of 2^1000 bits, it would inevitably collapse into a black hole due to the Bekenstein bound. That would be a pity for the quantum computer, but it won't happen.
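This can be made concrete in a few lines of NumPy. The state below is a random example; the point is that its description needs 2^n complex numbers, while a measurement returns only n bits:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10
dim = 2 ** n  # the state description requires 2^n complex amplitudes ...

# A random, normalized n-qubit state
psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
psi /= np.linalg.norm(psi)

# ... but a measurement yields a single n-bit string, sampled with
# probability |amplitude|^2, and destroys the superposition.
outcome = rng.choice(dim, p=np.abs(psi) ** 2)
print(f"description size: {dim} complex numbers")
print(f"measurement yields: {outcome:0{n}b} ({n} bits)")
```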
The quantum computer can try out exponentially many possibilities at the same time and therefore find a solution much faster than a classical computer (false)
That would be nice, but it is the misconception of misconceptions. In order to make one of exponentially many possible outcomes measurable in the end, its exponentially small probability amplitude must be amplified accordingly, and this generally requires an exponential number of gate operations on the quantum state. Shor's quantum algorithm for factoring integers does not try out all possible factors, but combines classical number-theoretic results with the quantum Fourier transform (QFT). Shor's crucial insight in 1994 was that the discrete Fourier transform can be evaluated in polynomial time on a quantum computer and thus admits an exponential quantum speedup.
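The division of labor in Shor's algorithm can be illustrated with a small sketch. Only the order-finding step, replaced here by classical brute force, runs on the quantum computer via the QFT; everything before and after is classical number theory. The values N = 15 and a = 7 are just a toy example:

```python
from math import gcd

N, a = 15, 7  # toy instance: factor N using base a

# Order finding: smallest r > 0 with a^r = 1 (mod N). This is the only
# step the quantum computer accelerates (via the QFT, in polynomial
# time); here we brute-force it classically.
r = 1
while pow(a, r, N) != 1:
    r += 1

assert r % 2 == 0, "odd order: retry with another base a"
x = pow(a, r // 2, N)
assert x != N - 1, "trivial root: retry with another base a"

# Classical post-processing: gcd(a^(r/2) +/- 1, N) yields the factors.
p, q = gcd(x - 1, N), gcd(x + 1, N)
print(f"order r = {r}, factors: {p} * {q} = {N}")
```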
There will be numerous possible applications for the quantum computer (rather not)
Whereas the problems practically solvable on a classical computer lie in P (polynomial time), those solvable on a quantum computer lie in BQP (bounded-error quantum polynomial time). The figure below shows the conjectured relationship between BQP and the complexity classes P, NP, and PSPACE. The essential point is that NP is believed not to be contained in BQP. If even one NP-complete problem were in BQP, all of NP would be in BQP, since every NP problem is efficiently reducible to any NP-complete problem. Should such a quantum algorithm be found after more than 30 years of searching, it would indeed be a world sensation, comparable to finding a proof that P = NP. However, there is not the slightest reason for this hope.
But it is also the case that the vast majority of the problems in NP that people actually care about, including those of industrial interest, are NP-complete! This includes all the combinatorial optimization problems such as scheduling (logistics), the knapsack problem (finance, portfolios), the Hamiltonian cycle (traveling salesman), etc. As far as we know today, none of these problems can be solved efficiently on a quantum computer!
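To make the combinatorial explosion concrete, here is a toy brute-force knapsack solver (the item data is made up). The search space doubles with every additional item; a quantum computer could at best Grover-search the 2^n candidates in roughly 2^(n/2) steps, which the slowness discussed above eats up:

```python
from itertools import product

# Toy 0/1 knapsack instance (illustrative data)
values = [60, 100, 120, 40, 70]
weights = [10, 20, 30, 15, 25]
capacity = 50
n = len(values)

best_value, best_choice = 0, None
for choice in product((0, 1), repeat=n):  # all 2^n item subsets
    w = sum(wi for wi, c in zip(weights, choice) if c)
    if w <= capacity:
        v = sum(vi for vi, c in zip(values, choice) if c)
        if v > best_value:
            best_value, best_choice = v, choice

print(f"searched {2 ** n} subsets, best value {best_value}, items {best_choice}")
```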
Here is a quote from Scott Aaronson: "Claims that we know how to get near term speedups for optimization, machine learning, etc. are > 95% bullshit." Scott Aaronson is one of the world's foremost experts on the capabilities and limits of quantum computers and on computational complexity theory in general. He is a Professor of Computer Science at the University of Texas at Austin, Director of its Quantum Information Center, and author of the widely read blog Shtetl-Optimized. And 95% is an absolute lower bound, as he says!
If you want to know whether a future quantum computer could help you solve a difficult problem, the right question is not how to translate the problem into a corresponding quantum algorithm, but whether one of the few known, efficient quantum algorithmic primitives can contribute to its solution. A very good and comprehensive overview can be found in "Quantum algorithms: A survey of applications and end-to-end complexities". If one takes into account all the caveats and limitations presented there (essentially the read-in and read-out effort and the required resources), almost every quantum advantage breaks down, apart from quantum simulation of compact problems.
Matthias Troyer, Professor of Computational Physics at ETH Zurich for almost 20 years and now Technical Fellow and Corporate Vice President at Microsoft, also appeals to the community to finally focus on the essentials: "Therefore, I'm calling on this community to double down on learning about and researching applications related to computational chemistry and material science."
More than about 50 qubits can no longer be efficiently simulated on a classical computer (it depends)
A few years ago, it was expected that the first quantum advantages would be achieved with quantum computers of more than 100 qubits. We are now far beyond that. At the end of 2023, IBM launched a quantum processor with 1,121 superconducting qubits. And in June 2023, researchers from IBM and UC Berkeley published the paper "Evidence for the utility of quantum computing before fault tolerance". The researchers believed they had found a use case for quantum computing on a noisy 127-qubit processor with about 3,000 gates, using classical error mitigation. However, this claim was refuted within two weeks by three independent papers, all of which showed that the problem can in fact be simulated classically, faster and with greater accuracy.
In fact, 1,000 qubits can easily be simulated on a laptop (using tensor networks and matrix product states) as long as they are not too entangled, which essentially means that the quantum circuit is not too deep. And this is precisely the situation with the current, non-error-corrected quantum computers of the NISQ era (Noisy Intermediate-Scale Quantum).
For the experienced reader, I can recommend the Jupyter notebook "Understanding efficient simulation of NISQ quantum computers with matrix product states". From its summary: simulations using tensor networks and matrix product states clarify the line between easy and hard. And it is hard to imagine that shallow NISQ quantum computers can cross this line, especially since one always has the option of not computing with the full bond dimension, or even of lifting the tensor networks into two or three dimensions.
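The connection between circuit depth, entanglement, and simulability can also be demonstrated in plain NumPy. This is not an MPS implementation, just an illustrative sketch: the half-chain entanglement entropy S grows with circuit depth, and an MPS simulation would need a bond dimension of roughly 2^S:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_two_qubit_gate():
    """A Haar-random 4x4 unitary via QR decomposition."""
    m = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    q, r = np.linalg.qr(m)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def apply_gate(psi, gate, i, n):
    """Apply a two-qubit gate on qubits (i, i+1) of an n-qubit state."""
    psi = psi.reshape((2 ** i, 4, 2 ** (n - i - 2)))
    psi = np.einsum('ab,xbz->xaz', gate, psi)
    return psi.reshape(-1)

def half_chain_entropy(psi, n):
    """Von Neumann entanglement entropy across the middle cut, in bits."""
    s = np.linalg.svd(psi.reshape((2 ** (n // 2), -1)), compute_uv=False)
    p = s[s > 1e-12] ** 2
    return float(-(p * np.log2(p)).sum())

n = 12
psi = np.zeros(2 ** n, dtype=complex)
psi[0] = 1.0  # start in |00...0>

# Brickwork circuit: entanglement (and hence the required bond
# dimension ~2^S) grows with depth until it saturates at n/2 bits.
for depth in range(1, 7):
    for i in range(depth % 2, n - 1, 2):
        psi = apply_gate(psi, random_two_qubit_gate(), i, n)
    print(f"depth {depth}: half-chain entropy = "
          f"{half_chain_entropy(psi, n):.2f} bits (max {n // 2})")
```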
In this context, one may well ask how much sense it makes to keep increasing the number of qubits without significantly improving their quality, since quality is what is needed to enter the regime of quantum error correction, which in turn is the prerequisite for a scalable quantum computer and thus for any relevant application. Unfortunately, we have seen no progress in the quantum volume of superconducting qubits since May 2022.
Quantum computing is disruptive and will change the world (not very likely)
Even if you read this in almost every presentation, it will probably not be the case. If a scalable quantum computer can be realized at all, it will not, for the reasons given above, find its way into every field of application we are told about. It will most likely be used for what Richard Feynman actually proposed it for in his famous keynote at the first Conference on the Physics of Computation at MIT's Endicott House in 1981: the simulation of quantum mechanical systems. And it will probably be used by only a small number of experts, namely those who already today understand Hartree-Fock and density functional methods and can apply them on a computer. It will certainly not be the railroad or bank employee, nor the everyday IT specialist.
You might ask yourself whether everything you read here is wrong and whether what you can read in so many other places is right after all. So many people simply can't be wrong. Yes, they can!
When it comes to hype, especially hype that is supposed to change the economy, someone in every organization (political, economic, scientific, pseudo-scientific) feels compelled to comment on the topic. But this is a topic that cannot be mastered in a few hours. It requires in-depth knowledge of physics, mathematics, and computer science that can rarely be assumed to the extent required. Even a trained physicist is not per se in a position to make a profound statement on the subject without first studying it in detail. A current example is Michio Kaku, theoretical physicist, string theorist, and author of numerous popular science books, most recently "Quantum Supremacy: How the Quantum Computer Revolution Will Change Everything" (don't buy it, don't read it!). Scott Aaronson on this: "So I can now state with confidence: beating out a crowded field, this is the worst book about quantum computing, for some definition of the word "about," that I've ever encountered. ... Kaku appears to have had zero prior engagement with quantum computing, and also to have consulted zero relevant experts who could've fixed his misconceptions."
Since not everyone has a PhD in theoretical physics like Michio Kaku, and even that is obviously not sufficient for making profound statements on the subject, it is clear that the majority of authors simply copy what others write. And so the misconceptions have become so widespread that even a conversation with ChatGPT on this topic is really no fun anymore!
So before you invest millions of euros or dollars in the subject in good faith, better consult one or more independent experts first. Consulting the Internet, or trusting that all the others can't be wrong, is not a good idea! You can make a fool of yourself if you announce that your route and traffic management will be carried out by a quantum computer ...
Questions, comments, and controversial perspectives are most welcome!
#quantumcomputing