Background of Quantum Computing

Introduction

I think quantum computing will be the next huge technological leap, much as AI is today. The concept, however, is older than many people realize. In this article, I trace the evolution of quantum computing from its origins to the present, sharing what I have gathered through my own research for anyone who wants an overview of how the field began.

Beginning of Quantum Computer Concept

Quantum computing can be counted among the major revolutionary technologies of today's computer science landscape. The concept, however, originated in the early 1980s with Richard Feynman's paper "Simulating Physics with Computers," published in 1982 in the International Journal of Theoretical Physics. Feynman [1] recognized that simulating quantum mechanical systems on classical computers cannot realistically be achieved. He explained that a full description of a quantum system containing R particles is a function ψ(x₁, x₂, ..., x_R, t), which gives the probability amplitude of finding the particles at positions x₁ to x_R at a given time t. Because of the massive number of variables involved in that calculation, simulating such a system on a classical computer with a number of elements merely proportional to R, or proportional to the space-time volume N, is not possible: the computational demand increases exponentially with the size of the system. Even moderately sized quantum systems require an overwhelming amount of resources to simulate accurately using traditional computing.
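To make the exponential blow-up concrete, here is a small sketch (my own illustration, not from Feynman's paper) of how much classical memory a full state-vector description of N two-state particles would need, assuming 16 bytes per complex amplitude:

```python
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to store all 2**n_qubits complex amplitudes."""
    return (2 ** n_qubits) * bytes_per_amplitude

# 10 particles fit easily; 50 already demand petabytes of RAM.
for n in (10, 30, 50):
    print(f"{n} qubits -> {statevector_bytes(n):,} bytes")
```

Every additional particle doubles the storage requirement, which is the exponential growth Feynman identified.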

Instead of trying to force classical computers to simulate quantum systems, Feynman proposed giving up the existing assumptions about what a computer is, and suggested letting the computer itself be built of quantum mechanical elements that behave according to quantum mechanical laws [1]. This can be considered the birth of the quantum computer concept: using quantum mechanics to compute quantum mechanics.

With his proposal of a quantum computer, the "universal quantum simulator" (1982), Feynman introduced a specific architecture capable of simulating any quantum physical system. His proposal was based on simple two-state systems: assume that every finite quantum mechanical system can be described exactly, imitated exactly, by imagining another system in which, at every point in space-time, there are only two fundamental states, the point being either occupied or unoccupied [1].

This idea suggests that any quantum system can be modeled using very simple building blocks in which each point in space-time is either occupied or unoccupied [1]. These two states are analogous to the 0 and 1 of classical computing, but in quantum computing they represent the basic units of quantum information, called qubits.

However, qubits can carry much more information thanks to quantum properties such as superposition and entanglement. This idea points the way toward building quantum computers from simple, scalable components.

Feynman also developed the mathematical framework using creation and annihilation operators, establishing the foundation for what would become quantum gate operations [1]. He further introduced the relation Sᵢ = Fᵢ(Sⱼ, Sₖ, ...) and described how the quantum state at any point in space-time depends only on the states of its immediate neighbors, not on distant points [1]. Modern quantum computers follow the same principle: quantum gates operate on just a few nearby qubits, and each gate's output state depends only on those neighboring qubits.
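As a hedged illustration of this locality principle (my own plain-Python sketch, not Feynman's formalism), the following applies a CNOT gate to a full state vector: although the vector has 2**n amplitudes, the update rule is determined entirely by two qubit indices, the control and its neighboring target, mirroring Sᵢ = Fᵢ(Sⱼ, Sₖ, ...):

```python
def apply_cnot(state, control, target):
    """Apply a CNOT gate to a state vector (a list of 2**n amplitudes).

    Basis-index bit `control` decides whether bit `target` is flipped;
    no other qubit influences the update.
    """
    new = list(state)
    for i in range(len(state)):
        if (i >> control) & 1:                 # control qubit is |1> here
            new[i ^ (1 << target)] = state[i]  # flip the target bit
    return new

# |01> (qubit 0 = 1, qubit 1 = 0) becomes |11> after CNOT(control=0, target=1)
print(apply_cnot([0, 1, 0, 0], control=0, target=1))  # -> [0, 0, 0, 1]
```

Note that the function's signature names only two qubits; this is the neighbor-dependence Feynman described, expressed in code.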

[Figure] How the quantum state at any point in space-time depends on the states of its immediate neighbors, not on distant points. [1]

Computational Representation for Quantum Computer Concepts

Three years after Feynman's seminal paper, David Deutsch provided the crucial theoretical formalization that transformed Feynman's ideas into a rigorous computational model, in "Quantum Theory, the Church-Turing Principle and the Universal Quantum Computer" (1985). In that paper Deutsch [2] emphasized that every finitely realizable physical system can be perfectly simulated by a universal model computing machine operating by finite means, which extended the classical Church-Turing principle to the quantum realm. According to the classical Church-Turing thesis, anything that is computable at all can be computed by a Turing machine, the basic model of a classical computer in computer science.

Deutsch provided a precise mathematical description of how quantum computers should be structured. In his view, similar to a Turing machine, a model quantum computer consists of two components, a finite processor and an infinite memory, although only a finite part of the memory is actually used during any computation [2]. This grounds the structure of a quantum computer in the fundamental computational theories of computer science.

In Deutsch's processor design, the processor consists of M two-state observables, {n̂ᵢ} (i ∈ ℤ_M) [2].

The processor therefore has M qubits, and each qubit can be in one of 2 states, |0⟩ or |1⟩. This is similar to a classical 0 or 1, but a qubit can also be in a superposition of both.
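As a minimal sketch (my own illustration, not Deutsch's notation), a single two-state observable can be modeled as a pair of amplitudes whose squared magnitudes give the measurement probabilities:

```python
import math

ZERO = (1.0, 0.0)                                # |0>
ONE = (0.0, 1.0)                                 # |1>
PLUS = (1 / math.sqrt(2), 1 / math.sqrt(2))      # equal superposition of |0> and |1>

def prob_of_one(qubit):
    """Probability of measuring |1>: squared magnitude of the second amplitude."""
    _, b = qubit
    return abs(b) ** 2

print(prob_of_one(ZERO))   # 0.0
print(prob_of_one(PLUS))   # ~0.5
```

Unlike a classical bit, the PLUS state only resolves to 0 or 1 when measured, with equal probability for each.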

Here ℤ_M is the set of integers from 0 to M−1, M is a fixed, finite number, and n̂ᵢ represents the i-th qubit in the processor [2].

So the qubits are labeled from 0 to M−1 and have a finite number of states. In the memory design, the memory consists of an infinite sequence {m̂ᵢ} (i ∈ ℤ) of two-state observables [2].

The unlimited memory made of qubits is like the infinite tape of a classical Turing machine, and each memory location m̂ᵢ is also a two-state qubit. The tape is infinitely long and must move during computation; Deutsch noted that a system that moves the tape by sending signals at a finite speed between neighboring segments would fulfill the 'finite means' condition and would be adequate for carrying out the required operations [2].

Modern quantum computers use the same principle Deutsch described, relying on local, finite interactions to achieve global quantum computation. Qubits are physically arranged so they can only directly interact with their immediate neighbors. When quantum information needs to travel across the quantum processor, instead of jumping instantly across long distances, it hops step by step from one qubit to the next through a chain of local operations [2]. This approach makes quantum computers scalable and practically buildable, because each qubit only needs a few physical connections to its neighbors, avoiding the rapidly growing number of direct connections that all-to-all coupling would require as the system gets larger.
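The step-by-step hopping can be sketched as follows (a hypothetical illustration of the idea, not code from [2]): a logical qubit travels along a line of physical qubits using only nearest-neighbor SWAP operations:

```python
def swap_route(layout, src, dst):
    """Move the qubit at index `src` to index `dst` via nearest-neighbor SWAPs.

    Returns the final layout and the list of local SWAPs performed.
    """
    layout = list(layout)
    step = 1 if dst > src else -1
    ops = []
    pos = src
    while pos != dst:
        layout[pos], layout[pos + step] = layout[pos + step], layout[pos]
        ops.append((pos, pos + step))   # each SWAP touches only two neighbors
        pos += step
    return layout, ops

layout, ops = swap_route(["q0", "q1", "q2", "q3"], 0, 3)
print(layout)  # ['q1', 'q2', 'q3', 'q0']
print(ops)     # [(0, 1), (1, 2), (2, 3)]
```

The cost grows only linearly with distance, which is why quantum compilers try to place frequently interacting qubits close together (a general design observation, not a claim from [2]).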


Industry-Wide Quantum Computing Development Timeline

Over the past decade, the quantum computing landscape has made remarkable progress. This improvement can be identified as the transition of quantum computing from experimental concepts to practical quantum systems, characterized by systematic improvements in qubit count, coherence times, error rates, and computational capabilities.

The modern era of publicly accessible quantum computing began in 2016, when IBM Quantum achieved a milestone by providing the world's first publicly accessible quantum computer. They achieved this by advancing both hardware and software and by opening access to quantum hardware via IBM Cloud® [5]. From then on, researchers were able to use open quantum computing features. This can be regarded as the beginning of the NISQ (Noisy Intermediate-Scale Quantum) era.

Following IBM's pioneering cloud access, the quantum computing ecosystem expanded rapidly. Companies such as IBM (2016), Rigetti Computing (2017), IonQ (2020), Honeywell (2020), Google (2020), Xanadu (2020), Oxford Quantum Circuits (OQC, 2021), PASQAL (2022), QuEra (2022), and Quandela (2022) have provided access to their quantum systems via cloud platforms [5]. As a result, research and development has accelerated across the field.

IBM developed its first quantum processor family, Canary, in 2017. The original r1 design in that series, introduced in January 2017, had 5 qubits and integrated resonators and qubits on a single lithography layer [5]. The r1.1 design, also released in 2017, expanded the processor's capacity to 16 qubits [5].

In 2019, Google first demonstrated 'quantum supremacy', the point at which a quantum computer performs a task that is practically impossible for a classical computer to complete in a reasonable amount of time. This is considered a key milestone for research areas such as experimental quantum error correction and demonstrations of entanglement [6]. In that demonstration, Google researchers used the Sycamore quantum processor with 53 qubits. It is worth noting, however, that apart from the number of qubits, the quantum properties of the qubits and the noise in the channel affect the correctness of the results [7].

Pan et al. developed the "Jiu Zhang" quantum computer with 76 photons. They executed Gaussian boson sampling (GBS) and showed that Jiu Zhang is 100 trillion times faster on this task than the TaihuLight supercomputer, which was ranked the world's most powerful computer from 2016 to 2017 [8].

In 2020, IBM developed the Falcon family, a medium-scale design that achieved a quantum volume (QV) of 128 [5]. Quantum volume serves as a key performance metric incorporating multiple factors, including qubit count, connectivity, and error rates, so the Falcon family represents a significant advance in computational capability.
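The bookkeeping behind a quantum volume figure can be sketched as follows (a simplified illustration; the real benchmark runs randomized square circuits and a heavy-output statistical test, which is omitted here):

```python
def quantum_volume(largest_passing_size: int) -> int:
    """QV = 2**n, where n is the largest circuit width (equal to its depth)
    that the processor runs successfully under the benchmark's criterion."""
    return 2 ** largest_passing_size

# Falcon's reported QV of 128 corresponds to passing 7-qubit, 7-layer circuits:
print(quantum_volume(7))  # 128
```

Because QV requires depth to match width, a processor with many qubits but high error rates can still score low: errors accumulate before the circuit reaches its full depth.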

Meanwhile, IBM had been developing the Hummingbird series since October 2019, aiming to support a large number of qubits (more than 50) on a single chip [5]. Separately, a team from the University of Science and Technology of China developed a 66-qubit superconducting processor called "Zuchongzhi". They ran random circuit sampling and estimated the task to be 2 to 3 orders of magnitude harder than the one performed on Google's 53-qubit Sycamore processor [7].

In 2021, IBM launched the 127-qubit chip named "Eagle". IBM then released the "Osprey" processor in November 2022, dramatically increasing the qubit count to 433 [5]. This is a good demonstration of the rapid scalability improvements in quantum computing.

The next year, 2023, IBM introduced the Condor processor, which has 1,121 superconducting qubits arranged in a honeycomb configuration and leverages IBM Quantum's cross-resonance gate technology [5]. In the honeycomb configuration, qubits are arranged geometrically in a hexagonal lattice pattern, similar to a honeycomb. This layout optimizes qubit connectivity and spacing while maintaining efficient control and readout across the large 1,121-qubit processor.

It is also worth mentioning that new techniques such as cross-resonance gate technology were developed to provide a reliable foundation for scaling up quantum processors. With cross-resonance gates, IBM developed a method for creating precise two-qubit quantum gates between superconducting qubits that operate at fixed frequencies. Its real advantages are its simplicity of implementation and its natural resistance to noise [5].

Analyzing researchers' current strategies for improving quantum computing shows a departure from the previous approach of rapidly multiplying qubit counts every year. Instead, they now prioritize enhanced error resistance over further qubit scaling. As evidence, IBM's Heron chip (2023), which has only 133 qubits, achieves a record-low error rate that is three times lower than that of its previous quantum processors [5].

This progression represents an unprecedented rate of technological advancement in the field of quantum computing and is currently a major area of active research among leading tech companies.

[Figure] IBM Quantum cloud performance metrics ("Avg" means average; "N/A" means not applicable). [5]

References

[1] R. P. Feynman, "Simulating Physics with Computers," International Journal of Theoretical Physics, vol. 21, no. 6-7, pp. 467–488, 1982.

[2] D. Deutsch, "Quantum Theory, the Church-Turing Principle and the Universal Quantum Computer," Proc. R. Soc. Lond. A Math. Phys. Sci., vol. 400, no. 1818, pp. 97–117, Jul. 1985, doi: 10.1098/rspa.1985.0070.

[3] L. M. Brown, “Paul A.M. Dirac’s the Principles of Quantum Mechanics,” Phys Perspect, vol. 8, no. 4, pp. 381–407, Dec. 2006, doi: 10.1007/s00016-006-0276-4.

[4] D. Deutsch and R. Jozsa, “Rapid Solution of Problems by Quantum Computation,” Proc R Soc Lond A Math Phys Sci, vol. 439, no. 1907, pp. 553–558, Dec. 1992, doi: 10.1098/rspa.1992.0167.

[5] M. Abu Ghanem, “IBM Quantum Computers: Evolution, Performance, and Future Directions,” Sep. 2024, doi: 10.1007/s11227-025-07047-7.
