Bit vs Qubit - The impact of Quantum Computing on the Future of Algorithmic Power
Part 1: How does a computer work? - Bit computation
Is everything about electricity?
Yes! Electricity 🔌 is the basic energy source for computers 💻. Without electricity, today's world would sink into chaos 💣!
At the heart of a computer is the microprocessor (also known as the Central Processing Unit, or CPU), a gigantic circuit made up of billions of tiny elements called transistors. Transistors?! Does that ring a bell? Yes, exactly! The term comes from electronics: a transistor is a semiconductor device used to control the intensity or voltage of an electric current, or to switch an electronic signal ⚡️.
In modern computers, a transistor measures just about ten nanometers, or 0.000001 cm! (And the future is promising.)
Logic gates are the building blocks of computer logic
In today's CPU architectures, transistors are the building blocks used to create logic gates and memory cells.
When we talk about logic here, it's not logic in the everyday sense. It's what we call Boolean logic: every value is either YES or NO, and it is built on 3 main operators: AND - OR - NOT.
There are also other logic gates (XOR, NOR, NAND, buffer), which I will not cover in this article.
(for more information: Basic Logic Gates - The Education University of Hong Kong, or Logic Gates - Boston University)
These two conditions are represented by the presence (about 5 volts 💥) or absence of an electric current, and are expressed as 1 or 0.
Based on this principle, we can perform any calculation we want by linking transistors together (a bit like a decision tree).
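To make this concrete, here is a minimal sketch in Python (my own illustration, not from the article) that models the three basic gates and chains them into new ones, just as transistors are chained on a chip:

```python
# Boolean logic gates modeled in Python: 1 = current present, 0 = absent
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# Chaining the three basic gates yields new gates, e.g. XOR (exclusive OR):
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b)))

# A "half adder" adds two bits: XOR gives the sum bit, AND gives the carry bit.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))  # (0, 1): in binary, 1 + 1 = 10
```

The half adder is exactly the kind of "decision tree" of gates mentioned above: wire enough of them together and you get a full arithmetic unit.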
The number of transistors in a computer is one factor that can influence its processing power
Binary numbers : the calculation
In the binary system, the only digits are 1 and 0, but... any number can be converted to binary, and the rules of calculation are the same as with decimals.
How do you convert a decimal number to binary? Simple! 📢
Let's take the number 74 as an example:
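The classic method is repeated division by 2, reading the remainders from last to first. A small Python sketch (my own illustration) applied to 74:

```python
def to_binary(n):
    """Convert a non-negative decimal integer to binary digits
    by repeated division by 2 (remainders read from last to first)."""
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits  # prepend each remainder
        n //= 2
    return bits or "0"

print(to_binary(74))      # 1001010  (64 + 8 + 2 = 74)
print(int("1001010", 2))  # 74 - and back to decimal
```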
For more information on binary numbers, read the following article: Computer Concepts and Terminology, Thomas E. Beach, Ph.D., University of New Mexico - Los Alamos
As for calculation, it's straightforward!
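Binary addition works exactly like decimal addition, just with carries at 2 instead of 10. A quick check in Python (illustrative, using its built-in `0b` binary literals):

```python
a = 0b0101  # 5 in binary
b = 0b0011  # 3 in binary

# 0101 + 0011 = 1000, with carries propagating just like in decimal
print(bin(a + b))  # 0b1000, i.e. 8
```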
If you think about it... a computer, a smartphone, or any other device that performs any kind of calculation (simply by running software) ultimately comes down to 1s and 0s!
And for the sake of completeness, it is important to know that the results of calculations can be stored - and then reused by the system - in what is known as memory.
In memory storage 💾, we talk about bytes, not bits: one byte holds 8 bits, which is why memory capacity is a multiple of 8! And don't forget that when we talk about broadband (download speed), we're back to talking in bits! So if you download a 50 MB (megabyte) file at 50 Mbps (megabits per second), it will take you 8 seconds!
Part 2: How does a quantum computer work? - Qubit computation
Qubit : the revolution
In conventional computing, a bit can only be in one of two states, 1 or 0, and only one at a time. In the world of quantum computing, a qubit is not limited to just two states: it can be in both simultaneously. This is called a "superposition" state, because a qubit can be zero and one at the same time, in any proportion of the two.
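Those "proportions" have a precise meaning: a qubit state is written α|0⟩ + β|1⟩, where |α|² and |β|² are the probabilities of measuring 0 or 1, and they must sum to 1. A minimal Python sketch (my own simplified illustration with real-valued amplitudes; real qubits use complex numbers) of an equal superposition:

```python
import math
import random

# Qubit state |psi> = alpha|0> + beta|1>, with alpha^2 + beta^2 = 1.
# Equal superposition: 50/50 chance of measuring 0 or 1.
alpha = 1 / math.sqrt(2)
beta  = 1 / math.sqrt(2)

p0, p1 = alpha**2, beta**2  # measurement probabilities
print(p0, p1)               # 0.5 each (within floating-point error)

# Measurement collapses the superposition: each shot yields a plain 0 or 1.
random.seed(0)
shots = [0 if random.random() < p0 else 1 for _ in range(10_000)]
print(sum(shots) / len(shots))  # close to 0.5
```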
To illustrate the qubit, we often use the state of a cat 🐱 (yes, a cat!). We owe this to the physicist Schrödinger, whose famous thought experiment highlighted a puzzling aspect of quantum theory: a cat that, until observed, can be dead 💀🙀, alive 😼, or both dead and alive at once 👻!
This principle of superposition and parallel calculations opens up possibilities that are still unimaginable in the IT and AI fields.
The qubit opens the way to much more powerful and faster processing, thanks to the parallel nature of quantum computation: in principle, many calculations can be explored at the same time 🚀!
Nowadays, everyone talks about artificial intelligence, but we mustn't forget that in a parametric algorithm - a neural network - each neuron is essentially a linear or logistic regression. (For more information, see my article: The revolution of AI projects integrated with blockchain technology)
Imagine the repercussions on AI algorithms if we had computers capable of performing all the calculations in parallel during the learning phases! At last we could really use the term artificial intelligence 🤖 in its true sense.
It's all very promising, but what are we waiting for 💫 ?
There are two current problems:
Where do we stand today ?
I've mentioned a few major advances above, but it's important to note that there have been many other incredible advances in this field in recent years, whether by specialist tech start-ups or spin-offs 🎓 (start-ups launched by university programs).
Although the market talks about artificial intelligence as revolutionary, most people misunderstand where the revolution lies. It's not the algorithms that will evolve, but the machines that run them. And that's what will revolutionize the world of tomorrow!