Coding (and our Modern Virtual World), explained:
0's and 1's are binary digits, or bits. Bits are analogous to the atoms in our physical world. A string of 0's and 1's is called binary code.
Eight bits are also known as a 'byte'. A 2.1-megabyte file on your computer is composed of 16,800,000 ones and zeros.
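That arithmetic can be checked in a couple of lines of Python (a quick sketch, using the decimal convention where 1 megabyte = 1,000,000 bytes):

```python
BITS_PER_BYTE = 8
megabytes = 2.1
bytes_in_file = megabytes * 1_000_000           # 2,100,000 bytes
bits_in_file = round(bytes_in_file * BITS_PER_BYTE)
print(f"{bits_in_file:,}")  # 16,800,000
```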
0's and 1's determine whether an electrical circuit is on or off.
Computers only understand electricity, so those who write code are, in the end, just toggling a series of on-or-off charges.
This works because those charges strung together represent logic.
Logic is a predictable series of facts or events. Computers use "logic gates", such as AND gates and OR gates, to route those electrical charges.
The device that regulates current or voltage flow and acts as a switch or gate for electronic signals is known as a transistor. A real-world example: every year, Apple adds more and more transistors to the A-series chips in its smartphones. This gives your iPhone more processing power and higher efficiency. The Apple A14 Bionic chip is packed with 11.8 billion transistors.
Let's take a logic gate that is composed of two circuits and is designed to turn a light on when electricity is passing through.
If the logic gate implemented is an AND gate, both circuits need to be closed for a light to go on.
If it is an OR gate, the light goes on if just one of the circuits is closed. This can be translated into an IF/THEN statement:
IF circuit 1 OR circuit 2 is closed, THEN light is on.
That kind of IF/THEN statement is known as an algorithm.
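The light example above can be sketched in a few lines of Python, modeling each circuit as a boolean (True means closed). This is a simplified illustration of the logic, not how gates are physically built:

```python
def and_gate(circuit_1_closed, circuit_2_closed):
    # AND gate: the light is on only when BOTH circuits are closed.
    return circuit_1_closed and circuit_2_closed

def or_gate(circuit_1_closed, circuit_2_closed):
    # OR gate: the light is on when AT LEAST ONE circuit is closed.
    return circuit_1_closed or circuit_2_closed

print(and_gate(True, False))  # False -- light stays off
print(or_gate(True, False))   # True  -- light turns on
```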
An algorithm is just a set of directions. As there are many different directions or paths to get to a certain destination, in coding, there can be many different algorithms to achieve the same result/solve a problem.
The goal is to find the most elegant, efficient algorithm.
Clean, beautiful code is code that doesn't repeat itself.
Computers running algorithms just do what we tell them to do but at a higher speed than we can, enabling us to be so much more efficient. So if our code is not as clean or efficient as possible, it will slow the computer down as it executes the code.
Writing code in binary, 0's and 1's, isn't efficient, scalable, or practical, so higher-abstraction languages were created, such as C++, C, C#, Ruby, Java, COBOL, and Fortran. These languages boil down to the same zeros and ones written in binary, but enable the programmer to write in more plain language.
In terms of biology, about 97% of the human body is made up of the elements carbon, hydrogen, nitrogen, oxygen, sulphur, and phosphorus. These elements work their way up, in order, to molecules, cells, tissues, organs, and organ systems, and thus you have an organism.
Software follows this same evolution, from binary up to plain language, without programmers needing to understand the binary and logic gates beneath. In the same way, we don't need to understand the subatomic processes our body performs in each moment to maintain homeostasis; we just live.
The first step in coding, for us to get from binary to our modern day coding languages, was:
1. Assembly languages
It was much easier to write than binary. Instead of saying 0-1-0-0-0-1-1-1, we write "ADD". The assembler program then translates "ADD" to the equivalent binary code.
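A toy illustration of that translation step: an assembler is, at its core, a lookup from human-readable mnemonics to binary instruction encodings. The opcodes below are made up for the example (real encodings depend on the CPU architecture), with "ADD" matching the 0-1-0-0-0-1-1-1 pattern from the text:

```python
# Hypothetical one-byte opcodes for a made-up CPU -- real
# encodings vary by architecture (x86, ARM, etc.).
OPCODES = {
    "ADD": "01000111",
    "SUB": "01001000",
    "LOAD": "01010001",
}

def assemble(mnemonic):
    """Translate a human-readable mnemonic into its binary opcode."""
    return OPCODES[mnemonic]

print(assemble("ADD"))  # 01000111
```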
Then the second step in coding came:
2. High-level languages
Just as spoken languages are different ways of expressing the same idea to other people, programming languages are different ways of expressing the same idea to computers.
High level languages allowed us to write code such as (Python):
p = "Hello World!"
print(p)
This example shows how, with Python, you can print the string "Hello World!" to the terminal in just two short lines of code. In an assembly language, this would have taken around 16 lines of more complicated code.
For computers to be used as frequently as they are now, inventors of the past had to make them much more user-friendly. They could not expect all users to write code on their computer to perform tasks such as turning it on or off, managing settings, surfing the web, or sending an email.
Our third big leap in the coding world:
3. The Graphical User Interface (or GUI)
This allowed you and me to sit down at a screen and interact with the virtual world with a mouse and keyboard. With the introduction of GUIs, we can now code without typing at all (or we can type code more easily, such as in a coding environment, ex. VS Code). A simple click of the send button on our email application will transmit our email to the addressee.
Finally, the biggest leap to date, the Internet:
4. The Internet
The Internet changed how we work, shop, hang out, are entertained, date, and even eat. It connected us to each other via a global network infrastructure: we can simply pick up our iPhones and, within three taps or gestures, FaceTime family and friends. The Internet connects us at lightning speed, transmitting data and information from one point of the world to the opposite in a matter of seconds to minutes.