We've Experienced A Programming Paradigm Shift
I'm in the kitchen at a house party, having polite conversation. A programmer friend enters the room, and much to the chagrin of bystanders, we begin a debate about which is better: Assembly language or Java.
DO NOT debate computer languages at parties
Normally I'm a rational person and wouldn't have debated the merits of programming languages, but I had just reviewed the TIOBE index of computer programming languages and was excited. Assembly language had rejoined the ranks of the top-ten most popular computer languages. I think this is very cool news. Maybe you don't, but maybe you should.
Look behind the data
News has three dimensions: the immediate facts, the backstory, and the big impact. Think about an apple falling to the ground. The immediate fact is that an apple fell off a tree (big deal). The backstory is that Sir Isaac Newton watched it fall. The big impact is that an apple helped formulate Newton's Theory of Gravity.
Assembly language ranked in the top ten list of computer languages is an immediate fact. Yep - it's there, coming in at number ten, right behind (1) Java, (2) C, (3) C++, (4) Python, (5) C#, (6) PHP, (7) JavaScript, (8) Visual Basic .NET and (9) Perl. A boring fact, until you consider the backstory.
Assembly Language is for speed and size
Adding two numbers in assembler looks like this:
CLC        ; clear the carry flag before adding
LDA $20    ; load the accumulator from address $20
ADC $22    ; add the value at address $22, plus carry
STA $24    ; store the sum at address $24
It's a difficult language to use. Assembler is cryptic and requires multiple lines for simple operations. It's not a language programmers use unless they absolutely have to. So what the heck? Why is it popular?
It's popular because it's really fast and small. Written correctly, it will always be the fastest program for any particular chip, and it will always use the least memory and power. Modern languages carry processing overhead: they manage memory automatically, dispatch method calls on objects, or run compiled bytecode on a virtual machine. That overhead costs memory and consumes power. But not assembly language.
Modern computers have plenty of physical room for gigabyte memory chips and large efficient power supplies. Modern computers have enough unused resources to be able to simulate worldwide weather patterns and still smoothly play cat videos. Modern computing shouldn't care about fast or small. But it suddenly does. Why?
The impact of assembly language
Two things are happening in our computing world: sub-micro computers and restricted power. Not surprisingly, these are two earmarks of the Internet of Things.
The Internet of Things promises computer-equipped refrigerators and thermostats. But think beyond that. Think about computers in tennis shoes. Or computers in coffee cups. How about a computer implanted in every ear of corn in a farmer's field? Even with Moore's law doubling transistor density roughly every two years, computers will still need to get much smaller. Plus, they'll have to get by on minuscule amounts of power. Before long, these devices will approach the bare minimum of what can be defined as a computer. What kind of language will be needed to program these super-tiny computers? The answer is assembly, and that's what makes this exciting!
TIOBE is evidence of a paradigm shift
Assembly language's return to popularity is tangible evidence of a new phase. It's evidence of the integration of computers with our culture. Smartphones made computing a pervasive element of our lives. Sub-micro computers will extend that integration to the point where computers become as common as bolts. The use of assembler is like a weather vane pointing to an approaching storm.
What leading indicators do you see in today's news? Which of today's headlines will be prophetic in ten years? What are your thoughts?
With all of that IoT talk, just ordered a bunch of ESP8266s on eBay. Gotta be awesome!
Thanks for sharing your thoughts Mark. However I must disagree: implying assembly is the language of the future due to the IoT trend is incorrect. The very nature of IoT requires that devices be able to communicate over the internet and perform operations accordingly. Network capability generally requires various layers of abstraction, because at lower levels it involves a multitude of protocols, precise timing, routing, various encryption schemes, etc. Nowadays, thanks to mobile and IoT hardware demand, you can purchase dedicated network interface devices that handle a lot of these requirements automatically. But you still need a controller to interface with such a device, and as a software engineer, why waste your time flipping bits in ASM? Even a language like C offers better tooling and abstraction at very minimal cost after compilation.
That's cool to see. Though I'd say it's a sign that more drivers are needed for the new IoT hardware. Same paradigm where assembly usage is linked to hardware growth/change...
C is really a general-purpose assembly language that's easily translatable to the assembler of a specific processor. Great for writing low-level code and device drivers. Lousy for business apps (use VB or, yes... COBOL) or scientific/engineering apps (use Fortran).
I remember spending weeks writing code in assembly language that would have taken someone else an afternoon to complete in C. Granted, my code was more efficient to run, making better use of memory and CPU cycles; but the trade-off is that it took forever to write. Maintaining assembly language is also rather difficult compared to modern Object Oriented (OO) programming. Whilst fondly nostalgic of my professor back in the day spouting the benefits of low-level code, I think it's safe to say that the masses will stick to higher-level programming paradigms, if only for the comparatively higher productivity yields per person.