3 things that are wrong with most programming languages

I mean, all languages are Turing equivalent (except HTML et al.), so nothing is fundamentally more wrong with one language than another. But this article looks at languages from a user-friendly perspective, especially beginner-friendliness and the ability to maintain the code, to keep the code maintainable. And this article is not just blaming a language for being unable to do the "right" thing, but for actually allowing the user to do the "wrong" thing. Really? How ruthless!

Before we discuss what's wrong, let's first see what's right.

Anyone who knows enough modern math is familiar with an amazing theorem: any math concept can be constructed using only "sets". Sets are like the atoms (or quarks) of math.

A set is an extremely simple idea: you put unique things together into a collection. Yet it's so extremely powerful that it can do "everything", because you can put a set into another set.

In some sense, we can say that the "programming language" of real math is as simple as sets. And math gives us a good example of what a good programming language should be like: 1) a few key concepts that can do a lot; 2) key concepts that are very intuitive to human minds. The Turing machine satisfies 1) but fails at 2): a piece of tape rolling back and forth that computes anything? Huh? That is why no one programs in the Turing machine's language, except Turing himself.

For example, you can construct natural numbers: 0={}, 1={{}}, 2={{},{{}}} ...

// {} is known as the empty set.
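This construction can be played with directly. Here's a minimal sketch in Python, using `frozenset` so that sets can be elements of other sets; the names `zero`, `successor`, and `to_int` are my own, not part of the article:

```python
# Von Neumann numerals: each number n is the set {0, 1, ..., n-1},
# exactly the 0 = {}, 1 = {{}}, 2 = {{}, {{}}} construction above.
def zero():
    return frozenset()      # 0 = {}

def successor(n):
    return n | {n}          # n+1 = n ∪ {n}

def to_int(n):
    return len(n)           # for these numerals, the size of n IS n

one = successor(zero())     # {{}}
two = successor(one)        # {{}, {{}}}
three = successor(two)
```

Something out of nothing: every number here is built purely from the empty set and nesting.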

Simply by nesting (empty) sets, mathematicians created something out of nothing. The ability to recurse gives sets infinite creative power. Put many atoms together and call the set a "molecule"; put many molecules together and call the set a "cell"; put many cells together and call the set a "biological body"; put many people together and call the set a "society"; put many planets together and call the set a "galaxy"; put many galaxies together and call the set a "universe". The most powerful property of the set structure is that you can never reach the end. You think the set known as the universe is the biggest set? No! You can put many universes together and call the set the "parallel universe" or "multiverse". You can always construct a bigger set! And the self-similar structure only requires you to understand one idea to understand everything from the small atom to the big universe. It's like a tree (it is a tree): you can put many trees together to construct a bigger tree, and you only need to understand one concept to understand any tree, because you just apply that concept recursively.

From this example, I list 3 features a good programming language should have:

  1. You can create a more complicated program by combining multiple smaller programs.
  2. There is no end: you can always construct a bigger program from existing programs.
  3. There is only one easily understood structure throughout. You just need to understand one concept and apply it recursively to understand / create anything.

//You might've noticed, these are all "tree" properties. Yes, that's why "tree" is so powerful.

Now let's discuss them individually:

1. You can create a more complicated program by combining multiple smaller programs.

For ancient languages like assembly, you don't have this luxury. All you can do is copy-paste code and move it around. Anyone with sufficient coding experience knows this is a stupid thing to do, because your code gets out of synchronization: changing one piece of code requires you to chase down all its copies and change every one of them manually.

But let's get modern. There are 3 modern programming paradigms. For procedural programming, the unit is the procedure (subroutine), and you can combine procedures. For functional programming, you can combine functions. For OOP, the unit is the object, or "class", and it's tricky to combine classes. There are 2 ways to do it: inherit multiple classes, or instantiate multiple classes inside the class. The problem is that only the latter is the right way, but many programmers reach for the former out of laziness (you just colon the class you want to inherit and magic gets done!). In many languages, multiple inheritance isn't allowed; in others, it has the "diamond problem" (google it). I'm not blaming single inheritance; I personally think it's great, because it simply bans the user from doing the wrong thing. What's wrong with an OOP language like C++ is that the language doesn't make this clear, and many people were even taught to use inheritance the wrong way. Inheritance also has some fundamental limitations: B and C cannot inherit the same instance of A; the compiler automatically creates two separate copies of A for B and C (unless you opt into C++'s virtual inheritance, which brings its own headaches). And you cannot inherit the same thing many times to create many copies of it: class Team : Man, Man, Man, Man doesn't work. The abuse, misuse, and misunderstanding of inheritance has created so much awkward code (google it). The number of programmers doubles every 5 years, which means half of the programmers in the market have less than 5 years of experience. If the right way isn't clear by the design of the language, and the language allows the user to do wrong things, the language is just waiting for more bad code.
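The Team example can be tried concretely. Here's a sketch in Python (the `Man`/`Team` classes are hypothetical stand-ins): inheriting the same base four times is rejected by the language outright, while composition, instantiating classes inside the class, works naturally:

```python
class Man:
    def __init__(self, name):
        self.name = name

# Inheritance: "Team is-a Man, four times over" is simply not allowed.
try:
    class Team(Man, Man, Man, Man):   # duplicate base class
        pass
except TypeError as e:
    print("inheritance failed:", e)

# Composition: "Team has-a Man, four times over" works fine.
class Team:
    def __init__(self, names):
        self.members = [Man(n) for n in names]

squad = Team(["Ann", "Bob", "Cal", "Dan"])
print(len(squad.members))  # 4
```

The composed version also sidesteps the "B and C cannot inherit the same instance of A" limitation: members are ordinary instances, so you can share, duplicate, or swap them freely.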

You face even more inconvenience when you "create a more complicated program by combining multiple smaller programs" at the whole-program level. How do you combine two .exe files to create a bigger .exe? How do you pull in other people's coding projects and use them as your library? Just copy their code to your hard drive? NO!!! That is the "copy-paste code around" problem all over again, because it's hard to keep your copy of the library up to date. I'm not saying there's no solution, but there's no easy solution intrinsic to the language. GitHub has a whole business running on this problem.

2. There is no end: you can always construct a bigger program from existing programs.

Static and the command-line terminal are the source of all evil. We're all taught that global is bad, but do you realize that static and singleton variables are just plain globals? We're all taught that global is bad, but why is it bad? I was taught that global is bad because a global can be masked by a local when there is a naming collision. But the biggest problem with global is that once you hit global, you cannot be more global. If you wrote all your code thinking there is only one universe, and Stephen Hawking then realizes there is a bigger multiverse containing multiple universes, you lose the ability to upgrade your code, because you already declared that there is a biggest scope: global. You may think the sky's the limit, but what if you really hit the sky sooner than you think? Why set a limit on how powerful your code can be? If Internet Explorer thought (I'm making this up) that one web program has only one web page and made the page a singleton, and the next day Firefox realized you can open many tabs in the same program, IE would need to rewrite the whole codebase to correct all that singleton usage just to catch up to Firefox.

How can you be more static than static? If you write your code with a static singleton Player and you need to upgrade your game to multiplayer mode, you're screwed. Once you write your code in main(), how do you combine and reuse your main()? Many think static variables are bad but static functions are okay. No. If your static function uses any static variable, you're risking unwanted "memory side effects" that link unrelated things. Also, you cannot override a static function, so you cannot keep multiple versions of it, and that makes code reuse difficult. Say you are using a static function from a third-party DLL. You upgraded the DLL for a must-have new feature, but the upgrade introduced a big bug in this static function. What do you do?
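The singleton-Player trap can be sketched in a few lines. This is a hypothetical example in Python (the `PlayerSingleton`, `Player`, and `Game` classes are mine, not from any real game): the singleton hard-codes "there is exactly one player" into the language-level plumbing, while the composed version makes the player count an ordinary decision you can change later:

```python
# The singleton version: a hidden global disguised as a class.
class PlayerSingleton:
    _instance = None
    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance.score = 0
        return cls._instance

p1 = PlayerSingleton()
p2 = PlayerSingleton()
# There can be only one; a second player is impossible by construction.
assert p1 is p2

# The upgrade-friendly version: the "scope" is an ordinary object you
# pass around, so a Game can hold as many Players as it likes.
class Player:
    def __init__(self, name):
        self.name = name
        self.score = 0

class Game:
    def __init__(self, names):
        self.players = [Player(n) for n in names]

match = Game(["alice", "bob"])   # multiplayer is just a longer list
```

Going from single-player to multiplayer in the second version is a one-argument change; in the first, it's a rewrite of every call site that quietly reached for the singleton.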

Many data-science / scripting languages like R and MATLAB come with a command-line terminal. That thing is pleasing, isn't it? You type a command, hit the "enter" key, and it gives you the result in real time, saving you the time to compile. But the problem is that the "command-line terminal" is your biggest scope. You cannot reuse the code you typed into the terminal, and you cannot modify it or keep versions of it. What you should really do is write your code in very small batches (at the unit of a function or class) and execute those frequently. You can combine functions to get a function, and combine classes to get a class, but you cannot combine terminals to get a terminal.

3. There is only one easily understood structure throughout. You just need to understand one concept and apply it recursively to understand / create anything.

Why are there so many different things? Static, variable, const, function, data, parameter, pointer, class. We should be programming using only the NAND gate!

No, I'm joking.

OOP gave us a great promise: everything is an object (class). Except it's a lie.

OOP programming languages have an inconsistent programming paradigm at the micro, meso, and macro levels.

  • At the micro scale, OOP is just procedural programming; these tiny procedural snippets make up a class. I'm not saying this is wrong, or that I have a better invention that solves the problem; I'm just pointing out the inconsistency.
  • At the meso scale, it is real OOP. Classes interact with each other by sending messages. But there are 3 ways to "send messages": 1) one object can call the public functions of another object; 2) one object can send a "message object" to another object's universal message receiver; 3) one object can sign up to listen to another object's events, which is really functional programming.
  • At the macro scale, there are those god-like static things. Many programmers use the static singleton pattern and static function calls. Even the start of any OOP program, the constructor, is intrinsically static. main() is itself static, and for main() to do anything, it needs constructors to create objects, which are also static. Static is certainly not natively OOP! The first "hello world" lesson of OOP isn't really OOP at all; it's just procedural programming inside main(), which gives many beginners the wrong impression of OOP from the start.
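The three meso-scale "message sending" styles can be put side by side. A sketch in Python, with hypothetical `Bell` classes of my own invention standing in for any object that receives messages:

```python
# 1) Direct call: one object calls another object's public method.
class Bell:
    def ring(self):
        return "ring!"

# 2) Message object: one generic receiver, dispatching on the message it is handed.
class MailboxBell:
    def receive(self, message):
        if message == "ring":
            return "ring!"
        return "unknown message"

# 3) Events: objects sign up as listeners and get called back later
#    (this is the functional-programming style the article mentions).
class EventBell:
    def __init__(self):
        self.listeners = []
    def subscribe(self, callback):
        self.listeners.append(callback)
    def ring(self):
        for callback in self.listeners:
            callback("ring!")

heard = []
bell = EventBell()
bell.subscribe(lambda sound: heard.append(sound))
bell.ring()
```

Three mechanisms, one stated idea ("send a message"), and a language typically makes you pick among them case by case: exactly the inconsistency being described.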

You see, OOP doesn't have a consistent programming paradigm. It claims everything is an object, but it allows all those non-object weirdos. It claims objects interact by sending messages, but it doesn't even make it clear how to send them.

int a = 5; int b = a; Now if I change a, does b change? No.

But should b change if a and b were class instances? Yes. And this is the inconsistency. "Oh, there are two types of variables: reference types and value types." 99% of high-level code behaves like reference types, but once in a while, when you get down to the basics, variables behave differently. How do you create two entangled ints such that one changes with the other? Now the user needs to learn some pointer trick. Inconsistency!
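The two behaviors side by side, sketched in Python (the `Box` class is a hypothetical one-slot wrapper of my own):

```python
# Plain ints copy by value: b keeps the old value.
a = 5
b = a
a = 6
assert b == 5          # b did not follow a

# To "entangle" two names you need an extra level of indirection,
# the pointer trick the article complains about. Here, a one-slot box:
class Box:
    def __init__(self, value):
        self.value = value

x = Box(5)
y = x                  # x and y now name the same object
x.value = 6
assert y.value == 6    # y changed with x
```

Same assignment syntax, two different semantics depending on what sits on the right-hand side: that is the inconsistency in a nutshell.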

In an OOP language, I can always use a subclass in place of its parent class (the greatest invention of polymorphism). But how do I subclass int? Say I want to create class LazyInt : int, whose value is lazily evaluated as late as possible, and use it in place of a deterministic int. I can't do that!
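In C++ or C#, built-in int indeed cannot be subclassed. To show that the wish itself is coherent, here's a sketch of a `LazyInt` in Python (a hypothetical class of my own), built by composition around a deferred computation rather than by inheriting from int; `__index__` lets it stand in wherever Python requires an integer:

```python
class LazyInt:
    """A lazily-evaluated integer: takes a zero-argument function and
    only computes it the first time the value is actually needed."""
    def __init__(self, thunk):
        self._thunk = thunk
        self._value = None          # not computed yet

    def _force(self):
        if self._value is None:
            self._value = self._thunk()
        return self._value

    def __index__(self):            # used for list indexing, range(), etc.
        return self._force()

    def __add__(self, other):
        return self._force() + int(other)

n = LazyInt(lambda: 40 + 2)         # nothing computed yet
assert n._value is None             # still lazy
assert n + 0 == 42                  # forced on first use
```

This is the pattern the article argues should be first-class: a wrapper that behaves like the thing it wraps. Languages that treat int as a sealed primitive force you to hand-write every such protocol method instead of just subclassing.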

I mean, why don't we just follow OOP's promise: everything is an object (class)? We could actually create a Turing-complete language using only classes, so why don't we? Oh, performance reasons... Really? I think modern computing speed is nearly infinite for everyday usage. C# set a good example here: if you really understand what you're doing and you're doing pointer magic for performance reasons, you declare your code unsafe explicitly; otherwise, the compiler refuses to compile. But C# thinks only pointers are unsafe, while I think all the things I mentioned above are unsafe and should be hidden from beginners / bad programmers.

================

Hi, I'm John Li. I've been programming for only 3 years, but I'm a scientist, trust me. I want to upgrade the world, more efficiently, in the way we program. I'm inventing a new language, "C<", that solves all these problems. It reads "C-less", and it makes programming code-less. With code, no matter what super language you invent (e.g., Julia), you're limited by the typewriter invented a hundred years ago. Let's go codeless: not only should the language be OOP, the programming of the language should also be OOP.
