From BASIC to Bots

In the early days of personal computing, there were many kinds of machines, and they all spoke BASIC: the ZX Spectrum, the Commodore 64, and the one I had, the Spectravideo 328. They were all hobby devices.

In 1981, IBM introduced the PC, and it came to dominate. We entered a homogeneous world: the same processor, the same operating system, and the same screen, typically 80×25 or 40×25 characters, in text modes like MDA (monochrome, 80×25 text only) or CGA (color, 40×25 or 80×25 text). While hobbyists continued tinkering, the professional world crystallized. We programmed in C, Turbo Pascal, and Fortran for business and engineering. Object‑oriented thinking was only beginning to enter the mainstream, and C++ was still finding its footing.

Programs were simple and, more importantly, isolated.

Slowly, things began to change. Graphics cards and color screens appeared. Microsoft Windows and IBM OS/2 followed, and complexity arrived with them. Computers were becoming more powerful, and we discovered that hacked‑together solutions did not scale. We needed methods that demanded upfront planning and deliberate design.

That need gave rise to heavyweight, sequential processes, and with them the hand‑off problem. A Business Requirements Document (BRD) was handed off to become a Functional Design Specification (FDS), which was handed off to become code, which was finally tested at the end. It was a relay race in which the baton was frequently dropped.

The promise of Object‑Oriented Analysis, Design, and Programming was continuity. The idea was that we could start with a model, progressively refine it into a design, add more detail, and ultimately arrive at code. A variety of object‑oriented methods emerged: Rumbaugh’s Object Modeling Technique, Peter Coad and Ed Yourdon’s Object‑Oriented Analysis, the Booch method, Jacobson’s OOSE, and Wirfs‑Brock’s Responsibility‑Driven Design. Each emphasized a different perspective on modeling and responsibility. Over time, people felt this fragmentation watered down the techniques, and the industry rallied around a unified notation and process. UML and the Rational Unified Process were born.

This was the first time we seriously believed that models could stand in for understanding. That belief will return later.

While we were busy managing development complexity, technical complexity exploded. Novell NetWare and client/server computing changed everything. Programs became larger, distributed, and more capable. To keep pace, we were given Rapid Application Development tools like Delphi and C++ Builder.

Then came the Internet. It did not merely add a new layer; it broke the isolated model entirely. Suddenly we had to master HTML, JavaScript, and CSS, while surviving the Browser Wars and making sense of web services.

It was during this period that a revolt occurred. We rejected Big Design Up Front, and with it the rigid BRD‑to‑FDS hand‑off culture. Scrum and Extreme Programming emerged, alongside other lightweight methods. Once again the fragmentation was felt to water down the techniques, and so the Agile Manifesto was signed in 2001, giving us four values, twelve principles and, eventually, the now‑famous subway map of practices (https://agilealliance.org/agile101/subway-map-to-agile-practices).

Agile was a model too. A model of how teams should work, offered in the hope that following it would substitute for engineering rigor.

The pace of innovation did not slow. The iPhone arrived, followed by smartphones in general, then tablets, watches, and goggles.

Next came the proliferation of languages and frameworks: Java, C#, Python, Go, Ruby, Swift, Rust, TypeScript. Full‑stack frameworks like Spring Boot and Django emerged, alongside front‑end libraries and frameworks like React, Angular, and Vue, each trying to manage the growing complexity. We were faced with a choice: native development for each platform, or cross‑platform tools like Flutter, React Native, and MAUI in pursuit of a single codebase.

Development became so complex that we responded by creating an explosion of roles. Business Analysts, Architects, Product Owners, Scrum Masters. Software development began to resemble movie production, with one crucial difference: nobody really knew the script. What to do, when to do it, and how to do it were continuously renegotiated.

New staff now arrive in the middle of this storm.

Senior practitioners have it easier, not because the problems were smaller, but because the abstractions we learned on were. We remember life before layers upon layers of frameworks, protocols, and tooling. We have a contextual roadmap. We know why things work the way they do because we watched them become that way. We have architectural literacy.

A junior today is handed a self‑driving car and expected to know how to fix the engine when it starts smoking. Most of them are doing remarkably well given the situation, certainly better than I would have done if I had started under the same conditions.

Now we add AI. First chatbots, then agents. This introduces another level of abstraction.

Here, the old belief returns: that a model can stand in for understanding. Except now the model is not a UML diagram or process framework; it is a trained neural network.

Developers enter the field and are told they may no longer need to know every syntax detail of a framework. But they urgently need analysis skills, design thinking, and a solid grasp of fundamental systems concepts. They must manage, guide, and negotiate with AI agents. They are expected to possess the kind of judgment and wisdom that traditionally takes decades to acquire, and to have it on day one.

If I had empathy before, I now have fear. Is this fair, moral, or even possible? I do not have an answer, but I believe this is the problem our industry now has to solve.

What do others in the industry think?

#SoftwareEngineering #AI #Architecture #SystemsThinking #Agile
