Different is good
A PDP-11/70. This isn't really about PDP-11s. Soz


Different is maybe good? (For business retention, if nothing else.)

IBM ruled mainframe computing till the 1990s.

Everyone knows that?

Well, except for ICL (British, later acquired by Fujitsu), whose systems, if not the company itself, were still in business in the 2000s. They had VME, the successor to George 3, inherited via ICT, which became ICL in the late 1960s. (The George 1, 2 and 3 operating systems were quite advanced for the 1960s.)

They decided to add support for UNIX software in 1985. (They were a bit strapped for cash at the time and it was the latest thing.)

There was more than one mainframe maker… at the time (the 1960s) the industry was known as IBM and the Seven Dwarfs: Burroughs, UNIVAC, NCR, CDC, Honeywell, RCA and GE.

All of which are gone now. DEC, Compaq and HP all merged into HP, which became little more than a maker of inkjet printers and generic PCs. (The HP electronics test equipment business renamed itself 'Agilent.')

Well, and the other one.

DEC (Digital Equipment Corporation) made computers from the 1960s until they were acquired by Compaq in 1998. They were a spinoff from the US Air Force's SAGE air defence project, in a manner of speaking. (SAGE was a failure, oddly enough. They went with the backup plan. But also: light guns.)

DEC started with the Programmed Data Processor (PDP, so named to get around pesky government procurement rules about buying 'computers'). The PDPs were hugely influential: UNIX (a free, alternative operating system from Bell Labs at AT&T) was written for the PDP-7 and then the PDP-11, which in turn gave us the C programming language (q.v.). In the 1990s, after AT&T decided to monetise Unix so it wasn't free any more, a Finnish student called Linus Torvalds wrote a free clone of it, called it Linux, and now it runs every cellphone that isn't an iPhone. The iOS on the phone is a rebadged NeXTStep (circa 1992), which is… a BSD Unix on a CMU Mach microkernel. (Apple have since reworked the kernel, and have spent decades rewriting everything, as much as anything to give their engineers something to play with. Oh, and to ensure it always gets bigger and slower… otherwise nobody would need to upgrade to the new device. Or they're stupid, you be the judge.)


DEC developed many, many operating systems, but mostly they are remembered more fondly for the PDPs and RSTS. (The later big PDPs were 36-bit systems because the earlier ones were 18-bit; 18 bits to have two more data bits than address bits, for instruction-encoding reasons. It seemed like a good idea at the time, back in the 60s. As did using octal.)


DEC, flush with success, developed the VAX family of computer systems, with the VMS operating system. It wasn't at all like the 36-bit PDPs: it was 32-bit, not 36 (32 bits to be like IBM: four eight-bit bytes). Customers went away. Apparently being told what you want isn't… what customers want.

(For all that, VAX/VMS was actually a good system, miles more reliable, secure and usable than anything on microcomputers, and Unix was regarded by VMS people as 'a dreadful little thing; the worst possible solution to every design decision.' (Microcomputer operating systems were a complete joke at the time: Windows would only crawl up to version 3.1 years later, and Apple didn't even make Macintoshes yet.))

There was, therefore, Unix for VAX, obviously.

DEC lost customers when they got rid of the 36-bit systems. The VAX systems were quite pricey, and Digital were notorious for their byzantine sales processes. (Aping IBM, who had picking the customer's pocket down to a fine art. IBM mainframe customers still pay for CPU-seconds on machines they own, but cannot leave, because IBM mainframe systems are so different to anything else.)

Again, why 36 bits? Because the one before was 18 bits: 16 like the address bus, plus two for tagging, as that seemed important at the time. And multiples of three bits worked well in octal. The PDP family certainly had groovy front panels. Much orange, such piano-switches, such wow.

A little bit about Unisys (Sperry and Burroughs merged.)

The Unisys mainframes had an 'alternative design' with hardware array-bounds checking, which was leveraged extensively by the operating system and every application program. Unisys mainframes therefore had an enviable reputation as supremely reliable and simple. The software still runs (under emulation) on commodity Intel x86_64 processors today. Unisys polled their customers in the 2000s, asking if they were interested in a version of the virtual machine without bounds checking, as it would be faster. Eyewitnesses reported Unisys stalwarts saying "not bloody likely, it catches every error!" Thanks to the evolution of C, from Unix and the PDP-11, in the mainstream we have unbounded arrays (causing most of the memory-corruption bugs experienced today as security flaws), and heavy use of dynamic heap-based structures, which are responsible for 'use after free' bugs; vide security flaws. The heap structures require pointer-walking, which runs poorly on modern, heavily cached and pipelined CPUs. (There are workarounds, but fundamentally that sort of software ran well on computers of 20 years ago, not today.)
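To make that concrete, here's a minimal C sketch (mine, not from the original Unisys material) of the two bug classes that hardware bounds and tag checking would trap, and that plain C happily lets through: an out-of-bounds write and a use-after-free. It compiles cleanly; run it under AddressSanitizer (cc -fsanitize=address) if you want the errors actually reported.

```c
/* Two classic C memory-safety bugs that hardware bounds/tag checking
 * (as on the Burroughs/Unisys machines) would trap, but plain C does not.
 * Deliberately buggy: build with -fsanitize=address to see the reports. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    /* 1. Out-of-bounds write: C happily scribbles past the end. */
    char name[8];
    strcpy(name, "far too long for eight bytes");   /* overruns 'name' */

    /* 2. Use-after-free: the pointer still "works", the data is gone. */
    char *note = malloc(32);
    if (!note) return 1;
    strcpy(note, "important");
    free(note);
    printf("%s\n", note);   /* reads freed memory: undefined behaviour */

    return 0;
}
```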


L3Harris, who these days only make expensive aerospace electronics, also make computers (which they use inside their own equipment), but they're very different to everyone else's. They're stack-based architectures with programmable microcode; in that respect very 1970s, like a Xerox Alto or Star. And they have, you guessed it, hardware array bounds (and stack bounds) checking, and, for the justifiably paranoid, the CPU will only start executing if the memory checksum of the ROM passes. (The hardware that tests that holds the CPU off the bus till it's sure.) Why are these Harris CPUs so serious? Well, they're the processors used in things like the Boeing 737 series of aircraft, and it's good that an aircraft CPU only runs if it's okay. They don't offer the sort of performance you get from anything normal running C programs, so mostly nobody's interested. And a C program can't use the hardware array bounds anyway, because the PDP-11 didn't have them, so it's not a thing. This might be the pinnacle of an idea called 'worse is better.'

'Worse is better' is, in fact, distilled sour grapes. The simplicity of Unix and C (which expect nothing from the hardware but linear memory and a hard drive) meant they were easily ported to new computers. In the early 80s, the first AI boom had just gone bust, and a number of smaller companies had new, personal-workstation-sized computers but no operating systems or software. They had mostly been intended to run LISP, but that turned out to be slower than Unix on the same hardware.

And suddenly, with processors like the Motorola 68000 available, 'workstation' computing was a thing. That meant UNIX, and possibly something useful like Computer-Aided Design; the workstations were powerful enough to do it.

The 68000 has a design flaw that makes running UNIX on it hard: some instructions can't be restarted after a memory fault. But that's what the 68010 is for, as they say at Motorola. (One bunch of nutters ran two 68000s out of phase to simulate being able to restart instructions on bus errors. It worked, so maybe that's not stupid.)


Once the industry had settled on 'CPU does computation, safety-rails unnecessary' as a model, CPU architects could go crazy in the pursuit of more speed. And so they did. (Xerox, as the saying goes, did everything first: GUIs, object-oriented programming, Ethernet and laser printers. Also no safety rails, and lots of crashing, in the Alto and Star.)

(Starting in the 1970s, 8-bit microprocessors made for a 'home computer' market, and those machines really mostly played games. And VisiCalc, which was important later.)

The IBM PC and the end of ecological variety.

The IBM PC came out in 1981, and it was widely regarded by computer pundits as 'crap.' But IBM marketed it as serious computing for serious office jobs. The 1977-vintage Apple II was making fast inroads into business because of VisiCalc. (There were CP/M business machines too, but they were physically bigger and more costly than the Apple II, which Apple produced in vast volumes at ever-diminishing cost.)

For a while, Apple even out-sold the IBM PC with the venerable Apple II.

[Having benchmarked it, in person: an Apple II is actually faster than a stock IBM PC. 6502s have a two-phase non-overlapping clock at 1MHz (or, as we call it in the trade, a 4MHz clock) and are more efficient with those clock cycles than an 8088 running at 4.77MHz. Intel memory bus cycles of that era take 4 clock cycles minimum, so that 4.77MHz quickly drops to an effective 1.19MHz of bus cycles, and the 6502 has all those zero-page instructions to go fast with. True story. Really did benchmark it.]

Apple had Steve Jobs and Steve Wozniak, but Wozniak suffered a closed head injury crashing his aeroplane. (Amateur pilots kill themselves; it's a thing. Steve merely isn't quite who he was any more. Oh, and he's a retiree these days.) Apple floundered about, milking the cash-cow Apple II till it was obsolete. (And Steve Jobs, and everyone else in computer company management, took a lot of recreational substances.) Apple made the Apple III, which was heavy and hugely unreliable for simple, avoidable reasons best summarised as 'your engineers don't know about thermal expansion.'

(Apple were not alone in that in America: the US Department of Energy, which does all things nuclear, shipped a nuclear fuel rod design with a very similar error for simply ages; the rods only warped, they didn't explode. It got fixed, so don't panic. But fundamentally, as a non-American engineer, there's an observable tendency in the USA for engineers to occasionally fight the laws of physics. (Except at Boeing, where they make planes that fight the pilots.))


The next Apple computer was the Lisa, a Steve Jobs pet project, which was basically Apple doing a Xerox Alto, and nobody bought it because it was expensive and didn't run Unix. So where would you get software? (It was named after a daughter of Steve Jobs's that, at one point, he denied being the parent of, because child support payments. Steve had also discovered a number of exciting loopholes to do with buying cars and not transferring ownership so he didn't get parking tickets. Later, as he grew richer, he just had some minion pay his parking tickets and ignored the inconsequential fines. Most rich people do this; it's just more efficient for them than wasting time going to a ticket machine.)

At the same time, there was a project at Apple to make a new 'Volkswagen' of a computer, called 'Macintosh', because the McIntosh is the most popular apple (the fruit) variety in the US, apparently.

Steve Jobs muscled into the project and bullied the engineers into making it a sealed, un-expandable box, with (in the next revision) proprietary connectors, and forced the specs down to a cost: losing, first, memory protection; second, RAM, down to 128k (it worked, but you really needed more); and third, the second floppy drive. The floppy itself was a proprietary drive using 3.5-inch disks, which were uncommon at the time. The original MacOS was therefore a microcomputer OS for a powerful processor with no hardware support for memory protection… and they cut every possible corner for quite impressive speed. (So many corners that you had to buy a Macintosh XL, which was really a rebadged Apple Lisa with a different operating system on it, to do software development for the Macintosh. Oh Apple, never change. They haven't. Though the Macintosh II that arrived a few years later was an open-ish design, it wasn't in the market long.) And to this day, you must buy a Macintosh computer from Apple to do iPhone development.


Two Flavours of Apple: Jobs and Wozniak.

Apple had made all that money out of the Apple II, which was an easy-open, easily expanded box. The lid literally comes open with what is basically jumbo Velcro. For a computer made in the 1970s, being able to have … megabytes of RAM, sound cards, better graphics, floppy and hard disks, wasn’t something Steve Wozniak imagined… but he designed a fairly future-proof, cheap way for expansion to be easier than CP/M machines, which were the contemporary competition.

That was one of the selling points, who'd have thought. You just put the expansion card in, and it works: no jumpers, hardly even any drivers needed. (There's an expansion ROM designed into the expansion card mechanism, and good documentation.)

Amusingly, the IBM PC's ISA slots have more in common with the S-100 bus of CP/M machines than with Apple's. On the other hand, PCI, which replaced ISA and begat PCIe, the modern bus of choice, takes a little after Wozniak's Apple II slots. Huh, who'd think that. (Steve Wozniak worked at HP and Atari, so he was inspired by experts.)


Back to Mac

The Macintosh, despite its price-tag and limitations, created the Desktop Publishing business (because Apple released better Macs with more RAM, 512k, then 1 megabyte) and a laser printer that worked with Macs, called the "LaserWriter". Done and dusted. You could easily do publishing for a book on a Macintosh Plus with a SCSI hard disk hanging off it. Graphic designers and tech writers flourished. (The laser printer was a Canon laser printer mechanism plus another 68000-based computer from Apple, sans floppy or hard disk.)

The first generation of Macintoshes, using Motorola 68000-series CPUs, ran out of performance gains, as Motorola didn't have the sort of chip fabrication Intel did; but then neither did anyone, and Intel was kicking everyone's butts with the 486 and Pentium. Apple invented the 'Apple Desktop Bus,' a four-wire serial bus for keyboards and mice, which was nothing like HP's HP-HIL, a four-wire serial bus for keyboards and mice. It did eventually lead to USB, a new and innovative four-wire serial bus for desktop peripherals, so let's just be thankful for something. (Just don't look at how it works, it's a bodge.) (SCSI, which had the high ground for peripherals in the late 80s and 90s, was an originally 8-bit parallel bus for peripherals that is nothing like Hewlett-Packard's HPIB, also known as GPIB, which is an 8-bit parallel bus for peripherals. Let alone that the industry standardised on copying the Centronics parallel printer interface for so long that Centronics lost its dominant position in the market long before 8-bit parallel busses to printers became obsolete. And that's why early PC laptops had Zip drives and Ethernet adapters that plugged into the 'printer port'; the later PC models had a bidirectional port a lot like, say, SCSI or HPIB.)


At around the same time Commodore made the Amiga and Atari the Atari ST, both with 68000 CPUs. Obviously, they couldn’t run Unix, because 68000 had that annoying design error. So Mac, Amiga and Atari ST all had operating systems without memory protection, so they were about as trustworthy as a Windows 3.11 PC. In other words, they crashed often. Programmers variously tried hard or not, and software crashed because there was no room for error, and no protection from errors or mutual incompatibilities. Firewalls between programs matter, it turned out. (As the Mainframe computing industry worked out decades earlier.)


But IBM had used an Intel CPU for the IBM PC. IBM made the PC/XT, which was an IBM PC with a hard disk, and then Intel made the (flawed but still faster) 80286, which IBM used to build the PC/AT. That was vastly faster than any of the 8-bit competition, and came standard with a hard disk drive. Serious computing had arrived at under $10,000 (US) per unit.

And the IBM PC, PC/XT and PC/AT, which didn't really have any custom chips in them, were cloned by other manufacturers, and Microsoft helpfully licensed the cloners copies of MS-DOS that didn't need the software IBM had locked up: the OS IBM was selling, PC-DOS, relied on ROMs whose copyright IBM owned. But DOS was a Microsoft product, and Bill Gates was smart enough to leave himself a loophole he could make money from.

This led to 'commodity PC computing,' where profit margins were shaved to the bone, and most mainstream PC manufacturers either went broke or shifted all manufacturing to OEMs in East Asia. The "Turbo XT" clones were cheap and nearly as performant as an IBM PC/AT.

On the up-side, every clone sold had a copy of MS-DOS, so Microsoft made bank. (And anyone selling a PC had a physical address, because computers were, at the time, suitcase-sized, so Microsoft Legal could attack with copyright infringement. Copyright law was a bastion of the big end of town, so Microsoft flourished. Let alone the massive network effects that pirate copies of MS-DOS had on Microsoft's market share.)


Sales of the '286 in AT clones were so good that Intel (who got paid for the chips no matter who sold the computers) had money to develop the 80386, a 'proper' 32-bit processor, as performant as most of the workstations made a few years before. (But, importantly, cheaper, because volume production means those huge one-off chip design costs can be amortised over more units. As could the phenomenally expensive chip foundries.) They still made it weird, though, because they liked badly copying IBM mainframe designs, and every time they did a new CPU architecture they'd made up themselves, it flopped (i860, i960, Itanium).

Compaq actually released the first 386 PC compatible computer before IBM released their 386 based PC. This marked the beginning of the end of IBM’s control of the evolution of the PC.

And mass-production of ‘386 clones drove ‘progress’ and Intel developed the 80486, which was nearly fast enough to be considered a ‘Proper engineering workstation.’

IBM released the "Personal System/2" with new, different keyboard connectors, a standard mouse connector, and, importantly, a really proprietary expansion bus, which required expansion card manufacturers to pay IBM licensing fees. As a percentage of the market, the PS/2 barely made a dent. (And for what can best be described as 'reasons,' PS/2 systems need special setup disks to cope with hardware being added. Apparently copying good ideas from Steve Wozniak is hard, or something.)

Companies like Compaq made '386 machines with PS/2 keyboard and mouse connectors and VGA-compatible graphics, which was all they took from the PS/2 design. IBM, obviously, sold very few keyboards, mice, or screens to the rest of the market. This marked the beginning of the "PC Compatible" industry doing its own thing, with a little help from Intel and Microsoft, releasing specifications like LIM in 1985 and, later, the 'PC 97' system design guide (that's why the sound jacks on tower systems were colour-coded), and then PC 98, PC 99, and so on. (LIM was Lotus-Intel-Microsoft. Spreadsheets were important.)

As everyone, bar rounding errors, by this point was running Microsoft Windows on their PC, Microsoft and Intel and to a lesser extent Compaq and company, decided on the design of all future parts of the ‘PC Compatible.’ Well, unless you did desktop publishing, in which case it was a Mac.


In a fatefully ill-starred choice, Apple decided on a new processor to replace the Motorola 68000 family, and went into partnership with IBM and Motorola on the "PowerPC" processor and the 'Common Hardware Reference Platform' (CHRP).

PowerPC (PPC) was, in theory, faster than Intel's 486 and Pentium. But the rivers of money Intel were getting from selling 486s let them build better chip fabrication, and so faster, less power-hungry chips with more transistors; PPC quickly got a reputation for running hot, and slower than the latest Intel chip.

Oh, and Apple couldn't port their MacOS to PPC entirely, so they emulated the rest, and it ran like a dog.

(Which led to a lot of the reputation for slowness.)

(For those who wonder, MacOS versions below 9, when ported to PC hardware, ran like crap. (It was a skunkworks project at Apple.) 'Mainstream' PC hardware was a lot slower than a Macintosh, mostly in the matter of graphics memory bandwidth; there basically wasn't any, bandwidth that is. (John Carmack, co-founder of id Software, the creators of 'DOOM,' later designed a very clever partial workaround for that terrible graphics-memory bandwidth.))

Apple kicked Steve Jobs out, and, in a bit of cartel-like machination when he wanted to make another computer company, agreed that 'NeXT Computer' wouldn't compete with Apple. Steve Jobs said "I'd rather have ten thousand gold customers than a million nickel ones." Cocaine is a hell of a drug; the maths Steve's talking about doesn't work. Mass production is where the money is. Design costs are the big bit with computers and software, even if commodity hardware is a race to the bottom for profit margins.


What happens NeXT?

So Steve Jobs founded NeXT Computer Inc. (pretentious capitalisation required). The first NeXT machines were basically… 68030-based engineering workstations, with Jobs-esque needlessly expensive magnesium alloy cases. (His new bike was magnesium alloy and… he didn't do compromise, he did coke, and bullying employees.) He hired a number of geniuses from Apple (hence the non-compete agreement), including, famously, someone NOT currently working for Apple, who had, in the past. (The Valley CEO cartel kept price-fixing salaries and restraining employees from free movement for at least another decade, until Google, Facebook and co started breaking with the cartel… a bit. If you do a billion dollars' worth of business on the work of one engineer, what is fair compensation for the work? Answers on a postcard, please. Assorted geniuses in software have made various big-name companies at least that much more competitive or cheaper to run. Strangely, CEOs and the management caste seem to have captured all the excess profits. It's like the engineers need organised labour or something.)

NeXTStep as an operating system ran on: Apple hardware (eventually), NeXT's own, actually relatively generic, Motorola 680x0 hardware, stock-standard PC-compatible x86 hardware, and later DEC Alpha, Sun's SPARC and HP's workstations. NeXT itself made a few more models with 68040 processors.

You read that right: Steve Jobs at NeXT actually had an IBM ThinkPad on his desk. (I can personally attest that NeXTStep version 4.0 runs great on an early ThinkPad, with proper TCP/IP Ethernet networking and sort-of Unix.)


This all happened before Windows 95. Seriously, in 1993 on PC hardware. But priced at numbers that nobody could afford.

NeXT had some software developed on it: notably DOOM, the first big first-person shooter, and the first World Wide Web browser and the first web server.

So, apart from AAA games, and the internet as we know it, it did hardly anything.

(And Lighthouse Design made a couple of really amazing drawing and writing programs that were acquired by Sun Microsystems and killed. Who knows why? Nobody.)

NeXT switched to a ‘pure software play’ later because hardware is not where the margins are.

Steve, obviously, priced NeXTStep as an operating system out of the market at over $6000 per user. For an operating system. On CD.

Cocaine, as I’ve said many times, is a hell of a drug. (The history of computing in the 80’s and 90’s makes sense viewed through the lens of ‘All executives made decisions under the influence of cocaine.’ Much like modern 2020’s billionaires and Ketamine.)


Apple, again.

IBM PC hardware was getting faster in the late 90s, but everything in software was basically last year's crap, faster. Oh, and "Multimedia," which was jerky playback of cut-scenes from CD-ROM.

Apple, with a succession of CEOs who didn't know anything about computers, had tried and failed to make new operating systems, partnering with lots of companies (IBM and HP, honestly!) to do it. (Doesn't make sense, but: drugs, and the management caste as leadership, appealing to shareholders who are also from the management caste.)

It wasn't till Apple bought Jobs's NeXT in the late 90s that they got an operating system to make real progress with. (They put a Macintosh badge on it and called it Mac OS X Server.) It's also the basis of both OS X and iOS. (And cynics would say neither really does anything NeXTStep didn't, except need more RAM and CPU.)

But at least Apple reinvented itself as hip and cool and… sold sealed boxes that stop working after a few years. Steve Jobs and Jony Ive sealed the first iPhone so you couldn't replace the battery, then moved on to sealing the battery into laptops. And… this is how you get the present day. (And several times the tonnage of e-waste per annum. Yay, progress. Yes, Steve loved him some sealed boxes.)


And with the advent of the Pentium, and more importantly the Pentium II processor from Intel in the late 90s, buyers of actual Unix engineering workstation computers had a choice: Unix-based systems, or PC compatible. The Pentium II PCs were 60% of the performance, for 25% of the cost. (Admittedly, that was a bleeding-edge twin-CPU Pentium II. Yeah, twin engines. Broom broom!)

That pretty much killed the Unix workstation market, and the name "workstation" was reused later for high-spec PC compatibles. (Which are, for a purist, just jumped-up IBM 5150 PCs with a lot of upgrades. If you're feeling generous, Compaq 386s on steroids.)

Of course, by the time the 486 came out, companies like Novell had made a market niche of 'servers' that weren't Unix systems, and that served... file and print to PCs on people's desks.

Where did the Unix go?

Well, AT&T decided to charge money for it, so people switched to Berkeley's version (the Berkeley Software Distribution, or BSD), till the Regents of the University of California wanted money for it too.

The only version that ran on PC hardware was XENIX, which ended up with SCO, who tried to extort money from everyone, till just after the sun ceases to burn, for infringing their copyrights.

Weirdly, the 286 was supposed to run XENIX, we're told. Nobody really did, because DOS was cheaper, or free, or bundled with the new PC. XENIX ran on 386s, but was a pretty crappy Unix, even by Unix standards, and expensive.

So PC servers usually ran Novell NetWare.


The 'PC server' market quickly grew (they were cheap and worked okay) till the introduction of Windows NT Server. That was essential if you wanted to use Microsoft's new Outlook and Exchange mail-and-calendaring system, which was the new hotness. (Novell's equivalent wasn't marketed as well, or as cool; it didn't do calendars! Lotus Notes made some headway, and caused many programmers much grief.) (Trying to program against Microsoft's Outlook turned out to be maddening, as many essential features, like scripted rules triggering on messages, just didn't work reliably. And still don't!)

Microsoft basically killed Novell with the introduction of Windows NT Server, and that led to brisk sales of server hardware, as admins discovered you couldn't have just ONE server; you needed about six, thanks to Microsoft's inability to write software that worked reliably. (Windows servers run great virtualised, or all alone on a developer's test system at Microsoft. They used remote procedure calls heavily, and until the later Windows NT 4 service packs, RPC was a bit intermittent; or intermittent if your network dropped or delayed packets, you know, like real networks. And needing a reboot to fix anything made physically splitting servers up into separate file, print, Exchange mail, and domain servers essential. Oh, and it turned out NT networks didn't scale past about 20-30 workstations without using VLANs. Whoops. They liked making broadcast packets, which gave the recently invented Ethernet switches the equivalent of a traumatic brain injury. It did, as the developer said, work great on my desk.) Anyway, bundling calendaring into Outlook and making it Exchange-only was a stroke of marketing genius. (Making it impossible to export mail out of Exchange Server to some other mail server was the cherry on top. Retain users, or die. And NT could use your Novell server for login authentication, so what anti-trust action, officer?)


Apple did a cell-phone, which was weird, and apart from a really good touchscreen, kinda pointless.

It was like their iPod music player, where they'd radically put a hard disk in a music player, and so had one with, well, boat-loads of room.

They used ARM processors, because the NeXT-derived OS was portable to different processors.

And a year or so later, they even had apps for it.

Their competition had, to be blunt, started from worse software than Apple did, and their executives did dumb things. Apple stuck a 30% fee on all music from their online store and quietly transitioned to making their money as a middle-man. And on the apps too, of course!


Apple finally abandoned PowerPC for Intel CPUs and nobody cared; the Mac was cool for its OS and industrial design.

They later went on to spend a pile of money designing their own processors (the A-series, and later the M-series), which are genuinely good and, drumroll please, designed with an eye to security. The phones and tablets use them, and eventually the Macs switched too.


And one last thing: UNIX?

It’s a play on MULTICS. The AT&T guys that wrote UNIX didn’t like the MULTICS project, which was government sponsored and tried to make an operating system that could run multiple users, working on data that should stay private, per defined groups. Or maybe even multilevel security.

Unix's take on group security was very token, and the original UNIX password security was intentionally easy to break; if you could work it out, you were a cool hacker, and deserved total control of the computer. And by intentionally easy: the original password hash was a software simulation of a World War Two rotor cipher machine (the US Army's M-209, a much simpler beast than even an Enigma), and stripped down at that.
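By way of a hedged illustration (this uses the much later DES-based crypt(3), not that original cipher-machine hash, and assumes a Linux-ish system with libcrypt): a minimal sketch of how little old-school Unix password hashing gives you. Traditional DES crypt ignores everything after the eighth character and salts with just two characters, which is a big part of why cracking old password files is a hobby rather than a feat.

```c
/* Sketch only: the classic DES-based crypt(3), the long-serving (and long
 * since inadequate) Unix password hash. Not the original M-209-style hash
 * described above; that predates it. Build on Linux with: cc demo.c -lcrypt */
#include <stdio.h>
#include <crypt.h>   /* char *crypt(const char *key, const char *salt); */

int main(void)
{
    /* Traditional DES crypt: 2-character salt, and only the first 8
     * characters of the password matter at all. */
    const char *h1 = crypt("correct horse battery staple", "ab");
    const char *h2 = crypt("correct h", "ab");   /* same first 8 chars */

    printf("%s\n%s\n", h1 ? h1 : "(failed)", h2 ? h2 : "(failed)");
    /* Both lines print the same hash: everything past "correct " is ignored. */
    return 0;
}
```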


UNIX shipped while MULTICS was still being wrangled into existence, and is still alive today. Sort of, because Linux ate it: Linux was cheaper (free is cheaper than any other number) and ran on commodity hardware. There are commercial Linuxes for those who want software support and security updates too. (Though free Linuxes get constant updates… which probably won't break everything.)

MULTICS, despite the slandering, got finished and did work. More importantly, some very important research into side-channels in computing got done in the 70s, and the US Government decided that physically separated computers were the only way to keep data safe.

The lessons learned there and in related places didn't make it out of the 'military-industrial complex,' and by the 2000s servers were running virtual machines that, it turned out, were all vulnerable to the 'Spectre' and 'Meltdown' bugs. Which were hardware side-channels, something that had never been seen before. Well, apart from in security engineering, but they're all weirdos.

And fixing Spectre took Intel … well, the latest CPUs don’t do it. Well, if you use these new instructions to flush caches religiously, they hardly do it.

This is why Windows 11 wants a newer processor: the last few generations of Intel's x86-64 chips do, finally, have protections that Intel are pretty sure mostly prevent data leaking from one VM to another. And Windows 11 uses VMs under the hood to put possibly-unsafe software in a separate VM.

The nice IBM mainframe operators do, indeed find that amusing. (Yes, they’re condescending about it, but they’re often condescending.)

Not that they let randoms run software on their mainframe.


Wait, What?

Well, because web pages have JavaScript, browsing the web is really just letting randos run software on your computer, with extra steps. (And with a terrible programming model, because HyperCard wasn't networked, but never mind.)

Oh, and PDF documents have JavaScript in them too, so reading a document is running code. (Yes, you can run Doom inside a PDF. And I checked when I wrote this, just to be sure. Yup, Doom in a PDF.)


And remember how we made everything since the PDP-11 do without the strict memory protections of those fuddy-duddy Unisys systems?

Well, the large majority of serious security bugs (Microsoft and Google have both put it around 70%) are memory-safety errors: out-of-bounds accesses or use-after-free. Most of the remainder are dataflow trust errors, i.e. trusting data that came from a user. (Uh… yes, some obsolete operating systems literally couldn't trust user data by mistake.)

So bugs, yeah, nothing new.

It's just that these days, any bug is a vulnerability that exploit developers will use to build what amounts to a programming language out of software you didn't think was programmable. Of course, that only matters if it's always connected to the internet.


(Or reading a 2D barcode, or being in range of a new Wi-Fi access point, or….) Because data + bugs = programming language, and safety was left behind decades ago.
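A contrived sketch of what "data + bugs = programming language" means in practice (all names here are hypothetical, and the memory layout trick is simplified): one unchecked copy, and bytes that arrived as data get to choose which function runs next.

```c
/* Contrived illustration of "data + bugs = programming language."
 * An unchecked copy of attacker-supplied bytes overflows a buffer and
 * overwrites the function pointer that sits just after it, so the "data"
 * ends up deciding what the program does next. Real exploits pull the
 * same trick on return addresses and vtables. */
#include <stdio.h>
#include <string.h>

static void intended_action(void) { puts("doing the boring, intended thing"); }
static void jackpot(void)         { puts("attacker-chosen code path reached"); }

struct session {
    char name[16];
    void (*on_done)(void);      /* lives in memory right after 'name' */
};

/* The bug: no bounds check on how much we copy into a 16-byte field. */
static void copy_name(char *dst, const unsigned char *src, size_t n)
{
    memcpy(dst, src, n);
}

int main(void)
{
    struct session s;
    s.on_done = intended_action;

    /* Pretend this arrived off the network: 16 filler bytes, followed by
     * a replacement pointer value smuggled in the overflowing tail. */
    unsigned char packet[16 + sizeof(void (*)(void))];
    memset(packet, 'A', 16);
    void (*smuggled)(void) = jackpot;
    memcpy(packet + 16, &smuggled, sizeof smuggled);

    copy_name(s.name, packet, sizeof packet);   /* overflows into on_done */

    s.on_done();    /* now calls jackpot(): behaviour chosen by the data */
    return 0;
}
```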


Surely there must be something?

What was adopted into the mainstream was page-based memory protection, where, in (say) 4-kilobyte chunks, your computer keeps track of whether this program right here should be able to read a page, write to it, or even (and Intel had to be dragged kicking and screaming to do this one) execute it as a program. That seems like a lot, but 'this came from a user, don't trust it' would be a very valuable thing to know too, and, embarrassingly, the kernel of every operating system in common use has access to everything. That, unfortunately, means the kernel has to trust itself, like a 1980s-vintage Macintosh or Amiga. It'll run right till it doesn't. A 'microkernel' or hypervisor could help there, but even Apple ditched the MACH microkernel design in pursuit of more speed. (Of course, in hindsight, they could have just got faster chips, but being slow is what killed the middle-era Macintosh on PowerPC processors, so Apple had a fixation. Well, Steve Jobs did, anyway.)
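For the curious, here's a small, hedged sketch (C on a POSIX-ish system; my example, not anything from the systems above) of what that page-granularity protection looks like from user space: ask the kernel for a page, write to it, then have the MMU enforce read-only.

```c
/* Page-granularity memory protection from user space (POSIX mmap/mprotect).
 * Illustrative sketch: map one page read/write, write to it, then mark it
 * read-only. Uncommenting the final write would get the process killed
 * with SIGSEGV; the MMU, not the program, enforces the rule. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/mman.h>

int main(void)
{
    long pagesize = sysconf(_SC_PAGESIZE);   /* typically 4096 on x86-64 */

    char *page = mmap(NULL, pagesize, PROT_READ | PROT_WRITE,
                      MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (page == MAP_FAILED) { perror("mmap"); return 1; }

    strcpy(page, "writable for now");

    /* Drop write permission: the hardware page tables now say read-only. */
    if (mprotect(page, pagesize, PROT_READ) != 0) { perror("mprotect"); return 1; }

    printf("page says: %s (pagesize %ld bytes)\n", page, pagesize);

    /* page[0] = 'X';   <- would now die with SIGSEGV, courtesy of the MMU */

    munmap(page, pagesize);
    return 0;
}
```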


Apple aren't alone in this. Windows NT 3.51 (the NT line actually started at '3.1,' numbered to match the Windows 3.1x everyone was already running, so it looked like an upgrade) kept the graphics subsystem outside the kernel, in its own protected process. (That included things like screensavers, i.e. running some random's screensaver code, and fonts, i.e. running the TrueType font code. Yeah, screensavers and fonts are effectively code.) NT 3.51 was quite reliable.

For more speed, Windows NT 4 put the graphics driver back into the kernel. That led to 20+ years of Windows graphics-related security bugs. Which is sad, as 3.51 ran okay on decent machines in 1998. (3.51 was actually too good, as it let people run full-screen DOS programs, like games or CAD software, without using anything from Microsoft at all. Yeah, hardware emulation. It totally worked. (I used NT to emulate bare-metal PC hardware in the late 90s and early 00s.) But that meant people were running software without Microsoft, so there was nothing locking them into Windows later. Microsoft had made a fortune on DOS behind IBM's back; they weren't letting people leave.)


So ah, yeah. Even as you read this, your computer is running code from whichever website hosted this document.

Is that the real website? Eh, maybe not. DNS and certificate security is a mess. Thanks, Google. (They make it impossible to distinguish third parties with Google Trust Services (GTS) signed certificates from legit Google servers. Same root of trust. That's a selling point compared to the upstart Let's Encrypt, who'll issue anyone a certificate, for free. Google will too, only theirs are impossible not to trust without pulling the plug on the big G.)

Did an ad company inject hostile content into the page?

Guess you’ll find out later.


Ciao.


Go read "Software updates please," to find out why no, updating a few times a year doesn't cut it when all software security is made of hopes and prayers.

Don’t worry, your home router never gets any updates either. And it’s permanently connected to the internet. Doh!


And one more thing for your thing.

And your Intel x86-64 processor has a processor to boot it, because I heard you liked processors, so Intel put a processor in your processor. It runs software you can't see and can't disable, has network access, and can snoop on your main computer. Intel occasionally admit to security flaws that let anyone on the same network take your machine over permanently, without you knowing. But it's probably fine, because paranoia is very time-consuming. They don't offer updates to fix that if your machine is too old, because reasons, and no government with leverage over Intel exists that actually understood the paragraph you just read. (Intel did, in all fairness, make it nearly impossible to fix, because of their business model: the software that needs updating is digitally signed by the people who made your motherboard, and they don't want to issue a BIOS update so you can keep using an older computer. You should buy a new one, because that's their business model.)


The tiny flaw in that plan is that Intel's software is partly burnt into ROM inside the CPU, so it can't change. (The same thing happens on ARM CPUs, only there it's just the first-stage boot loaders.) And any fixed defence can, as Sun Tzu tells us, be overcome by a determined opponent. (I've got a jailbroken old iPad, because something about fixed defences.)

In practice, most people’s computers are vulnerable to bugs in Intel’s software that’s supposed to be just running the chip, and does not have built-in spyware or backdoors for the Americans. Of course, if Intel has been given a National Security Letter, they couldn’t disclose that.

There exists a multiplicity of attacks on the Intel bootloader, and the ARM ones, at least, are all different between different ARM chip vendors.

As for alternatives: RISC-V exists as an open-source hardware architecture, an alternative to all this. It might be interesting one day, but at present it's slower than ARM.


Is stuff not really getting better? It seems like my old computer was just as good as my new one?

Well, yeah, chips haven't really got much better since 2007 or so. But don't worry, software can get worse, and suck all the performance out of them. There's an old computer joke: "Andy giveth, and Bill taketh away." Andy Grove was CEO of Intel at the time, as Bill Gates was of Microsoft.


The “Killer App” for all computers, Phones and Tablets, is, of course, a web browser.

Which is either Safari, Edge or Chrome from Google. Sorry, I'll read that again: WebKit, Chromium, or Chrome, where Chromium is the open-source version of Chrome and WebKit is the engine Chrome was forked from in the first place. Oh, and there's Firefox, I suppose.

And given that most "apps" these days are little more than web apps running semi-locally, all your other internet-facing apps are Chrome too (a nice insecure embedded version that doesn't get updates). That's not a lot of ecological variety, but at least programmers can write JavaScript everywhere. Blub blub.


(That's a call-out to the 'Blub paradox,' a very old essay in sour grapes from a LISP guy. LISP is an amazingly powerful, ancient computer language that hardly anyone knows, and, more importantly, that is almost impossible to read if you don't already know a given LISP code-base, because redefining the language into a 'Domain Specific Language' inside a program is easy in LISP, and for LISP programmers considered normal. That has the one tiny flaw that two LISP programmers from different codebases can't read each other's programs. Sort of a built-in Tower of Babel. But LISP comes from academic computing research in the 1960s, so readability by newbies wasn't ever a design goal.)


You will note that at no point, do I include a call to action, or recommend a course of action.


That would be easy for me, and nobody would do it, so as they say ‘let’s not, and say I did.’


The simple solutions are simple, but economically untenable: unavoidable liability for software defects, resting with the manufacturers, forever. (It's untenable because Apple has more money than freaking Belgium is worth (the country: not its GDP, the book value of the state). They're not taking an economic attack lying down. For a few million they can pay off any politician, and they make a hundred billion a year that they admit to.)


If we had liability, people would … well Apple alone would bribe it away. You’re talking a hundred billion a year.


In my utopian dreamland, systems would be developed with techniques proven to be safer, but those formal methods are weird and cost more, and you have no idea how much of the global economy is a result of software. At a first approximation, 80% of it. (Turn off all the software and see what economy is left… I think expecting 20% to still be there is optimistic.) Software is the other oil, and the price of oil can't go up. Not when there are levers to keep it down. Lately, of course, the billionaire class are salivating at the idea of simply using an AI to do everything, and never paying a human again. (That the current crop of AIs are just advanced chatbots, and amazingly terrible programmers, is lost on non-practitioners. And there's an underlying grift in the AI boom, as snake-oil salesmen and GPU vendors get rich off it.)


Anyway, what I did learn from writing this essay was that 'ecological variety' in computing is good for resistance to viruses and malware, and that the better paths, design-wise (certainly from a security point of view), are the roads less travelled.

(And that poem does not mean what many people think it means.)


There are no easy solutions. Every piece of the tech stack has been shaped by evolutionary pressures to maintain market share at all costs. (That’s every bit as terrible as it sounds, but eh – the sensible alternatives all got squeezed out.)


That's why we have 'everything is a browser' and all browsers are Chrome. Edge is Chrome, Safari is Chrome's estranged WebKit cousin; it's all Chrome, give or take.


You can’t get ‘reasonable’ people to run random networked software because security.

(And there's just enough variation between platforms to make native development a nightmare for modern developers, who mostly only know JavaScript.) Microsoft have gone on a multi-decade campaign to make it a massive pain in the arse to write native software, in part so they can change things and not be forever beholden to not breaking APIs written in the 90s.

Apple just break APIs at random between OS updates for both Desktop (which represents such a small percentage of their revenues they barely care about it) and Mobile.

Google, who own Android, have taken to banning developers from the Google Play store for not raising the minimum operating system version their apps run on. They would like to obsolete some things, you see, and are stymied. Also, they get paid for sales of new devices, not for happy customers.


As a nice counterpoint to this, with the advent of embedded web browsers, web content now has access to local hardware resources. Not just cameras and microphones, but hardware ports.

But I’m sure it will all be fine.

That I live in a forest, offgrid, far from population centres? Coincidence.

I’m not a prepper. Preparedness isn’t my thing.

But, once upon a time, there was variety that meant something, and the entire industry wasn’t a monoculture. But much like the banana business, cloning everything has turned out okay so far.

Oh – and there’s hell to pay coming in 2038, but that’s another problem, and another essay.
