Modern OS: Implementation Metaphor
Does the systems programming language and operating environment have to match the processor/memory architecture?
When you build an operating system, you choose a lot of metaphors. For early UNIX, the PDP-11 processor architecture and the C language captured an effective metaphor around which the rest was molded.
With successive processors, this metaphor held as technology changed and scaled up. But at some point a departure occurred: processor/memory architecture no longer "maps" onto the language, but greatly diverges from it.
The VAX architecture was named as the "Virtual Address eXtension" of the PDP-11, and at the University of California at Berkeley many of us were part of refining that abstraction. With 386BSD I myself stretched it still further to the x86 architecture, having done the same earlier with National Semiconductor's 32000 architecture. Each of these extensions explored the slightly different advantages of its architecture.
All of these variations on a theme had a common thread - the architecture mapped "one-to-one" and "onto" the language metaphor, even as the C language became standardized and extended by various standards bodies.
But computer architecture has radically changed, and the last time this correspondence held was in the early 1990s - which, interestingly, is also where the origins of various processor "bugs" lie. Perhaps their insignificance at the start of this divergence is no mere coincidence: an advantage could be had by evading the metaphor, knowing that the means of programmatically controlling it couldn't keep up.
Since then the two have grown hopelessly out of alignment, with no means to bring a "C-like" systems programming language back into a single metaphor. What we get instead are very costly (in forcing semantics to match) and complex (performing so-called "covert channel analysis") means to prove integrity. And one can be certain that further architecture changes will follow in due course to upend this again.
The benefit of restoring the metaphor is that one retains the performance (cost) and corrals the complexity, such that you can manage its growth and influence proactively, rather than repeat the chaotic, reactive damage control of the past two decades.
Rust is a systems programming language beginning in this direction. It allows more predictable semantics to express runtime expectations of a function, so that a programmatic expression can better match the architecture. Perhaps building upon this beginning might allow a more aggressive computer language implementation technology to achieve a comprehensive "impedance match" with an equally aggressive processor/memory architecture.
Then the herculean task of reinventing the modern operating system with it might begin.
We need substrates and chips simple enough that they can be inspected and meaningfully vetted. To me, this means FPGAs (or some post-FPGA technology) that can be rigorously inspected prior to loading a soft-core CPU with very little in the way of branch prediction and other "optimizations" that we've found out we don't really know how to fully secure.
Architectures have shifted so much from where we were in the 70s (PDP-11/C/UNIX) and 80s (VAX/more C/better UNIX) to today. Rust is in point of fact a big step forward, but we're still jamming old paradigms into new wineskins (mixing metaphors). I'm not sure what, exactly, we need - but we don't have it, yet.