Doom is 30, and so is Windows NT. How far we haven't come

As we approach the end of 2023, it's interesting to look back at the tech of three decades ago. Not just to compare it to today's, but also to that of a decade earlier. The interesting aspect isn't how much has changed: it's how fast it was, and is, changing. As an Australian vulture noted recently, it is thirty years since id …

  1. Headley_Grange Silver badge

    Another thing that's changed significantly since Doom came out is that you could just bring the disc and run your games on your work PC.

    1. Anonymous Coward
      Anonymous Coward

      Hmmmm, lunch-time Quake sessions...

      1. Headley_Grange Silver badge

        I used to work in a very large open plan office and walking through it at lunch time when about half the engineers were playing Doom could bring on an attack of seasickness as the world in your peripheral vision swung about!

        1. The commentard formerly known as Mister_C Silver badge

          many many moons ago

          In our office the death match server got started at lunch o'clock and got switched off an hour later. The bosses approved as it improved their "lunch over, work started again" metric.

          1. ITMA Silver badge
            Devil

            Re: many many moons ago

            We just re-christened Quake Death Match as our prescribed "network stressing tool".

            It was more fun when two of us in the IT office didn't tell the 3rd guy we were running the Bombs add-on. LOL

        2. Boring Bob

          Seasickness

          And that was the last time I played a computer game. I got a dose of seasickness once when playing Doom with my boss over an RS232 cable. Ever since, I feel sick just watching a computer game.

          1. Jou (Mxyzptlk) Silver badge

            Re: Seasickness

            Play solitaire, or turn-based strategy games, 4X strategy games or puzzle games etc etc. It does not have to be a fast-paced game like Ghostrunner or Doom and so on.

      2. Tim 11

        This was a little later - around '95 I think - when I was working for a small but multinational company, with one office in the UK and another in the US. We (in the UK) used to have an after-work deathmatch with the US guys while they were on their lunch break, using our E1/T1 lines. It seemed absolutely revolutionary at the time.

      3. Anonymous Coward
        Anonymous Coward

        ngl I LoLed

        You worked at Nomura too?

      4. Spanners
        Happy

        "lunch-time Quake sessions"

        I remember coming across an article on how they did this (?flood networking?). I showed the article to some of our techs so they could explain it to me, and they immediately stopped doing it.

    2. Plest Silver badge

      That cost two devs their jobs in our place!

      Back in '94 two devs did that in our place; 4 hours later they were escorted out of the building for a) playing games on work kit and b) DOOM having the worst network code imaginable: it would flood the Ethernet with a torrent of packets and cause entire rigs to slow down!

      We used to play it overnight in the work training rooms, we had to isolate the training room net from the rest of the company as we saw the overnight backups slow down by around 75% when we played DOOM!

  2. FatGerman

    No imagination any more

    We've lost the innovators, the people with imagination, the inventors. They've been replaced with identikit programming graduates taught to code by numbers. And a whole heap of hype around AI.

    That, or computer science peaked in 1993.

    1. Zippy´s Sausage Factory

      Re: No imagination any more

      I'd argue we've replaced the inventors with the money men. These days, it's 80s style greed - pump in the VC money, build up a user base, own the market and then sit back and rake in the mega money.

    2. Bearshark

      Re: No imagination any more

      I would also argue that the inventors/engineers have been stifled by the continuing drumbeat of "policy and compliance" rules companies impose on said inventors/engineers. This is cancerous to innovation.

      1. Anonymous Coward
        Anonymous Coward

        Re: No imagination any more

        That, and the amount of money swilling around creates a toxic mess full of risk-averse investors who shit themselves if you dare go off piste.

        Nothing rips part of my soul out like phrases such as "Can't we just use Wordpress?" or "This framework should speed things up".

        I'm an ageing techie at this point, about to turn 40, but I've been in the space since the mid 90s (as a young lad messing around figuring things out)...and to be honest, tech has become a shadow of what it once was. In my young years (early to late teens), when I was being hired as a "whizz kid", it was exciting, because nobody knew what they wanted and they gave you the freedom (and cash) to build what you wanted. Because tech was a relatively unknown quantity, there were no "experts" to come in and audit your work, or consult on what should be done or whether you're doing it correctly (in their opinion); because any tech was seen as an advantage and the playing field barely existed, there were no right answers.

        These days it's all about industry standards, what is everyone else doing? etc etc...it's all about cloning what the competition is doing and making it cheaper, rather than figuring out what they aren't doing that could give you an edge. Couple this with the various types of insurance that now exist and you get a boring, stodgy mess.

        There are too many people in the space claiming to be experts, consulting on what you should do...and not enough people just doing shit.

    3. Anonymous Coward
      Anonymous Coward

      Re: No imagination any more

      I don't think we've lost them. It's just that we want to believe innovation is always an infinite curve, going up forever, when in fact, there are both physical and biological constraints, and after a technological breakthrough, the curve always flattens.

      Look at past breakthroughs, such as cars: compare the evolution in the 2 or 3 decades from the late 19th century when they appeared with that of the past 50 years. Planes. Books. Shoes, even: since their last major evolution about 150 years ago, making the left and right shoe different, there's not been much going on there, Back To the Future II notwithstanding.

      1. 43300 Silver badge

        Re: No imagination any more

        Many of the same themes are seen in cars as in computers these days - make them difficult or impossible to repair (especially without specific tools / equipment / access to the supply chain), thus limiting their lifespan, and try to convert as many aspects as possible to subscription services rather than one-off payments.

      2. Doctor Syntax Silver badge

        Re: No imagination any more

        Unfortunately much innovation is going into busy work. We get new-look buttons and websites which will only work on a narrow range of browsers. I can think of quite a bit of innovation that could usefully be done, although some would involve unpicking a few years' worth of innovated crud.

        1. 43300 Silver badge

          Re: No imagination any more

          That's true, and the busy work on the part of software companies ends up creating unnecessary work for others! e.g. I've spent part of this afternoon creating and testing a GPO to hide the 'try the new Outlook' toggle in Outlook: the new Outlook is (despite Microsoft's claims) totally inadequate for business use, with various important functionality entirely missing.

          Microsoft is one of the worst for this - regularly trying to foist shit which nobody asked for on us, and making it a load of hassle (or impossible) to block it effectively.

          1. the spectacularly refined chap Silver badge

            Re: No imagination any more

            I've had that recently, only with MS Teams. I ran the new version and, to be honest, I didn't really notice any difference. I switched back to Classic because I got fed up with the "You've been using New Teams, do you want to keep it.." nag box coming up at startup each and every time. The old version has no such foolishness.

            1. Jou (Mxyzptlk) Silver badge

              Re: No imagination any more

              But why? You missed out on about seven other "this new feature" information popups! I am currently collecting them to recreate the "Airplane Fight Scene" in an endless loop with tons of those "helpful hints". Whenever I find the time to create it...

            2. 43300 Silver badge

              Re: No imagination any more

              I've done the opposite with Teams - set the new one as the default for all users. I've been using it for months and it seems fine (unlike the new Outlook) - barely any different from the old version apart from starting up quicker, presumably due to its pre-loaded Edge underpinnings. Microsoft is intending to enforce it in the next few months anyway, so I thought I may as well set the default now and stop the nagging to 'try the new Teams'.

              The issue with the nag box in the other direction is mostly down to old shortcuts - the two versions have separate entries on the Start menu.

              1. 42656e4d203239 Silver badge

                Re: No imagination any more

                The worst bit about 'old' Teams was that it was installed into every user's profile which, on shared PCs, is a bit of a pain. Even using the "system wide installer" just bodged the install so that each user didn't have to do it themselves.

                At least Microsoft seem to have noticed that "per user" installation only works for single user machines... given Windows is, at least notionally, "many users on one device", the "every user has to install this crapware from the Microsoft Store" thing may now die a death. I can always dream!

            3. ridley

              Re: No imagination any more

              Didn't notice a difference?

              I did: the new one didn't let me change the team name. No doubt it does, and has hidden it away somewhere, but life is too short.

              1. The Indomitable Gall

                Re: No imagination any more

                I suspect that's deliberate, as people were probably changing Team names for their own reference without realising they were changing something on the SharePoint configuration for all team members. I can imagine a few self-defined "clever" wags renaming teams to "*snore*" or "timewasting corporate comms" and getting hauled up for it in the before-times.

            4. Spanners
              Meh

              Re: No imagination any more

              I get the offer, whichever one I am using.

              If I use the imperceptibly new version I get asked if I want to switch back and vice versa.

          2. Richard 12 Silver badge
            Mushroom

            Re: No imagination any more

            Apple is however the absolute worst.

            About 90% of the Mac work I do is to keep up with Apple's insistence on obsolescence.

            A few years ago I thought it was because they wanted to kill the Mac. Then they killed x86-64 and switched to ARM64, so I wasn't totally wrong.

            I would mind less if they had usable documentation, but...

            1. Tim99 Silver badge
              Coat

              Re: No imagination any more

              Some of us find "man" (-k) more than adequate…

              Mine's the one with "UNIX Power Utilities" in the pocket >>======>

              1. Richard 12 Silver badge

                Re: No imagination any more

                man is only as good as the content.

                Nearly all Apple docs are essentially "flob (bool): enables flobbing".

                The main exception is the Unix/BSD utilities and APIs they inherited, which have pretty good man pages. Until you look at the Apple extensions, anyway.

        2. veti Silver badge

          Re: No imagination any more

          Circa 1990, computing was still newish, and exciting. People went into it because they were keen. They had ideas, they had visions, they had love of the subject matter.

          Over the next 30 years, computing became ever more mainstream. Vastly more people went into it, not because they were visionary or excited by it, but because it promised a good, steady paycheck. Those people - by now millions of them, all over the world - want someone to tell them what to do, then they'll do it, and pocket the money and go home.

          When you get a large number of those sorts of people in an industry, it changes. They become customers of a sort, and the industry looks after their needs by creating busywork for them to do.

        3. NopetyNope

          Re: No imagination any more

          Advertising is what killed it. The best and brightest of the current era get paid silly money to make adverts appear ever so slightly faster now.

        4. C R Mudgeon

          Re: No imagination any more

          "much innovation is going into busy work. We get new look buttons..."

          I see that as a symptom. Genuine innovation has slowed way down, but the marketing-driven need for "new and improved" hasn't abated, so people tinker with what's already working, not because it needs tinkering, but because that's what they're being paid for. The result: the ***NEW!*** is all too seldom actually an improvement.

          Too many people fail to understand the concept of "if it ain't broke, don't fix it".

          I'm struggling with that right now; I keep wanting to futz with this post :-/

      3. C R Mudgeon

        Re: No imagination any more

        "the curve always flattens"

        I'd argue that that's not just a human/technological phenomenon. Think of the Cambrian explosion and other similar bursts of evolutionary diversification, always followed by a settling-down.

    4. Version 1.0 Silver badge
      Holmes

      Re: No imagination any more

      Computer science peaked in 1993.

      An environment very much described earlier by Alan Cooper, the "Father of Visual Basic" - “It has been said that the great scientific disciplines are examples of giants standing on the shoulders of other giants. It has also been said that the software industry is an example of midgets standing on the toes of other midgets.”

      1. Anonymous Coward
        Anonymous Coward

        Re: No imagination any more

        Indeed - an inordinate amount of time and effort seems to be spent reinventing - but shinier of course - what we already had.

        The problem with computer science is that (a) it is not really a science - a discipline at most; and (b) even though it's not a science, the majority of people in computing have never studied it.

        And to add insult to injury, whereas other disciplines respect their past and learn from it, computing just seems to dismiss it as "old" and stick its fingers in its ears - one day it will stop being a teenager and join the rest of science and engineering .... but I'll no doubt be dead by then lol

        1. Richard 12 Silver badge

          Re: No imagination any more

          Many of the people I have worked with who studied Computer Science are "architecture astronauts", designing something that cannot be built or that real humans cannot use.

          Often both.

          Yes, it is elegant and covers every possible use case. But in the real world, there's budgets, timescales, and icky humans with their own internal models of how the system works.

          Really, the hard work is deciding what it will not do.

          1. Crypto Monad Silver badge

            Re: No imagination any more

            The hard work is deciding what it will do in *unexpected situations*, which occur all the time, and testing all those scenarios.

            1. Nick Ryan

              Re: No imagination any more

              Ah, but testing and error checking are just things that old-timers do: a waste of time, pointless, and they take skill and effort to do. In reality, error checking should always happen, with exceptions there only as a last resort to pick up the truly exceptional scenarios. It's even in the name...

              Take three examples of an algorithm to open a door:

              1a. Open the door. Exception occurred: door failed to open. Kick this exception up the call stack.

              1b. Open the door. The door was already open. Continue with the next step.

              2a. Open the door. Exception occurred: door failed to open. Kick this exception up the call stack.

              2b. Open the door. The door is locked. Inform the user that the door is locked and that they need a key to open it.

              3a. Open the door. Exception occurred: door failed to open. Kick this exception up the call stack.

              3b. Open the door. Exception occurred: door failed to open for no reasonably expected reason. Kick this exception up the call stack.

              In scenarios 1a, 2a and 3a a lazy, poor-quality developer just raises an exception regardless. Note how the result is the same in every case and is of no help whatsoever.

              In scenarios 1b and 2b error handling happens and is useful: in 1b the process can continue without failing, while in 2b the process cannot continue but the user is informed as to why. In scenario 3 the door fell apart when we tried to open it, and raising an exception is quite reasonable here: in most scenarios the door falling apart is not an expected outcome, but at least we know that it was not due to the door being already open or being locked.
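
              To make that concrete, here's a minimal sketch of the same idea in Python (entirely hypothetical: the door dict, open_door() and DoorFellApartError are invented for illustration), handling the expected outcomes explicitly and reserving exceptions for the genuinely unexpected:

                class DoorFellApartError(Exception):
                    """Raised only when the door fails for no reasonably expected reason (3b)."""

                def open_door(door):
                    # 1b: already open is not an error; just carry on.
                    if door["open"]:
                        return "already open, continuing with the next step"
                    # 2b: locked is an expected failure; inform the user, don't throw.
                    if door["locked"]:
                        return "locked: you need a key to open this door"
                    # 3b: the door fell apart; no expected cause, so an exception fits.
                    if door["broken"]:
                        raise DoorFellApartError("door failed to open: it fell apart")
                    door["open"] = True
                    return "opened"

                print(open_door({"open": False, "locked": True, "broken": False}))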

              1. Bill 21

                Re: No imagination any more

                You missed "open the door and assume it did", which is the one I've encountered most times in the last 40-odd years.

                1. Nick Ryan

                  Re: No imagination any more

                  True. Particularly annoying, because the exception then occurs on the next operation and has little real relevance to it.

                  Adding in sensible error handling takes time and effort, and testing, and is a little tedious, but the value of putting it in place is so high. Instead we now have trash software that just fails for no discernible reason, with no real way to find out what the cause was.

      2. Michael Wojcik Silver badge

        Re: No imagination any more

        Computer Science and Software Development are very different disciplines, just as physics and automotive engineering are. I've yet to see a comment in this thread complaining about "computer science" which demonstrates any actual awareness of the field.

        1. bombastic bob Silver badge
          Devil

          Re: No imagination any more

          computer science: theory

          software engineering: practical application

          1. Anonymous Coward
            Anonymous Coward

            Re: No imagination any more

            software "engineering" is currently mostly a myth ...

            engineering implies discipline, best practice, acknowledged techniques and approaches, continuity, consistency, knowledge transfer, costing and estimation.... to name but a few essential aspects.

            In the main, software development is not close to this; there are pockets, but they are few and far between.

    5. John Brown (no body) Silver badge

      Re: No imagination any more

      "They've been replaced with identikit programming graduates taught to code by numbers."

      That's true. But, as systems grow and evolve, they become ever more complex to the extent that no one person can fully understand the underlying hardware and software, so you need teams of people working on new projects and developments, and almost by definition, that means corporate involvement and all that comes with it. Arduinos, Arm, Raspberry Pi and similar are where the "bedroom developers" of old are playing these days, but they are competing with the corporate world and their big marketing budgets. It's a "mature" industry now, not a bunch of hobbyists coming up with brand new ideas in the vacuum of the nascent computer industry of the 70's and 80's. Someone upthread mentioned cars and shoes. Same thing. It's all "been done" with very little room left for true innovation, just minor tweaks and improvements.

      1. Crypto Monad Silver badge

        Re: No imagination any more

        A Raspberry Pi, with gigabytes of SSD and gigabytes of RAM, is not something a bedroom developer can understand top-to-bottom any more. You have to trust multiple layers underneath you.

        My first computer had a Motorola 6800. The "monitor" that launched at bootup was in 1KB of EPROM. *That* I could read and understand top to bottom.

        My next computer was a Commodore 64 with a 6502. That processor had around 4,000 transistors. That is small enough that you can actually replicate it out of discrete components: monster6502.com

        1. Anonymous Coward
          Anonymous Coward

          Re: No imagination any more

          My first was a ZX80 kit which came with a circuit diagram - no custom chips, I could understand everything, and the 1KB firmware ROM code was available too.

        2. david 12 Silver badge

          Re: No imagination any more

          not something a bedroom developer can understand top-to-bottom

          It doesn't even boot to a user OS. It boots to a supervisory OS running on a separate processor, which then loads the Linux-derived user OS as an application.

          1. Jou (Mxyzptlk) Silver badge

            Re: No imagination any more

            > It doesn't even boot to a user OS. It boots to a supervisory OS running on a separate processor, which then loads the Linux-derived user OS as an application.

            What you describe sounds like what every type-1 hypervisor has done ever since those CPU instructions became available; some could do it as far back as the 1960s. Whether one of the many Linuxes, VMware, Windows, Mac etc etc, they all use the same technique: the hypervisor uses Ring -1 (that naming is x86-specific, though), and what we "see" as the host OS is actually the first VM, which coordinates talking to the rest of the hardware with its drivers. Every subsequent VM runs in parallel with the host VM, unless you use nested virtualization.

            1. C R Mudgeon

              Re: No imagination any more

              The "ring" terminology, like the concept of having more than the basic privileged/unprivileged two of them, comes from Multics IIRC.

        3. John Brown (no body) Silver badge

          Re: No imagination any more

          "A Raspberry Pi, with gigabytes of SSD and gigabytes of RAM, is not something a bedroom developer can understand top-to-bottom any more. You have to trust multiple layers underneath you."

          Good comment, that got me thinking and searching the interwebs! I was vaguely aware of bare-metal 68000 and Amiga emulator projects, but a quick search shows a number of projects and tutorials on writing "bare metal" code on the Pi and accessing the onboard hardware. Sure, it's not true bare metal when some of the hardware is effectively binary blobs, but it's in the right spirit. It may be surprising to many just how much documentation is actually out there to really play properly with a Pi.

          But yes, it's still not quite the same as fully understanding the entire system. Although in its defence, there are still new things being discovered and done on the old 8 and 16 bit kit that we thought was fully understood decades ago, if you delve deeper into the retro scene! Clever screen displays on a Commodore PET, and an original IBM PC with original CGA graphics card not only getting 16 colours in 320x200, something previously thought impossible, but even using timing to simulate the missing horizontal retrace register to switch screen modes part way down the screen! (Yes, even the PC with CGA or even EGA was relatively easy to fully understand, both hardware and software, back then :-)) I did some bare metal programming of the CGA card back in the day, even independently "discovering" the 160x100 16 colour mode (and the same mode for 80x100x136 colours), but a recent demo knocked my socks off!!

    6. bombastic bob Silver badge
      Terminator

      Re: No imagination any more

      "They've been replaced with identikit programming graduates taught to code by numbers."

      In short, the indie developers have been replaced with "sweat shop" operations. Micros~1 is one of those.

      Look at what happened to Windows starting with 8. All appearance of artistry and common sense VANISHED from the UI. Micros~1 went with re-arranging and re-designing and taking AWAY functionality that had been there since 1993 (like personalization). It's all "minimalistic" 2D FLATTY FLATSO FLATASS with "Settings" instead of Control Panel, "CRapps" with ADVERTISING built in that you download from "The Store", a cloudy login that I am *CERTAIN* helps them identify your computer while you are on the internet, FORCED "UP"grades and the BSOD's that come with them, and so on.

      The entire ATTITUDE of innovation is *DEAD*, at least at Micros~1. It's more like THE BORG now. USERS WILL COMPLY. YOU WILL BE ASSIMILATED. etc.

      At least Apple products use shadowing. If they must draw everything in the UI all 2D FLATASS like Windows 1.0, at LEAST put a shadow under it to make it TRY to look nice.

      [they should have just fixed all of the bugs, and streamlined the existing code base]

    7. Anonymous Coward
      Anonymous Coward

      Re: No imagination any more

      No, we're still here, quite a lot of us in the AI space these days, because it's still the wild west and we're still figuring it all out...but the academics normalised academic tech qualifications in the profession, so it's become a lot easier to hire the homogenised dross off the university factory production line.

      Happens with every industry...

      The innovators come along and start something new and exciting...then you get the Ray Krocs coming along, who take the concept, dumb it down, add some mass-market appeal...then the universities come along, find a way to homogenise everything, and they start producing chicken nuggets at scale for the franchises to put in their many branches.

      Usually by the time academic qualifications are established, we're at the late stage of innovation...most of the work of innovation has been done, refinements have been made, and we start spitting out warehouses full of people that operate the tools and inventions, and we call them skilled professionals.

      Web design is a good example of this. We went from the fledgling internet, where building websites was incredibly manual (hand coded, manual FTP upload, tight storage space, tight parameters to work with, really annoying cross browser compatibility, bandwidth considerations etc etc), incredibly time consuming and required precision (because a typo could ruin your day)...to Wix websites, Wordpress, code completion, frameworks, basically unlimited storage, one click deployments, browser standardisation (mostly), etc...websites may have been simple back in the day, but it took a lot to get there...these days, even though websites are a lot more sophisticated and complex, the tools that exist that were built on top of the early days discovery, make these sites relatively easier to produce and far less time consuming...what I can whip up in a couple of hours now, would have taken me weeks or months 30 years ago...if they were even possible.

      I used to look after the site of a certain famous art gallery in London back in the day. It was all static HTML and they had thousands of catalogued paintings on the site that you could view. Adding a new painting to the site took about 3 days, because you had to go through each and every page that required a link to that painting and add the links manually, you also had to create a new page largely from scratch...then you would have to check for broken links, which meant after you'd uploaded everything, you had to painstakingly go through the whole site, following the new links, checking the links on the new page etc etc...
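
      These days, even that broken-link sweep alone is a few lines of script. Here's a minimal sketch (Python, purely illustrative, and nothing like the tooling mentioned below) that walks a local static site and flags relative links pointing at missing files:

        import os
        import re
        import sys

        def check_links(root):
            # Match href targets in every .html/.htm file under root.
            href = re.compile(r'href="([^"#?]+)"', re.IGNORECASE)
            for dirpath, _, files in os.walk(root):
                for name in files:
                    if not name.lower().endswith((".html", ".htm")):
                        continue
                    page = os.path.join(dirpath, name)
                    with open(page, encoding="utf-8", errors="replace") as f:
                        for target in href.findall(f.read()):
                            # Only check local relative links; skip external URLs.
                            if target.startswith(("http://", "https://", "mailto:", "/")):
                                continue
                            if not os.path.exists(os.path.join(dirpath, target)):
                                print(f"{page}: broken link -> {target}")

        if __name__ == "__main__":
            check_links(sys.argv[1] if len(sys.argv) > 1 else ".")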

      I did eventually build some tools to make this process a lot easier (in VB5 no less!)...these days though, if I wanted a job in web design I wouldn't get a look in as I don't have any qualifications in the space...despite the fact that I am just as capable (these modern frameworks are a piece of piss for someone that used to code vanilla Javascript without a search engine and Stack Overflow)...some of the sites I built are still viewable online (through archive.org)...but I can't prove that I built them...since I have no qualifications (I am too old to fuck around with a 3-4 year degree course and I have a family now, if I could just pay to do the exams or some kind of intense fast track course, I would, but you can't, which sucks)...I therefore have to accept that weirdly people with loads of qualifications and very little experience are easier to hire than an old dog like me with no real qualifications and decades of experience...it's not all bad though, because I still innovate to stay relevant...as long as I'm exploring the new areas before the squeamish and risk-averse youngsters do...I will continue to be relevant and make bank before they hit the new sectors!

      AI is great right now, there are no proper qualifications in the space, it's still a bit rough and ready and people willing to step into the unknown and experiment with building solutions can charge a good amount for their time...give it about 3-4 years though and we'll be in "one click, deploy your AI now, supercharge your business for $19.99 a month!" territory...it'll be yet another £20k a year junior role once the kinks have been ironed out, the processes standardised and solutions have been packaged.

      Thing is, you can't infinitely innovate, the steps get smaller over time and the ground you can break shrinks...so the innovators move on to larger, untouched grounds with plenty of space to frolic and make messes.

    8. Plest Silver badge

      Re: No imagination any more

      JavaScript peaked with React, and every year since, all we get is a new "revolutionary" JavaScript lib! Progress is dead in that language.

  3. Kurgan

    Computers did get faster, software did get bloated.

    Computers did actually get faster and cheaper, but the whole ecosystem ground to a bloated halt. Software is poorly written, incredibly bloated, full of useless functions, made to help software and hardware sellers make more money and NOT made to help users do their jobs. Everything is poorly written spyware full of DRM. Malfunctioning, slow, defective, made to be obsolete tomorrow morning. We are not going to experience another real revolution until the AI actually takes over and wipes us out.

    1. KittenHuffer Silver badge
      Terminator

      Re: Computers did get faster, software did get bloated.

      I for one will welcome our new AI overlords!

      At least when they are wiping me out I will know for sure that it is intentional. The wiping out that our present overlords perform seems to be much more to do with incompetence.

      1. vtcodger Silver badge

        Re: Computers did get faster, software did get bloated.

        "I for one will welcome our new AI overlords!"

        Ahem ... How do you plan to do that? Looks to me like one very likely result of the overenthusiastic embrace of AI will be that effective communication, with or via computers, will become next to impossible.

        For example your "Hail Overlords" message will likely be massaged and come out on the overlord end as "What is the hail threat today in Overland Park, Kansas?" Or maybe as a challenge to take them on in a cage fight.

        1. Yankee Doodle Doofus Bronze badge

          Re: Overland Park

          Strange to see a reference to Overland Park on El Reg. Are there people who don't live within a few hundred miles of it who have even heard of it? I'm guessing not many...

          1. sjb2016reg

            Re: Overland Park

            I've been to Overland Park, but only because my mother grew up in Southern Missouri and then moved to Kansas City. So we used to visit her friends in and around Kansas City (both in KS and MO). I was born and raised in Central NY, so more than a few hundred miles away from Overland Park, KS. But now I'm living the dream in Bedfordshire, UK!

          2. PRR Silver badge
            Gimp

            Re: Overland Park

            > reference to Overland Park [KS] on El Reg. Are there people who don't live within a few hundred miles of it.....?

            Why sure; but I was born at Columbia Mizzou {130 miles} when Overland Park was a wee suburb of KC.

            FWIW there have been some incidents in OP recently that might have made national note.

    2. Wally Dug
      FAIL

      Word 6

      Obviously this is based on my fading memory, but I think that Word 6 had most/all of the features that I use in Word to this day. And how many different versions have there been since then? So, yes, bloated software full of "features" that may benefit only a very, very small percentage of people, yet is programmed into all copies of Word/Office out there.

      And don't get me started on the requirement for an Internet connection to play MS Solitaire now, for both the scores and those annoying ads! (Yes, I could buy an ad-free version, but why should I? Solitaire has been free in every version of Windows up to 10, so why change it?)

      1. Geoff Campbell Silver badge
        Facepalm

        Re: "Yes, I could buy an ad-free version, but why should I?"

        And that, right there, is why innovation has stalled on the desktop. Nobody wants to pay for software.

        GJC

        1. theOtherJT Silver badge

          Re: "Yes, I could buy an ad-free version, but why should I?"

          Nobody wants to pay for software that they already had again, I think you'll find.

          We resent being asked to buy a new version of something every couple of years (minimum!) because that's what companies want us to do, when the one we have already works just fine, thank you. That's doubly true when the "new" software is actually measurably worse than the thing it replaces: stuffed full of adverts, stuffed full of spyware, running slower on the same hardware, demanding an internet connection for something that has no need of one, and in increasingly regular cases all of the above!

          1. StrangerHereMyself Silver badge

            Re: "Yes, I could buy an ad-free version, but why should I?"

            Microsoft is rumored to be turning Windows into a cloud-only offering. I believe it will fail miserably, but the corporate egg-heads just won't let go of the idea and are moving forward nonetheless.

            1. David 132 Silver badge
              Big Brother

              Re: "Yes, I could buy an ad-free version, but why should I?"

              "If you want a picture of the future, imagine a boot stamping on a human face a credit card being debited for recurring subscriptions... forever"

              1. C R Mudgeon

                Re: "Yes, I could buy an ad-free version, but why should I?"

                2 + 2 = 5

                Fixed in version 3.0 -- on sale for only X dollars.

            2. ecofeco Silver badge

              Re: "Yes, I could buy an ad-free version, but why should I?"

              This is not a rumor. All one has to do is follow the news of the last few years to see this is obviously where they are headed. We're almost there right now.

              1. StrangerHereMyself Silver badge

                Re: "Yes, I could buy an ad-free version, but why should I?"

                As yet they're not stamping out consumers who don't want this, but this could change in the future.

            3. 43300 Silver badge

              Re: "Yes, I could buy an ad-free version, but why should I?"

              Yes, indeed - see Windows 365 for the clearest example. Log into a Windows VM which you pay for monthly from your ...er... Windows computer. A fine example of a solution looking for a problem; unlike VDI systems such as Azure Virtual Desktop, it has no economies of scale and is harder to manage.

            4. HKmk23

              Re: "Yes, I could buy an ad-free version, but why should I?" and windows is going to be cloud only..

              Great news: then people will be encouraged/forced to develop a true Windows replacement.

          2. ianbetteridge

            Re: "Yes, I could buy an ad-free version, but why should I?"

            You do remember the days when every update to Windows and before that DOS was paid-for, right?

            1. cookieMonster Silver badge

              Re: "Yes, I could buy an ad-free version, but why should I?"

              Yeah, I do and fucked if I actually know anyone who did. Probably the reason the entire fucking world is pushing us to subscriptions for everything, because no-one is fucking interested in the majority of the shit being sold

            2. StrangerHereMyself Silver badge

              Re: "Yes, I could buy an ad-free version, but why should I?"

              You mean version 2.0 or version 2.01? I guess they figured out early that no one is going to pay for bug fixes.

              Anyway, we're still paying through our noses for updates. We just don't see it because the price is included when we buy a laptop or PC.

          3. jdiebdhidbsusbvwbsidnsoskebid Silver badge

            Re: "Yes, I could buy an ad-free version, but why should I?"

            "Nobody wants to pay for software that they already had again"

            Couldn't agree more. I've never updated anything on my laptop or my phone because I wanted to, except maybe when I moved from DOS on a 286 to (probably) win 3.11 on a 486. And that's the point of the article - how little real progress has been made since then.

            Phones have now gone the same way, not surprisingly since really a phone is just a computer and the same constraints are coming back to bite us. The act of software development has changed a lot, it's so much easier now to create complex software than it was 30 years ago with all the tools now available. But a consequence of that is all that software has to run on a standard operating system built to try and account for every possible use case so it's all so constrained. There is no equivalent today to writing 8 bit machine code in your bedroom, where you could do anything the actual hardware was capable of. Yes the hardware had limits, but at least you could program up to those limits. Now you have to go through a bloated operating system instead that most of the time feels like it's designed to physically stop you using the limits of the hardware.

            It feels like we have hit the buffers of the "pile it high, sell it cheap" model of IT. Good quality specialist software still exists, but it takes too much specialist skill, time and money for it to be for the masses.

            1. ecofeco Silver badge

              Re: "Yes, I could buy an ad-free version, but why should I?"

              "It feels like we have hit the buffers of the 'pile it high, sell it cheap' model of IT."

              We have.

          4. Doctor Tarr

            Re: "Yes, I could buy an ad-free version, but why should I?"

            You know that *car you bought new 2 years ago? Well, we've decided to launch a new model in three months' time, so you'll need to buy the new one. And, before you ask sir, the old one will be decommissioned as it's no longer supported by our service network.

            *exchange for house, TV, washing machine, etc.

          5. 43300 Silver badge

            Re: "Yes, I could buy an ad-free version, but why should I?"

            "We resent being asked to buy a new version of something every couple of years (minimum!) because that's what companies want us to do"

            That's what they wanted us to do until fairly recent years - these days they want to sell us a subscription so that they can charge us every month, with the totals coming to far more than the one-off purchases every few years.

            Meanwhile, this also results in them moving to continuous development, so instead of getting a new version every few years and bug fixes in between, we now get new features regularly - mostly things which nobody wants, which often don't work properly, and which not infrequently break or complicate core functionality...

            1. theOtherJT Silver badge

              Re: "Yes, I could buy an ad-free version, but why should I?"

              The way I see it, that's being asked to buy a new version every month (because I've not seen anything on subscription that's stable enough not to get weekly point releases, since no one bothers to actually test that things work before releasing them any more). It's the same thing, just turned up to 11.

        2. Wally Dug

          Re: "Yes, I could buy an ad-free version, but why should I?"

          "Nobody wants to pay for software."

          Actually, I already "bought" MS Solitaire when I bought my copy of Windows 10 as it comes with it, just like it did with every other version of Windows that I bought. Would you like to see the receipts?

          1. Geoff Campbell Silver badge

            Re: "Yes, I could buy an ad-free version, but why should I?"

            And that copy that you bought will carry on working, forever. What's the problem?

            GJC

            1. Wally Dug

              Re: "Yes, I could buy an ad-free version, but why should I?"

              The "problem" is as mentioned in my original post:

              "...the requirement for an Internet connection to play MS Solitaire now, for both the scores and those annoying ads!"

              Previous versions of MS Solitaire did not contain ads and did not require an Internet connection, so we're back to bloat and unwanted/unnecessary features, as per the OP.

              When you say "And that copy that you bought will carry on working, forever" then, yes, technically you are correct. But it will still contain ads, still require an Internet connection unlike previous versions and at some point in the future (14 October 2025 in the case of Windows 10 Home) Microsoft will stop supporting it. So technically, yes; realistically, no.

              1. Geoff Campbell Silver badge
                Windows

                Re: "Yes, I could buy an ad-free version, but why should I?"

                So you want support, and presumably updates, forever, for free? That takes programming time and effort. Do you give your work away for free?

                GJC

                1. Doctor Syntax Silver badge

                  Re: "Yes, I could buy an ad-free version, but why should I?"

                  Those who have paid for software could reasonably take the view that (a) they paid for something that they expected to work so if updates are needed to accomplish that then they have already been paid for and (b) if it provides the functionality they require already they don't want the bloat.

                  However Microsoft has past form in changing file formats to be non-backwards compatible. It could also be argued that the updates to the installed base needed to read those should be covered by (a) above although, of course, we know that the intent was to force customers to re-buy what they'd already bought.

                2. Yankee Doodle Doofus Bronze badge

                  Re: "Yes, I could buy an ad-free version, but why should I?"

                  Support and updates... to solitaire? Porting to 64 bits should have been the last time an update was needed.

                3. Wally Dug

                  Re: "Yes, I could buy an ad-free version, but why should I?"

                  With all due respect to you GJC, at no point did I say that I wanted "support, and presumably updates, forever, for free". My remarks about support were simply to highlight that when Microsoft withdraws support from Windows (10 Home in 2025, in my example) then support for MS Solitaire, by extension, will also end. In fact, I agreed with your comment that the software "will carry on working, forever", although it may not be supported forever. Likewise, I also agree with a later post by Yankee Doodle Doofus: "Support and updates... to solitaire? Porting to 64 bits should have been the last time an update was needed."

                  Judging by all my upvotes compared to your downvotes across all posts on this topic, the majority of readers understand my original point which was to agree about the scourge of bloatware and unnecessary features and I used MS Solitaire as an example as now it needs an Internet connection to keep your score and it also feeds you ads whereas previous - and recent - versions did not.

                  And as for your last question: "Do you give your work away for free?" Yes, in the past I have. I admit that I am also a salaried employee within the IT industry, but I have also carried out IT-related work in the past for people for no cost, even supplying them with (admittedly older) equipment that was going spare. May I ask if that has answered your question?

                  1. Geoff Campbell Silver badge
                    Pint

                    Re: "Yes, I could buy an ad-free version, but why should I?"

                    I, too, give a lot of my work away for free - I won't list it here, because special pleading is very unattractive. But I never expect anyone else to do the same, otherwise I'm not giving my work away for free, I'm bartering for time, and that's something rather different.

                    Appealing to popular applause is also rather unattractive. I've been around these parts for years, and I know that stating any kind of approval for Microsoft will get downvoted into the stone-age. But, as I say, I care not. I am old enough and ugly enough to choose my suppliers based on what I need and what they provide, not on what popular opinion tells me I should use. (But I should point out that I currently own and manage more machines running various forms of Linux than those running Windows, for the record).

                    So, yeah, it answers whatever my question might have been. Funny ol' life, isn't it?

                    GJC

                4. The commentard formerly known as Mister_C Silver badge

                  Re: "Yes, I could buy an ad-free version, but why should I?"

                  "So you want support, and presumably updates, forever, for free?"

                  I expect it to work, just like I expect everything I buy to be FIT FOR PURPOSE when the vendor sends it to market. If it doesn't then I expect the vendor to fix it FOR FREE, just like the products that I'm involved in the manufacture of need to work when we sell them.

                  Updates are a different story. An update that improves the product is chargeable; changing the (working) interface to a grey-on-grey ribbon, or adding a feature that nobody needs, is another case altogether.

                5. CGBS

                  Re: "Yes, I could buy an ad-free version, but why should I?"

                  Do I work for free...by free do you mean do most people go into the homes and cars and places of others, attach tracking devices and cameras to everything and then sell that on the open market just because the person paid for a dumb ass phone game? Is that what you mean by free? That doesn't sound free. That sounds very expensive.

            2. rajivdx

              Re: "Yes, I could buy an ad-free version, but why should I?"

              Nope, it won't.

              Microsoft even said Windows 10 would be the last version of Windows ever - and then they went back on their promise.

              Now Windows 10 nags you to update to Windows 11 and threatens to go end of life soon.

              Windows 11 is the most pointless OS ever; I don't know why it is needed. They have crippled and removed so many features and have gone back to the Windows 7 feature set pretty much.

              1. John Brown (no body) Silver badge
                Coat

                Re: "Yes, I could buy an ad-free version, but why should I?"

                "They have crippled and removed so many features and have gone back to the Windows 7 feature set pretty much."

                But, but, but...isn't going back to Windows 7 what so many posters said they wanted? :-)

                1. Doctor Syntax Silver badge

                  Re: "Yes, I could buy an ad-free version, but why should I?"

                  I doubt they'll really take it back to the W7 feature set - no expectation of being online, no ads, no subscriptions....

              2. legless82

                Re: "Yes, I could buy an ad-free version, but why should I?"

                Microsoft even said Windows 10 would be the last version of Windows ever - and then they went back on their promise.

                Except they didn't actually say that...

                1. Richard 12 Silver badge

                  Re: "Yes, I could buy an ad-free version, but why should I?"

                  They did say that.

                  "Right now we're releasing Windows 10, and because Windows 10 is the last version of Windows, we're all still working on Windows 10 ... It's all about Windows as a service"

                  Jerry Nixon, Microsoft speaker at Ignite 2015.

                  I didn't believe them, but those are the actual words that were signed off by Microsoft management to be spoken at a Microsoft conference.

        3. ianbetteridge

          Re: "Yes, I could buy an ad-free version, but why should I?"

          Yep. And you would have to be pretty unaware of the entire history of Microsoft, dating back to the "Open Letter to Hobbyists", not to expect the company to want to charge for anything and everything it makes.

        4. bombastic bob Silver badge
          Unhappy

          Re: "Yes, I could buy an ad-free version, but why should I?"

          You are close but not quite correct.

          It is more like this: Micros~1 has not produced an OS that people actually WANT to purchase to upgrade their computer systems since Windows 7.

          So they came up with a plan to LOCK YOU IN and then "drop support" and get MAJOR SOFTWARE MAKERS (right, Intuit?) to NOT support running on the older systems, THUS forcing you to "UP"grade your computer (or get a new one) if you JUST want to do your taxes this year.

          Pretty much THAT. SUCKS, doesn't it?

      2. vtcodger Silver badge

        Re: Word 6

        Come to that, why Word at all? In the late 1980s, we were running offices with no huge problems using text based Word Perfect and Lotus123. For most people most of the time, they worked perfectly well. They may well have been not only as good at actually getting work done -- which is, I think, most likely the point of PCs -- as their modern GUI equivalents, but maybe better.

        1. This post has been deleted by its author

        2. tfewster
          Facepalm

          Re: Word 6

          Don't be naive. Want WYSIWYG? Graphics and print drivers. Want a mouse or USB devices? Drivers.

          Windows has been a great platform for unifying and abstracting apps from the hardware level. And credit to Microsoft/Windows/MS Office for standardising control keys and options.

          But back to the topic of the article - what has Microsoft done for us since then?

          (Commence corrections & downvotes in ...3.2.1...)

          1. none_of_your_business

            Re: Word 6

            Not going to downvote! But pedant hat on...

            Standardised control keys and options, which Windows follows, originated back with the Apple Lisa; later there was an attempt to standardise these under the auspices of IBM called the "common user access" or CUA guidelines, which was intended to make working in applications consistent across all flavours of computer system. Microsoft was no doubt involved and paid heed to these, but I wouldn't necessarily credit them with it; it was an industry-wide thing.

            1. Liam Proven (Written by Reg staff) Silver badge

              Re: Word 6

              [Author here]

              > the auspices of IBM called the "common user access" or CUA guidelines

              Indeed so. In recent weeks I have been working on tracing the original IBM documentation, finding how to convert EBCDIC-coded .BOO files into PDFs, and making sure that there are readable copies online in multiple places.

              With some help from the good burghers of the ClassicCmp mailing lists, I've put copies of the 2 main guides online.

              Object−Oriented CUA Interface Design:

              https://www.scribd.com/document/691545759/IBM-Object-Oriented-CUA-Interface-Design-f29al000

              CUA Basic Interface Design Guide:

              https://www.scribd.com/document/693329404/IBM-SAA-CUA-Basic-Interface-Design-Guide

              Don't have or want a Scribd account? They're on the Internet Archive too:

              Object−Oriented CUA Interface Design:

              https://archive.org/download/sc34-4399-00/Object%E2%88%92Oriented%20Interface%20Design%20-%20SC34%E2%88%924399%E2%88%9200%20-%20f29al000.pdf

              CUA Basic Interface Design Guide:

              https://archive.org/download/ibm-saa-cua-basic-interface-design-guide/IBM%20SAA%20-%20CUA%20Basic%20Interface%20Design%20Guide.pdf

              This now only leaves me with the not insubstantial task of reading some 600pp of very dry 1980s mainframe documentation. :-/

              1. yetanotheraoc Silver badge

                Re: Word 6

                That archive.org first link gave me "authorization required". But I am reading it here:

                https://archive.org/details/sc34-4399-00/mode/1up

        3. Liam Proven (Written by Reg staff) Silver badge

          Re: Word 6

          [Author here]

          > text based Word Perfect and Lotus123.

          Very true. And in 1992 I took a proficiency exam in both, for a City bank, and got a record-breaking score: 99% I think. From memory, I lost the remaining mark because I added redundant extra formatting to put a graphic at top-right in WP5.1, when that was the default position.

          I got the job.

          You know what? I would choose Word 6 over either of them. They were fast, efficient, and frankly fairly horrible tools to use, partly because of their horrid nonstandard (or more accurately pre-standard) UIs.

          Whereas Word 6 for DOS does all I need and more and I'd happily use it... but getting stuff in and out of it is a pain. So I tried Word 6 for Windows... but it's a 16-bit app, and so doesn't understand long filenames and modern Windows can't run it at all. So after an epic hunt, I located Word 6 for NT on a Russian warez site. It's 32-bit, works on Win7 and above in 64-bit form, but it's still a ~1993 app: no scroll mouse support, no proportional scrollbar thumbs, Win3 file selector dialog boxes. Generally pretty clunky by 21st century standards.

          So, I tried Word 95. Works great, integrates well in modern Windows, but it uses the DOS MS Word .DOC file format that nothing much understands any more.

          So, for years, I used Word 97. It's barely any bigger, uses the same file format that worked up to 2003 and which everything supports, and it's tiny and blisteringly fast on modern computers.

          I recount all this as an illustration of the problems of trying to skip 4-5 technological generations. It's not as easy as it sounds.

          1. Jou (Mxyzptlk) Silver badge

            Re: Word 6

            > So, for years, I used Word 97.

            Which still works in Windows 11. Though I installed it in a Win2k VM, directly to c:\Office97, knowing it would never be able to get around UAC and the like without much work. And then transported it over to Windows 11 with the regkeys exported, so it knows where it is "installed".

      3. MyffyW Silver badge

        Re: Word 6

        I did my entire final year design project on Word 2. And the output still has the crisp perfection of New Order's Substance album cover.

      4. Anonymous Coward
        Anonymous Coward

        Re: Word 6

        The thing is, 90% (*) of the work that 90% (*) of the people do today could have been done on a 486 running Win 3.11 in 1993.

        Word processing, spreadsheets, reading news websites, playing music, even photo editing and manipulation. Maybe rendering HD video would be a bit difficult...

        Very little of that could have been done 10 years earlier with a C64 or BBC-B. It would have needed something really meaty. PDP-11?

        As others comment, we seem to have stagnated with innovation in the past 30 years and just opened ourselves up to being spied on, sold, and tracked 24/7.

        * 90% of statistics are made up on the spot.

        1. Zolko Silver badge

          Re: Word 6

          Word 6 running on Windows 3.1 on a 486 (DX! The luxury one with the floating-point co-processor, not the SX variant for losers) is all I needed for most tasks, yes. It even played Chuck Yeager's Air Combat, with flight scenarios I never encountered again. The graphics of "Sturmovik" are much better, but the flight experience was miles ahead with Chuck Yeager. I even managed to run it again on a DOS emulator recently.

        2. Mac Logo

          Re: Word 6

          Totally agree, but you can bet that every Reg Reader will be in the 10%. Or at least think they are. ;)

        3. Lurko

          Re: Word 6

          "As others comment, we seem to have stagnated with innovation in the past 30 years and just opened ourselves up to being spied on, sold, and tracked 24/7"

          Can't dispute the spied, sold, and tracked; on the other hand, I'm noticing a lot of rose-tinted goggles being worn round here.

          I recall a very different late 80's and early 90's to many of the commentariat: Shit hardware with dubious compatibility, very limited storage and RAM. A need to buy and install your own graphics, sound cards, and network cards because most computers didn't have that built in. Regular BSODs. Slow networks (if networked at all). Garbage tools like the WYSIWYG overlay for Lotus 123. Zero consistency of UI across most apps. Incompatible file formats left, right and centre. Crappy, low resolution screens (colour, if you were lucky). Sloooooww, heavy, portable computers. No credible mobile capability to view, access, or edit mail or documents on the hoof. Useless floppy disks that seemed to have a 50% chance of becoming corrupt if somebody broke wind in the same building. Fuck all in the way of sharing content or accessing from different locations. An internet where the height of entertainment was Usenet, or garish badly designed websites. Web graphics that these days you'd immediately assume to be a screenshot from Minecraft. Sod all e-commerce capabilities. No public administration capabilities online. Rubbish availability of information (like real time air or rail data). No competent mapping or geolocation services. Primeval digital photography. Etc etc.

          And apologies if I missed it amongst all the bashing of Microsoft, but the required response is "What have the Romans ever done for us?"

      5. ecofeco Silver badge

        Re: Word 6

        Wordpad has all the features I've ever needed. Still use it.

        Thank god MS has not killed that off... yet.

      6. C R Mudgeon

        Re: Word 6

        "Solitaire has been free in every version of Windows up to 10, so why change it?"

        Because they can.

    3. Anonymous Coward
      Anonymous Coward

      Re: Computer did get faster, software did get bloated.

      I just watched a vid on Elite - an open universe game that wasn't just a programming marvel but also a brilliant game and all in ~22kB.

      1. MyffyW Silver badge

        Re: Computer did get faster, software did get bloated.

        Or Frontier, the only game to simulate our entire galaxy from spiral arms to 1m topographical features on a 720KB disk

      2. Jou (Mxyzptlk) Silver badge

        Re: Computer did get faster, software did get bloated.

        I played Elite. On a friend's C64, and later the DOS-VGA 256 color version. Which plays identically, but with better GFX. Amazing game for its time.

        1. Crypto Monad Silver badge

          Re: Computer did get faster, software did get bloated.

          No!!! Now I have the docking music going round in my head!

    4. Mike 137 Silver badge

      Re: Computer did get faster, software did get bloated.

      Yes indeed. We keep a Win XP machine, mainly to drive high-end scanning and colour printing kit for which no later drivers are available. It runs on a single-core 1 GHz Athlon. The (not very funny) joke is that equivalent tasks can actually run faster on it than on my 2.8 GHz i5 laptop running Win 10.

    5. StrangerHereMyself Silver badge

      Re: Computer did get faster, software did get bloated.

      We've now got billion-instructions-per-second processors which are being reduced to a crawl by shitware like Electron and JavaScript.

      1. ecofeco Silver badge

        Re: Computer did get faster, software did get bloated.

        It's almost all shitware. Well, anything in the MS, Android and Apple world. That's the problem.

    6. John Riddoch

      Re: Computer did get faster, software did get bloated.

      It became cheaper to double the RAM/CPU than it did to have a programmer fix the issues.

      40 years ago it would be tens of thousands to upgrade your server, but say £5k of programmer time to optimise your code in assembly. Now it's tens or hundreds of thousands in programmer time to optimise the horrible spaghetti Java code, or £5k to add another server to the web farm. There's no point optimising code to be efficient any more unless you're doing specific embedded software on microcontrollers, and even then it's probably still cheaper to buy a Raspberry Pi to do the job...

    7. Plest Silver badge

      Re: Computer did get faster, software did get bloated.

      Some people still care. Since moving from Python to Go and Rust I've found I need to up my game a lot, and actually knuckle down and remember what I was taught in CS coding classes, else my code runs like a dog.

  4. PhilipN Silver badge

    Seagate

    .. had to ramp up production to handle all the Doom wads eating up the digital real estate as fast as they could build it

  5. Doctor Syntax Silver badge

    "Later, Mosaic Corp evolved into Netscape, and that begat today's Mozilla."

    And Netscape Communicator is still alive and hiding under the name "Seamonkey".

    Meanwhile NT has evolved into a monster that looks as if it wants to spend 30 years displaying the message "Preparing to Configure Windows. Don't turn off your computer".

    I wonder if Microsoft has an entire department devoted to designing Waiting graphics. It has a lot of them, but then it needs a lot of them.

    1. vtcodger Silver badge
      Mushroom

      Vasa Syndrome

      "Meanwhile NT has evolved into a monster that looks as if it wants to spend 30 years displaying the message "Preparing to Configure Windows. Don't turn off your computer"."

      Somewhat reminds me of this:

      Vasa or Wasa (Swedish pronunciation: [²vɑːsa]) is a Swedish warship built between 1626 and 1628. The ship sank after sailing roughly 1,300 m (1,400 yd) into her maiden voyage on 10 August 1628. ... Richly decorated as a symbol of the king's ambitions for Sweden and himself, upon completion she was one of the most powerfully armed vessels in the world. However, Vasa was dangerously unstable, with too much weight in the upper structure of the hull. Despite this lack of stability, she was ordered to sea and foundered only a few minutes after encountering a wind stronger than a breeze.

      https://en.wikipedia.org/wiki/Vasa_(ship)

      Seems to me that many of the problems with modern computing are due not to lack of innovation, but to marketing folk insisting on the appearance of innovation. Potemkin innovation as it were.

      1. Blue Pumpkin

        Re: Vasa Syndrome

        And the museum is fabulous and an unmissable visit when in Stockholm .... https://www.vasamuseet.se/en

        1. Liam Proven (Written by Reg staff) Silver badge

          Re: Vasa Syndrome

          [Author here]

          > And the museum is fabulous and an unmissable visit when in Stockholm

          I strongly agree. I was there very nearly 20Y ago and I loved it.

          I love this comparison even more. ;-)

    2. Anonymous Coward
      Anonymous Coward

      "I wonder if Microsoft have an entire department devoted to designing Wating graphics."

      If they do it's probably filled with people who formerly designed error messages. Because these days it's usually just some version of "Something went wrong!" with MS software.

    3. Jou (Mxyzptlk) Silver badge

      And I am still using SeaMonkey. Albeit only for mail, since way too many websites don't work with it any more. It requires the "Standalone SeaMonkey Mail" add-on, so a click on an http(s) link in a mail opens the default web browser.

      1. Doctor Syntax Silver badge

        I'm actually reading and writing this on SM. It's getting to be a problem, though, of trying one browser after another to see which has an acceptable combination of UI and working with a given site. (Dammit - it should be possible to use an expression like "site X" to suggest any given site, and now even that option's been pinched by you-know-who.) Even NextCloud has gone down this route. I haven't looked but I suspect it's baked into PHP.

    4. Sudosu Bronze badge

      There was one... I think it was for SCCM, or maybe it was for MOM, that indicated that the computer was "reticulating splines"... kudos on that one.

  6. Lee D Silver badge

    When I still lived with my parents and brother, I played Doom (and later Quake) over a DOS-based IPX driver that used ordinary serial and parallel cables/ports in a daisy-chain.

    You just ran a DOS TSR, then connected a 9-pin serial, 25-pin serial, or parallel cable to another computer which was also running the TSR.

    That appeared like an ordinary packet driver to the OS, so you could run IPX over it.

    Wasn't fast, but you could use it, and everything could see it as just a network card.

    We used that for gaming until we eventually invested in a bunch of ISA NE2000 10Base2 cards and ran the coax between bedrooms (we already had the BNC and T-pieces from an old bag that came with a video recorder years before that used them for video cables! They worked fine).

    I still have, somewhere, about 25m of various serial and parallel cables, 9-pin and 25-pin, and a bunch of adaptors and gender-changers for every combination, that we used to join three PCs (a 386, a 486 and a Pentium, if memory serves).

  7. jmch Silver badge

    Improvements???

    "three decades of increase in computer capacity and parallelism has mainly enabled unprecedented levels of OS and app bloat."

    The main advance from a technical use-case point of view has been the intake, processing, transfer and storage of ever-increasing volumes of data. This is mainly down to the gradual increase over three decades of processor, network and storage capacity; there are very few new ideas. In fact a gigantic amount of this volume of data is either collected unnecessarily (without user knowledge / consent, and/or simply 'we might use it later') and brings little or no value to the users, or, even worse, is generated automatically by bots in order for other bots to search it, catalogue it and reward it with good SEO rankings.

    From a computer science point of view, the most interesting things are the work on unstructured data. The "AI" fad is mostly applying neural network ideas (themselves 3-4 decades old) using newer, faster hardware and never-seen-before volumes of data (which is far too huge to properly sanitise, hence some of the garbage-in-garbage-out and hallucination effects).

    From a hardware point of view the real paradigm shift is quantum computing, and that is at about the stage that von Neumann architecture computers were at in the 50s or 60s - it will be a couple of decades at least before we reach the '1993' equivalent.

    1. Altrux

      Re: Improvements???

      Interesting perspective - but yeah, sounds about right. Given that we had NT in 1993, I think Linux has improved rather more dramatically than Windows since then. Although even then, desktop Linux peaked some time around 2010, and hasn't really improved much since. It's just constant, constant re-invention of the same things, again and again. The 3D compiz/beryl desktop I had in, what, 2006, could do stuff that a modern Ubuntu desktop still can't! Because it hasn't been reinvented yet, but I'm sure it won't be long...

      1. HuBo Silver badge
        Linux

        Re: Improvements???

        Exacto mundo! Nothing happened in the last 30 years, except ... Linux (eg. Slackware)! The vast freeing and democratization of computing, outside of enterprises and academia's labs. The software that now runs HPC, and Raspberry Pis, and smart toasters (aka phones), and goes forth and multiplies. (No mention of Linux in Liam's piece, and only yours so far in these comments!)

        1. Yet Another Anonymous coward Silver badge

          Re: Improvements???

          And cloud. Linux made compute at scale free

          Back in the 90s we were building clusters of Sun workstations, but the cost of HW and OS licenses was prohibitive for anyone not in a university lab.

          Early 2000s we were building Beowulf clusters of PCs with cheap Athlons and free Linux - but still juggling getting enough HW from Ebuyer delivered and installed fast enough (and ideally getting the customer to pay before the credit card bill came in)

          Now we just have all the compute we need in the cloud, scaleable on demand

          1. Crypto Monad Silver badge

            Re: Improvements???

            "And cloud. Linux made compute at scale free [...] Now we just have all the compute we need in the cloud, scaleable on demand"

            Except that:

            1. Cloud is anything but free

            2. Computer bureaus (where you rent time on someone else's machines) have been around since the 1960's

            There's certainly a pendulum effect going on though.

        2. Sudosu Bronze badge

          Re: Improvements???

          "and smart toasters"

          Would you like some toast?

          1. ecofeco Silver badge

            Re: Improvements???

            Or maybe you're a waffle man? How about some nice warm toaster waffles?

        3. sabroni Silver badge
          Facepalm

          Re: Nothing happened in the last 30 years, except ... Linux (eg. Slackware)!

          good grief

        4. CGBS

          Re: Improvements???

          The vast majority of people don't even know what Linux is, and I am not sure how one can view the building of a massive surveillance industry as freeing or democratic.

  8. Peter2 Silver badge

    Although the PC industry is in deep denial about it, as it was in 2021, just as it was a decade before that, computers aren't getting that much faster any more.

    I think we need to separate out this sentence slightly.

    The hardware continues to get ever faster. You can happily buy 16 core processors for desktops running 32 threads. You can buy a 96 core processor running 192 threads with all of them running at circa 5GHz.

    However, there is fuck all point because all of the software is so badly written that it will use precisely one thread running on one processor.

    The net result is that despite the hardware performance being literally thousands of times better than it was in the 1980's, the software runs slower.

    It takes longer to cold boot a modern PC than a 1980's original. It takes longer when the OS is loaded to open a word processor. It takes longer for that word processor to open and display a file.

    1. Altrux

      True - I think LibreOffice 7x, opening a ~300kb password-protected spreadsheet, takes longer than opening a similar document did on Office 2003, obviously 20 years ago. Despite it now running on a 12-core 64-bit Ryzen with dual light-speed NVMe drives.

      1. ianbetteridge

        I'm going to look a *little* dubiously at you because I know memory plays tricks.

        In my head, my System 6 Mac Plus was incredibly fast to boot. And then I watch a video of one actually booting on YouTube and it turns out it took several minutes.

    2. Filippo Silver badge

      >However, there is fuck all point because all of the software is so badly written that it will use precisely one thread running on one processor.

      It's even worse than that. More often than not, that single thread will be hanging on I/O (on a disk that's being hogged by something else that's badly written), or on a network request (gods only know what for). Most likely, to power some feature that has a 1% use-case. So everything is just as slow or slower, even though single-thread CPU performance has also grown significantly.

      1. munnoch Bronze badge

        I wouldn't exactly say "badly written". Multi-threading is hard to do correctly, and harder to do in a way that actually yields a performance improvement, except for some fairly niche applications that have obvious parallelism.

      2. jdiebdhidbsusbvwbsidnsoskebid Silver badge

        "More often than not, that single thread will be hanging on I/O (on a disk that's being hogged by something else that's badly written), or on a network request".

        Windows has always struggled whenever the network isn't responding. Back in version 3.11, if the network was slow it would hang the machine for a moment even if you, the user, weren't actually doing any network stuff at the time. XP seemed to be okay, in that you could still do something locally, albeit a bit slower than usual. Today, on W10, any network issue seems to take the entire computer completely out of action until it's either resolved or that one process finally gives up and hands the processor back to something more useful - like moving the mouse, or anything more satisfying than watching a high-spec PC fail to animate a tiny spinny wheel.

        1. Jou (Mxyzptlk) Silver badge

          Your description of Windows 10 hanging, including the mouse getting stuck, on a network condition can't be Micros~6's fault. Possible reasons:

          1. Dodgy network drivers, even though that is unlikely.

          2. Update BIOS - depending on the CPU it can make a big difference since microcode updates help A LOT MORE than many expect.

          3. Unplug some peripherals. By "some" I mean: all except for mouse, keyboard, monitor, LAN and mains cable.

          4. Update drivers, even though that's unlikely to help unless you have an unknown device in your device manager.

          5. Some software has fouled up your Windows installation. For example, bad AV programs installing a network-scanner driver as "upper filter" and/or "lower filter" on your network card.

          Recommendation: back up your OS so you can restore it by booting from a stick, update your BIOS, and try a fresh install with only the drivers actually needed - i.e. Windows installs them by itself once a network connection is available. After that, only update the driver for the GFX card, since Windows usually gets an "old, slow, but proven stable" version. If that fixes your network weirdness, you've caught it.

          1. John Doe 6

            Yes... and that description is actually exactly why I dumped Windows

            * it takes too much time

            * it feels just like having a baby - one that never grows up

            * when something goes wrong it is almost impossible to find out what

    3. Paul Smith

      Sorry, but that is simply not true; get yourself some faster storage. I have a dual-boot PC running a 5800X. Cold boot to login, Windows 11 or Ubuntu, is five or six seconds with an SSD boot disk, depending on the password. In 1984, it took longer than that just for the CRT to warm up.

      1. Yankee Doodle Doofus Bronze badge

        "Cold boot to login ...five or six seconds"

        My systems with nvme drives definitely boot faster than any machine I had 20, or even 10 years ago, but cold boot to login on a modern OS in five or six seconds? I'm skeptical of your claim to say the least.

        1. Jou (Mxyzptlk) Silver badge

          Re: "Cold boot to login ...five or six seconds"

          I am not skeptical. As soon as the BIOS/UEFI actually loads Windows 8/10/11 getting to the desktop in six seconds is pretty normal on a typical i5/Ryzen 2xxx with SATA-SSD or NVME. However, if you have TEAMS, Steam, EA-Origin, Libreoffice-Quicklauncher etc etc in Autostart you are sabotaged. If you are connected to a domain or login with a "Hey, you need a Microsoft Account Login, it is the best in the world!" it is a different story, 'cause verifying login takes a few extra seconds.

          1. Doctor Syntax Silver badge

            Re: "Cold boot to login ...five or six seconds"

            I have an admittedly now oldish Asus, dual-booted. There is no doubt at all that the W10 partition takes a lot longer to boot than the Devuan partition, even when it doesn't decide it needs to configure Windows first. Even when it gets through the initial part of the boot to the completely gratuitous hi-res background image, it's strangely reluctant to put up the password box. If all's going well it will display the desktop quickly - but then it takes ages to populate the task bar, with very little pinned on it, or to actually respond to any attempt to launch an application. So again Devuan wins the password-to-running-application race.

    4. jotheberlock

      'the software is so badly written that it will use precisely one thread running on one processor'

      Look up Amdahl's Law. If the thing you are trying to do is fundamentally sequential, as is often the case - if you need to compute a before you can compute b before you can compute c - then it doesn't matter how many cores you have available, you are going to be stuck using one core for that thing. Not that there aren't cases where they could help and aren't utilised as much as they could be but more cores/threads are not a panacea. A 96 core processor does not and never will do things 96 times as fast as a single core.
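
      Numerically: if p is the fraction of the work that parallelises and n is the core count, the best possible speedup is 1 / ((1 - p) + p / n). A quick back-of-envelope sketch (my numbers, purely illustrative):

          # Amdahl's law: best-case speedup on n cores
          def speedup(p: float, n: int) -> float:
              return 1.0 / ((1.0 - p) + p / n)

          print(round(speedup(0.95, 96), 1))  # ~16.7 - even 95%-parallel code on 96 cores is nowhere near 96x
          print(round(speedup(0.50, 96), 2))  # ~1.98 - half-sequential code barely doubles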

      1. Filippo Silver badge

        That's all true, and that's exactly the point. The software is badly written because sequential design is still the default, even though very little is actually fundamentally sequential.

        Startup of bloated programs (OSes, word processors, IDEs...) is a particularly painful case. We shouldn't be loading all of the bloat before handing control to the user, just because they might decide to invoke some plugin or whatever right away. Instead, we should start core functionality, hand control to the user, and load the bloat in the background; if and only if the user does something that invokes the plugin, they get a "please wait" until that particular bit is loaded.

        Instead, here I am, having to wait for a whole bunch of services to startup before getting my desktop, even though of those services maybe only two are actually a prerequisite for the desktop functionality.

        Of course, it's harder to make software that way. Rework everything to make it async, or just add a teeny 300 msec pause? Who'll even notice? Well, those 300 msec pauses add up (and multiply, even, as cross-dependencies flourish). We should find a way to do better.
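
        For what it's worth, the load-in-the-background pattern is simple enough to sketch. A minimal illustration in Python (a hypothetical plugin loader, not any real app's code):

            import threading, time

            _plugins = {}                 # name -> loaded plugin
            _lock = threading.Lock()

            def _load(name):
                time.sleep(1)             # stand-in for slow plugin initialisation
                return f"<plugin {name}>"

            def preload(names):
                # Kicked off *after* the UI is up, so the user isn't kept waiting.
                def worker():
                    for n in names:
                        p = _load(n)
                        with _lock:
                            _plugins.setdefault(n, p)
                threading.Thread(target=worker, daemon=True).start()

            def get_plugin(name):
                # The only "please wait" happens if the user invokes a plugin
                # before the background loader has got to it.
                with _lock:
                    if name in _plugins:
                        return _plugins[name]
                p = _load(name)
                with _lock:
                    return _plugins.setdefault(name, p)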

        1. Crypto Monad Silver badge

          There are a few observations I'd make.

          1. The sequential performance of even a single core has increased by at least 3 orders of magnitude since the 1980's. DRAM rather less, maybe only 1 order of magnitude - hence the need for three tiers of cache inside the CPU.

          2. Software used to be 4 orders of magnitude smaller. The original Mac in the mid 1980's had a 400KB floppy drive, on which you could put the OS/Finder, MacWrite and MacPaint. Now you'll be lucky to install any useful OS plus application in 4GB. Today's software has more features, but it is not 10,000 times more useful.

          Or to take a PC example: in the early 1990's, a 386SX with 2MB of RAM was plenty to run Windows 3.11 and WordPerfect. Since the software was small, it was quick to load from an old spinning hard drive.

          3. The bootup time of a PC is not CPU-constrained anyway. It's mostly constrained by external storage - loading megabytes here and megabytes there, in hundreds of separate files - and partly by drivers initializing hardware.

          Efforts have been made to parallelize startup (e.g. systemd does this). However, the small saving in bootup time is offset by the subtle bugs you get if you've not correctly declared dependencies, and things end up starting in parallel which shouldn't.
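
          To be fair, declaring a dependency is the easy bit; knowing it exists is the hard bit. A minimal sketch of a unit that must wait until the network is genuinely up (unit name and binary path invented for the example):

              [Unit]
              Description=Service that needs the network actually online
              # Without these two lines, systemd may start it in parallel with networking
              Wants=network-online.target
              After=network-online.target

              [Service]
              ExecStart=/usr/local/bin/myapp

              [Install]
              WantedBy=multi-user.target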

          1. Vometia has insomnia. Again. Silver badge

            One of the reasons I hate systemd is its race conditions, e.g. attempting to mount network drives at the start of network configuration rather than after its completion, so it always fails. That was a previous release of Mint - I think they subsequently fixed it - but reading the documentation was of no help at all, as it was variously inconsistent, gibberish, wrong or missing, and all of it seemed to be an excessively complicated means of configuring something simple. My late gf's PC still uses it, and I'm still not ready to change it to something like MX Linux, so I've had to set up a job that retries NFS mounting after systemd has finished booting, though its habit of periodically reconfiguring the network for no reason at all can still screw it up.

            Most of the time I'm not that bothered whether booting takes 5 seconds or 5 minutes; it's restoring my session that's the time-consuming and fiddly part, and nothing seems to get it even slightly correct. Of individual applications, Firefrog actually gets that one right: even if it can't remember its screen position, at least it remembers my tabs and other shit.

            I suppose tolerance of boot times partly depends what you're used to; back in the olden days my boot times were about 1 second (Dragon 32, albeit to crappy MS Basic with a cassette deck for storage) to around 10 minutes or so for a M68K or MIPS Unix box with lots of services or a VaxStation with VMS. Some of the old beards talked darkly of "most of the day" to get a mainframe from a cold start to a point where it could do useful work but I never knew whether or not they were exaggerating. We were using IBM 3090s at the time which weren't slouches either in CPU power or I/O so I'm not sure about that, unless some of the services were just particularly egregious to coax back into life. Also uncomfortable memories of trying to persuade a Vax mainframe to boot after a scheduled power outage (it was early and its army of butlers hadn't turned up yet) and realising whatever witchcraft its FEP needed wasn't even vaguely the same as the little desktop version.

    5. Sudosu Bronze badge

      I remember the fastest PC I have ever seen; it was a Pentium 90 at a store with Windows 3.1.

      When you clicked on a program (from Microsoft Works), it was open before you finished pulling your finger off the mouse from the second click.

      I have had a lot of then-current machines over the years, but that P90, with that operating system, was fast.

    6. ecofeco Silver badge

      "It takes longer to cold boot a modern PC than a 1980's original."

      I was with you until here. I could make and then pour coffee by the time my 1980s PC would boot. My current PC boots faster than I can push my chair back.

      What era did you decide was the end of modern?

    7. Liam Proven (Written by Reg staff) Silver badge

      [Author here]

      > The hardware continues to get ever faster. You can happily buy 16 core processors for desktops running 32 threads.

      Yes and no.

      A freeway with more lanes does not make it any quicker for one car to move along it.

      (I say "freeway" not motorway because British motorways seldom get to the multilane madness of US freeways.)

      And bearing in mind Dr Andy Tanenbaum's dictum:

      "Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway."

      –Andrew Tanenbaum, 1981.

      Adding more, smaller cars makes it _slower_.

      Never mind the width: feel the quality.

      Adding more cores does not help after a certain point. This is a hard limit, called Amdahl's Law:

      https://en.wikipedia.org/wiki/Amdahl%27s_law

      So, while you're right up to a point, the thing is that there *is* a point. Single-thread performance remains more important, overall, than multithread. Adding more faster storage only helps so far. When you get to that point, where stuff stops scaling, you have to explore other avenues.

      Bigger computers only absorb bigger software for so long. After that the only way to make them faster is to make the software _smaller_.

      We are at that point. Nobody's seriously trying. They're pruning the edges into pretty shapes (GNOME >= 3, kernels dropping whole types of drivers, new languages banning certain types of operations on variables) but nobody dares take an axe to the trunk of the tree that bears dollar bills as fruit.

      1. Yankee Doodle Doofus Bronze badge

        "but nobody dares take an axe to the trunk of the tree that bears dollar bills as fruit."

        Poetry! Well said indeed.

  9. Anonymous Coward
    Anonymous Coward

    Why so little change..that's easy to explain

    It all died in 1997.

    1) Because in 1997 the SEC changed the very obscure Initial Investor Rule, which meant that all VC-financed startups quickly became just Pump and Dump scams. Hence the Dot Com 1 bubble. Which blew up in 2000.

    As a by-product the startup software business was almost totally financialized, becoming little more than the public face of a financial scam as the VCs moved to making all their money from management fees rather than actually cashing out successful companies. Ones that made useful software. That made money. So since 2000 the fully financialized software startup business, a.k.a Dot Com 2.0, has just been various schemes for the VCs to fleece various Bigger Fools. Usually desperately searching for yield in a very low-yield post-2008 world. Actual useful (non-buggy) software that did something new and innovative and made money? Not going to happen. VCs are about as risk-averse to new ideas as Hollywood / US TV script buyers. Sheep with money.

    2) In 1997 Amelio made the huge mistake of letting Steve Jobs back inside Apple. And Jobs quickly killed the whole ecosystem of small / medium-sized Mac software companies who had been the driving force behind almost every new software category since 1984. All gone by 2000. Because all your (revenue) base are belong to us (Steve).

    Pretty much every big hit Win 16/32 product of the 1990's first shipped on the Mac. Since 2000 the MacOS software market has not only been a very small niche (<2% revenue), it has not produced one single innovative product since. One that created a new market segment. With serious revenue. Which had been pretty much a yearly occurrence between the mid 1980's and mid 1990's in the MacOS ecosystem.

    3) In 1997 it became very obvious that Andreessen's utterly boneheaded "the Browser is the Platform" public statements had not only killed Netscape, by painting a huge bullseye on them that MS could and would hit no matter what the cost, but that Netscape could not even compete technically with MS's panic-bought licence of the Spyglass code-base, which was renamed Internet Explorer. So Netscape lost the Web browser war to MS for a decade. Until the rewrite of Navigator took off, due to the technical incompetence of the MS IE team. But in the end the old Netscape tradition of technical incompetence killed Firefox too. I know, let's create a new language. That will fix the problems in Firefox. Giving the crown by default to Google Chrome. For the next decade and more, until it suffered its own inevitable code collapse in the last year or two.

    And that, boys and girls, is why the software business has been so boring, so uncreative, and so derivative for the last 30 years. Blame the VCs mostly. If you thought a typical High Street bank manager was unimaginative, conservative and risk-averse to innovation, then you have never met the hucksters in very expensive suits who have infested Sand Hill Road and its environs for the last 25 years.

    There is no reason innovation should have stopped. The raw processing power now available is staggering. There has been some software innovation on mobile platforms, but the form factor prevents anything truly breakthrough happening. Desktop is the platform for radical breakthroughs. But that will not happen with the current VC 2/20 model. Nor while the IRS allows the various VC vehicle financial scams to continue.

    1. Zolko Silver badge

      Re: Why so little change..that's easy to explain

      It all died in 1997

      Funny, that: it's the year I discovered Linux, with KDE 1. Since then, mostly nothing happened. Even Chrome is only KHTML from the Konqueror of those times. Guess Liam will have to re-write this article in 4 years.

      1. Anonymous Coward
        Anonymous Coward

        Re: Why so little change..that's easy to explain..nothing much happened there either..

        I first built Minix in 1987 and ten years later saw little new in Linux 2.0. Now the OpenSolaris source code, which was released a few years later - that was a fascinating read.

        KDE was basically a cross between Digital Research GEM and HP NewWave with some minor bells and whistles thrown in. Both of those were late-1980's products. BeOS was doing far more interesting things around the same time, and NeXTSTEP was forever betaware which looked good at trade shows, not so pretty up close.

        By 1997 the various HTML rendering engines were usable, JavaScript mostly worked (but very slowly), but all the heavy lifting was done using CGI. Or plugins. And it remained that way for the next decade. But eventually the HTML 4.0 rendering was fast and stable, JS was incrementally compiled and VM'ed, and CSS 2 (finally) worked predictably. Cleaning up the dog's dinner that was CSS 1. But none of this was revolutionary over what was kicking around in 1997. Just incremental improvements. Or rather, getting the W3C specs to actually work as advertised.

        So not a lot has happened really in either of those parts of the software universe either, considering how much had changed from 1987 to 1997. Less so in the Unix world, where it always seems to be stuck somewhere around 1978. In the Disco Years. Decade after decade. Because pretty much everything "new" in the last 30 years in Unix/Linux land is just a rehash of what was seen on various Unixes between the mid 1970's and late 1980's. One way or another.

        I think I'll go fire up some Prog Rock now. To remind me of the "good old days". Cue 10-minute drum solo.

    2. wobball

      Re: Why so little change..that's easy to explain

      Personally I think the desktop format is a huge drag on innovation.

      I'd hoped we'd all be chatting freely with our digital assistants by now with some version of G Glass hooked up to our personal devices aka phones.

      With some serious back end support as per G Goggles and some of that there AI to help.

      Now THAT would be an innovation despite it all being pretty current and implementable.

      1. Anonymous Coward
        Anonymous Coward

        Re: Why so little change..that's easy to explain..desktop is still king

        For comms, notifications, low-res browsing, and single-task low-complexity apps, non-desktop often works. For any sort of serious content creation, editing - basically work - the desktop / laptop format is still king. Just like it has been since the first VDU terminals in the 1960's.

        Your "fantasy platform" is no different from any number of "future innovations" video demos that have been done since the 1970's. Although the most infamous one (i.e became an industry joke) , John Scullys "Knowledge Navigator" demo video in 1987 (from Apple), when it was made this genre was already a hackneyed cliche. In 1987. The Starfire technology simulation video done by Bruce tognazzini at Sun in the early 1990's was the only one I've seen that was in anyway rooted in reality. Pity it never worked out. Everything else, little more than ill informed Hollywood VFX / CGI eye candy.

        I'd guess you've never seen the pure hatred directed towards the early Glassholes when they tried to wear a Google Glass headset in public. I remember one Tech Bro wearing late-beta / early-release Google Glasses in a cafe in San Francisco. Such were the death stares directed towards him that he lasted less than 10 mins before he took them off and put them away. If he had not, I would have reminded him of local laws, which he was breaking every time the red light came on. They came and went as something you saw in public within a few months.

        So I'm afraid all those ideas are total non-starters in the real world. The sort of things that will only ever be proof-of-concept simulation demos at user interaction / interface conferences, or on a video loop at trade show stands. That's all.

        I hear you can still get really good deals on the discontinued Google Glass gear. Nobody wants them.

  10. Bebu
    Windows

    Never seemed to gain critical mass...

    The one hardware/software technology that I thought 30 years ago might have sprouted wings was pen computing. Back then probably called light pens. I recall having high hopes for GO Corporation's PenPoint OS (defunct 1994) and later the Apple Newton, but even today pen-based systems are the exception. Handwriting recognition on an Android tablet (GBoard) is pretty decent, even with my scrawl.

    I suspect that adding pen support to an existing interface/system doesn't fully utilize the distinctive features of the pen interaction. Designing a pen centric system from scratch might produce the critical mass required.

    I always imagined writing mathematical equations or formulae from predicate calculus using a pen would be a great deal easier than typesetting them with eqn or LaTeX. Typesetting something like the time-independent Schrödinger equation in eqn is just plain laborious.
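
    For comparison, even the LaTeX form of that equation is a mouthful, though at least it is compact:

        % time-independent Schrodinger equation, typeset in LaTeX
        \[ -\frac{\hbar^2}{2m}\nabla^2\psi(\mathbf{r}) + V(\mathbf{r})\,\psi(\mathbf{r}) = E\,\psi(\mathbf{r}) \]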

    I can not wait another 30 years - I would be getting a telegram from the queen... likely delivered in person.

    1. ianbetteridge

      Re: Never seemed to gain critical mass...

      Apple certainly seems to be selling quite a lot of its "Pencils" (I hate that they're calling it that).

    2. Neil Barnes Silver badge

      Re: Never seemed to gain critical mass...

      Years ago I imagined an application to run on something like a tablet (long before such existed): one drew a circuit diagram of arbitrary complexity on the screen, and the device emulated/simulated that circuit in real time - take input and output from e.g. the headphone and mic/line sockets.

      We're finally at the point where such might be practical... and there's no longer a need for it. Oh well...

    3. Zolko Silver badge

      Re: Never seemed to gain critical mass...

      For me it would be drawing by hand, it's so much faster than using any graphics software.

  11. StrangerHereMyself Silver badge

    Monopolistic stagnation

    Monopolies eventually stagnate, and in the case of Microsoft the company has only been adding stuff that benefits its bottom line since Windows XP. Back then it was audio and video (Secure Channel) and later it became online advertising and cloud. And in the future we'll probably see AI in Windows 12 everywhere.

    The core operating system hasn't changed, or barely.

    1. ianbetteridge

      Re: Monopolistic stagnation

      Microsoft has been adding stuff which benefits its bottom line since 1976. It's a company, they tend to be interested in that sort of thing.

      1. Geoff Campbell Silver badge
        Windows

        Re: Monopolistic stagnation

        Indeed, and one might argue that it benefits their bottom line precisely because MS are producing products that the mass market of users actually wants, or indeed needs, to use.

        This is not a popular opinion around here. Like I care.

        GJC

        1. Doctor Syntax Silver badge

          Re: Monopolistic stagnation

          They have a captive user base. That's the only reason they can abuse their customers in this way.

        2. Ian 55

          Re: Monopolistic stagnation

          You did see that OpenAI, a company that Microsoft has given billions and billions and billions to, won't use MS Teams, didn't you?

        3. ecofeco Silver badge

          Re: Monopolistic stagnation

          You could argue that. You would be wrong.

          A quick google returns many results for government fines and class action suits for illegal business practices over the last 30 years.

        4. StrangerHereMyself Silver badge

          Re: Monopolistic stagnation

          I believe you're wrong. No one wants to have to pay Microsoft every month for access to their "virtual" PC in Azure.

          Their first commitment is always to their bottom line, not to what customers need or want. Remember the schizophrenic mobile interface in Windows 8 (parts of which are still with us today)? Merely driven by their desire to dominate the mobile space too.

  12. IGotOut Silver badge

    Sure...

    Here I am writing this on a device in a Greggs that has more power than most back then could dream of. Then I'll upload a photo of a quality that only pros could take after spending hours setting up, which can then be viewed by billions of people anywhere in the world in fractions of a second.

    Then at the weekend I'll travel somewhere I've never been before, using the same device to navigate me to within a few metres of my destination. There, I'll film a video that previously would have cost tens of thousands of pounds of equipment, on the same handheld device. I might do a 30-second edit that would've taken hours, and throw in a few effects that would have cost a year's salary to do.

    Then later in the evening I'll choose something to watch or listen to from millions of instantly obtainable choices, transmitted from the same handheld device to my home hifi.

    TL;DR?

    You don't notice the groundbreaking innovations, because they are everywhere around you and are taken for granted.

    1. heyrick Silver badge

      Re: Sure...

      On the other hand, the innovation of having damn near everybody always connected means that nobody needs to care any more.

      Once upon a time, simpler software was burned into ROM. It may have had a few bugs and quirks, but on the whole it worked, because it got very thoroughly tested before anybody committed to making a batch of ROMs. Even releasing stuff as EPROM was costly and took time programming them. So getting it badly wrong could well sink a company.

      These days? Updates are near instant, just shovel out any old crap that sort of works and deal with the bugs when the users complain (makes you seem responsive), then release a new version (etc etc etc). Yes, it's great to have easy updates, but then it would have been great to see my calculator app not having five updates since I've had this phone. Five. What the hell do they manage to get wrong in a basic calculator? (that they clearly failed to fix the first time)

      I suppose the biggest and most important "innovation" is how the professional software industry managed to convince us that this sorry state of affairs was "normal".

      1. Anonymous Coward
        Anonymous Coward

        Re: What the hell do they manage to get wrong in a basic calculator?

        The curve at the corners of the buttons wasn't quite right, there were too many, then too few, pixels between this row/column and that one, and the background of the number display was just the wrong shade of slate blue.

        Duh.

    2. Headley_Grange Silver badge

      Re: Sure...

      I did notice these groundbreaking innovations.

      "Here I am writing this on a device in a Gregs, that has more power them most back then could dream of."

      True - but I noticed that on my Blackberry 7230 20 years ago - and it was a lot easier typing than on my iPhone today.

      "Then I'll upload a photo of a quality that only pros could take"

      True - but most people will see it on a screen smaller than the 6x4 prints I used to get from my Olympus Trip in the 80s.

      "..using the same device to navigate me.."

      True - and I noticed that on my Blackberry 8310 17 years ago.

      Video and editing - not in my use-case so I don't know how far back I'd have to go, but I'm certain you're right in this case and the benefits of speed, memory etc. are much more applicable.

      It obviously depends massively on your use-case, and I'm not pretending for a moment that I sit close to the centre of the normal curve for mobile device use. For me, though, the article's main thrust is true. My iPhone might be massively more powerful than the Blackberries of the 00s, but it's not much more useful to me than the BB 8310 was back in 2007ish. The reason I binned my Blackberry Classic wasn't its lack of oomph, it was because it didn't integrate with anything else I own; Notes, ToDo, Contacts, Calendar, music, podcasts, etc. - none of which needed much processing power - were all a nightmare because of Apple's approach to walling in its products and services. If Apple brought out a device with a keyboard of the same quality as the Classic, with its in-box integration, then I'd snap one up.

      1. Anonymous Coward
        Anonymous Coward

        Re: Sure...

        And the prints from your Olympus Trip were available in under a second, of course.

        And if you didn't like them you could reframe and retake them immediately ....

        The real innovation and take up is outside of this community - 30 years ago it would not have been possible for huge sections of society to post the unbelievable amounts of rubbish that they do now ... am in two minds as to whether this is progress however

        1. Zack Mollusc

          Re: Sure...

          Well, you could actually look at the prints from your Olympus without being nagged to create an account, without having to separately opt out of 10 billion cookies from 10 billion intrusive jerks, and without having to fight off banners and adverts sliding over the content from all angles. Also without having to pay for the bandwidth and processing of all those annoyances.

          1. Ken Hagan Gold badge

            Re: Sure...

            I can look at my *own* pictures without having an account with anyone.

            Sharing pictures with others is more hassle, but that was true in the print era as well (and was quite pricey as I recall).

    3. Anonymous Coward
      Anonymous Coward

      Re: Sure... but that's all hardware improvements, not software

      Back in 1996/1997 I got to play with lots of toys. We had CDMA / TDMA cell phones. Lots of them. We had a whole bunch of digital cameras. For one project we had a whole bunch of broadcast / near-broadcast quality video cams, and the cards to get the video onto the Macs for video editing.

      The various handheld PDAs had been kicking around for years. Not just the Apple Newton but the various Magic Cap handhelds. And don't forget the Psion 3. The first real mass-market success PDA, the Palm Pilot, came out in 1997. As did the first 802.11 wifi network access points, although it was not for another year that the laptop chip support was there. And 1997 was when I finally broke down and got DSL at home: 3.5Meg down / 512k up. Although everywhere I had worked had high-speed backbone access (high Mbit) since early 1993. Getting those work orders through PacBell in 1993/94 was always fun.

      So what you see in your phone is a miracle of hardware engineering and hardware integration. That's all. On iOS you are mostly running the core NS stack from 1993/95 with the mobile stuff tacked on after 2007. On Android you are running pretty much the core 1996/97 JVM / JDK with some rather nice UI / device libraries on top. But all the software in the apps you are running on your phone was running in 1997. Just on a very different form factor. Over multiple devices/peripherals. With very different UIs. And very, very slowly.

      For reference, the high-end Mac in 1997 had a 250MHz PPC604, and the one I used had about a $1K+ memory upgrade, to over 200Meg. You could get 300MHz Pentiums, but in head-to-head benchmarking of high-end video processing code the PPC was a good 50-100%+ faster per MHz. The SOC processor in my Garmin watch has more processing power and memory than the stock high-end machines we used in 1997.

      If only software had advanced like hardware has. Thanks hardware guys.

  13. Ian Johnston Silver badge

    I'm writing this on a 2009 ThinkCentre M58, which does me just fine. Currently running a bastard combination of Linus Torvalds' attempt to ignore 30 years of kernel design coupled with Stallman's attempt to recreate a fifty-year-old command line interface, but I have hopes for Haiku.

    1. Doctor Syntax Silver badge

      So you're telling me that kernel design was good enough 30 years ago and CLI was good enough 50 years ago. Agreed. I'll add that GUI design was good enough about 25 years ago. And search was good enough 20 years ago. Some of these have gone down hill since.

  14. ianbetteridge

    The thing that people don't seem to have cottoned on to is that conversational interfaces are the operating system of the future, and will replace or augment windowing GUIs for many tasks. Go back and look at Apple's “Knowledge Navigator” video of 1987(!) and you get a pretty decent idea of where all this is heading. And I don't think it's that far away.

    My worry about that is that it's all going to be cloud-based and thus effectively rented, rather than locally based and owned (and tinkerable).

    1. Doctor Syntax Silver badge

      If, by conversational interfaces, you mean something that ignores precisely given commands and instead tries to second-guess what I want, that's not a future I particularly want.

      1. Headley_Grange Silver badge

        The only thing I use voice recognition for is to add Reminders while I'm driving. You know how it is, something pops into your mind while driving, so I tell my iPhone, via the car interface, to add a Reminder. It's most entertaining, especially after a long trip, reading the resulting reminders and trying to work out what I wanted to be reminded about. I'm not Scottish, I have a southern English accent, I haven't got a particularly deep or shrill voice, but my iPhone doesn't seem to be able to understand about half the words I say and in my experience trying to use voice recognition on the iPhone is as bad as those two Scottish blokes in the lift.

        1. yetanotheraoc Silver badge

          reminders

          "It's most entertaining, especially after a long trip, reading the resulting reminders and trying to work out what I wanted to be reminded about."

          Why not just record your voice?

          https://support.apple.com/guide/iphone/make-a-recording-iph4d2a39a3b/ios

      2. Zack Mollusc

        Relax, scrote, the conversational interface will record and datamine your precise commands and return whatever the highest-bidding third parties want you to see.

    2. heyrick Silver badge

      "conversational interfaces"

      Hey, Jessica, can you make me a bowl of linguine please?

      <house promptly burns down>

      <three hours later>

      "Your sushi kebab is now ready. Please drive six thousand miles across the Atlantic in order to collect. Thank you."

    3. Dan 55 Silver badge

      They said Minority Report style interfaces were going to be the interface of the future until people worked out that it had a major problem in the shape of Gorilla Arm.

      Then a unified phone/desktop UI was going to be the future until people worked out it was terrible at both things but especially desktop UIs.

      Then Google Glass was going to be the future of interfaces until people worked out they looked like Glassholes.

      Now conversational interfaces are going to be the future yet they can't even do homework properly.

      1. Jeff3171351982

        glassholes kind of forked to dingleberries with cameras on people's doorbells

  15. Jeff3171351982

    Telemetry

    The biggest change for me is telemetry. I became aware of it with Windows 10, and it is ever-increasing. I use Linux not because I love it, but because I try to reduce telemetry. Almost every time I consider engaging with some technology (apps, TVs, transport, discounts), I think about telemetry and try to find the path which has the least of it.

    1. karlkarl Silver badge

      Re: Telemetry

      It came with DRM.

      People should have said "no" back when it was introduced in Windows XP.

      Too late now.

    2. Geoff Campbell Silver badge
      Windows

      Re: Telemetry

      Why do you do that?

      No, don't knee-jerk me with some abusive rant, please think calmly and carefully and explain to me the actual reasons behind that decision, and allow me a minute or two to lay out why I don't care at all.

      Thing is, if you're running, say, Microsoft Windows on your PC, you already have everything you are doing, creating, and consuming within the bounds of Microsoft's OS and applications. You might care about that, you might not - I don't, personally. But the point here is that extending that boundary out to some servers in a data centre somewhere that are 100% owned and controlled by Microsoft makes absolutely no difference whatsoever to what you can know about what Microsoft are doing with that data. It's an emotional, illusionary difference.

      If you don't trust Microsoft, don't use their products. That's cool, and a perfectly logical stance. Saying that you trust Microsoft just so long as they only control everything in your local PC only makes sense if you fully, Tempest-level, isolate that PC from any external communication whatsoever. And actually probably not even then.

      And saying that you trust, say, Apple more than you trust Microsoft is also just nuts. They are all just corporations. Trust no-one, ever. And pretending that drawing some geographical boundary around your data changes the trust/don't-trust question is just nuts too.

      Me, I use Microsoft products because they are under the gaze of a billion hackers and regulatory wonks world-wide. Because nobody trusts them. So I can be reasonably certain that any abuses they try will be fairly swiftly picked up and publicised. And also because I have a commercial relationship with them, so I have a legal recourse if they start screwing me.

      Once again, not a particularly popular view around here. And once again, I care not one jot.

      GJC

      1. Doctor Syntax Silver badge

        Re: Telemetry

        Go and look at Microsoft's T&Cs. Look for what they specifically exclude themselves from grabbing from your PC. Is it enough? And why should they grab anything?

      2. heyrick Silver badge

        Re: Telemetry

        "absolutely no difference whatsoever to what you can know about what Microsoft are doing with that data. It's an emotional, illusionary difference."

        Didn't downvote, but you know, there was a time not so long ago when you could buy a piece of software, install it, and it did stuff for you. Even better, you could yank the network connection out of the back and it carried on doing stuff; it didn't fall over with licensing errors or whatnot. Better yet, you could leave the network cable in, as there was a pretty good chance it was just a LAN with no internet access. Stuff worked, stuff (mostly) did what its adverts said it did, and very few things bothered to contact the mothership.

        These days, it's rare to find something that doesn't try to spew to the mothership with worrying regularity. And these aren't necessarily random pointless data points. You can assume that anything stored in the cloud is being mined for information, key phrases and such. Add to that (certainly with mobile) a fairly continual reporting of your location. From this it's not hard to determine things like where you live, where you work, where you shop, if you have children and if so where they go to school... all of these little individual data points (and don't believe the bullshit about "anonymised") can make up a good enough idea of the kind of person you are, income bracket, interests, family connections, etc etc.

        "if you fully, Tempest-level, isolate that PC from any external communication whatsoever"

        Once upon a time, this sort of thing wasn't really necessary. Sure, Windows would try to call home to check for updates and maybe check if the licence was valid, but above and beyond that it wasn't normal to spew lots of information. These days, it's just accepted as how things are.

        "so I have a legal recourse if they start screwing me"

        Really? You accepted the updated terms and conditions briefly shown to you on the 12th of October upon which you clicked "Accept" after three quarters of a second to make the annoyance go away...

        "And saying that you trust, say, Apple more than you trust Microsoft is also just nuts."

        Totally agree. Apple has a very well polished image. Microsoft is the old Evil Empire (complete with rusting Death Star and epic stormtrooper music). But at their heart they're tech companies out to milk you for every cent they can get, so nobody should pretend that one is any holier than the other.

        "extending that boundary out to some servers in a data centre somewhere"

        Which is all well and good until there's some sort of connectivity problem at which point you'll be thumb twiddling.

        Yes, I know you can say "if the hard disc fails then you still have copies of your files"; both sides have pros and cons. My main concern about putting anything in the cloud (which I rarely do) is that once it has left the zone that you personally oversee, you don't know who has access to the files, and who may have made copies. This probably isn't a big concern for domestic users, but companies sticking their special recipes, new product ideas, and HR into the cloud? Well, wouldn't that be a gold mine for competitors? What a shame it would be if somehow a copy of that stuff ended up on a USB key... ...now assure me that this sort of scenario has never happened. You can't, and it probably has.

        And the kicker? Once upon a time espionage required breaking into places, working to compromise disgruntled staff members, that sort of thing. Now all you really need to do is target those who hold the keys to the data centre, because in these cloudy days, that's where all the good stuff is.

      3. Jeff3171351982

        Re: Telemetry

        Some people may prefer not to have their in-laws visit. Even if one's in-laws never visit one's home, the in-laws will still--over time--know what one is up to. I actually like my in-laws, and am happy to have them visit and stay at my home. However, I would not be able to work with them sitting in my office, even if they keep quiet.

      4. Doctor Syntax Silver badge

        Re: Telemetry

        I wish I'd remembered earlier Clive of India's comment which seems to sum up what you suggest to be Microsoft's position on this: "I stand astonished at my own moderation"

        1. John PM Chappell

          Re: Telemetry

          Good old Clive, a model of restraint, compassion, and enlightened governance of the masses.

          https://en.wikipedia.org/wiki/Robert_Clive for those ignorant of, or curious about, the details.

    3. ecofeco Silver badge

      Re: Telemetry

      Oh you are just going to love Win 11!

      It makes Win 10 look downright discreet and trustworthy! I've spent almost a year chasing down and shutting off all the goddamn reporting it does. And I'm still not sure I've got everything.

  16. none_of_your_business

    Yes, easy to think nothing much has changed and in some ways you're right that it hasn't.

    But lots of things are way better than they were 10 or so years ago or at least accessible to people without tons of money if they existed back then.

    Tablet computers with touch screens would be the prime thing... they didn't exist as a realistic proposition prior to 2010. Neither, for that matter, were smartphones a big thing.

    Then there's display technology... remember the standard 1366x768 15.6" laptop display? Try looking at one now, they're really rather awful in retrospect. Now 1920x1080 is entry level and a 4K display isn't mega bucks.

    And battery life? 2h30 realistically from a 10 year old laptop even when new... noisy fans, spinning rust hard drives, CCFL backlights sucking the power. Battery tech hasn't improved much, if at all, since then, but you get so much more for so much less power draw these days.

    Software, ok -- some improvements but more like one step forward two steps back (three steps if you're Microsoft :-p )

    1. heyrick Silver badge

      "accessible to people without tons of money if they existed back then."

      This.

      I have 200GB/month on my phone contract, which is about 199.5GB more than I need, but...

      Fifteen years ago, I had a Nokia feature phone running J2ME, so it could manage a very simple Opera. At that time, I think the going price of data was around €0,50 per 10K. Yes, ten kilobytes. Shortly after, my contract offered 500MB/month and, well, it's just kept growing since then.

      So... Fifteen years ago an average 2mpix JPEG (as was common at the time, call it 500KB) would, at €0,50 per 10K, cost about €25 to receive. Now? Now I could watch Netflix endlessly for the entire month and still have some allowance left over.

  17. steveclark

    I loved this article. Thank you. I'm old enough to have lived through all of this. I'm therefore also old enough to have run a beta version of Windows NT on 64 bit architecture; it ran natively on the DEC Alpha chip from day one. So the combination of a 'modern' (well, pre-emptive multitasking) operating system and 64 bit processors was all there in 1993. And blimey it was fast.

    1. Liam Proven (Written by Reg staff) Silver badge

      [Author here]

      > a beta version of Windows NT on 64 bit architecture; it ran natively on the DEC Alpha chip from day one

      I am really glad you liked it.

      However, a small correction: Windows NT on Alpha was 32-bit, from the beginning to the end.

      https://www.theregister.com/2023/05/19/first_64bit_windows/

      There _was_ a 64-bit version internally, used to make sure the codebase was 64-bit clean, using old Alphas because Itanium wasn't shipping yet. But 64-bit Windows on Alpha never shipped.

      1. Jou (Mxyzptlk) Silver badge

        > There _was_ a 64-bit version internally, used to make sure the codebase was 64-bit clean, using old Alphas because Itanium wasn't shipping yet.

        Ah, so you watched Dave Cutler and Raymond Chen visiting Dave's Garage too? :D

    2. 9Rune5

      NT 3.1 ran on two different CPU architectures (x86-32 and MIPS)

      I think Steve is correct in hinting that NT supported DEC Alpha from the first release of NT.

  18. corb

    If Microsoft was selling an NT 4.0 for current hardware, with better looks, and nothing more, I'd probably be using it.

    1. Liam Proven (Written by Reg staff) Silver badge

      [Author here]

      > If Microsoft was selling an NT 4.0 for current hardware

      I wouldn't.

      [1] No USB? No FAT32? No thanks.

      [2] Better *looks?* Who cares?

      A modern Win2K and I will listen.

      1. Anonymous Coward
        Anonymous Coward

        NT4...you want USB/FAT32 no problem..

        Win2K was just the NT4 codebase with even more stuff stuck on top. Mostly UI and networking doohickeys. So if you want to stick the USB/FAT32 support from the Win2K source code into an NT4 build, no problem. Although the USB 1.0 stack shipped in Win2K was an unstable mess, so the USB stack that shipped with XP (NT 5.1) would be a better bet.

        The Win 7 codebase is more NT 5.4 than a 6.1. It's pretty much just NT4/Win2K with tweaks, for all the stuff that matters. As for Win 10 and Win 11: when you look at the APIs used by 99.999% of end user applications, it's really just NT 5.41 / NT 5.42.

        Who'd have thought a failed VMS kernel rewrite originally from DEC would go so far.

        As for backward compatibility: as no one inside MS ever actually understood how the whole codebase worked (at least since the Portable OS/2 days), every new feature was just added to the build. No one dared do any serious surgery, as the NT codebase, especially after NT3.51, was the classic Jenga codebase. I haven't done a deep dive on the more recent source code to check, but all the Win16 stuff was still in the XP codebase. Which told some interesting stories of its own. Like some very big "yes, we rewrote this API" porkies that very senior MS people told at the time.

        There again MS was always a "you know they are lying because their lips are moving" kind of place. Ever since the days the office was beside the burger place on Northup in Bellevue.

        1. Liam Proven (Written by Reg staff) Silver badge

          Re: NT4...you want USB/FAT32 no problem..

          [Author here]

          > Win2K was just the NT4 codebase with even more stuff stuck on top.

          You know, I agree with your overall point, but that one point I have to quibble with.

          NT 4 was a relatively conventional '80s/'90s workstation OS. It needed to know what disk controller it would boot from, and then what disk, and then what partition etc. Do something naughty to it like changing the disk partitions and it would refuse to boot with a BSOD.

          NT 5, AKA W2K, changed all that, by embedding PnP into the core of the OS. It could cope with its disk controller moving IRQs on the fly, or the disks being moved to another device, or the partition being imaged from an EIDE disk onto a SCSI one or something, and if the system firmware could still find it, it would boot.

          (OK, it was far from perfect, but it sometimes worked.)

          It was a radical rewrite in that way. The hardware config wasn't hard-coded into some config file: it discovered it on boot. NT 4: unplug the serial mouse and your mouse pointer disappeared! You needed to navigate the UI by keyboard to add a PS/2 one. W2K: a short pause and it found the new device, or devices, and the pointer started moving again.

          Same for disk controllers, displays, everything.

          This was a game changer. Not only could it cope with stuff moving, and new stuff appearing, it could cope with stuff disappearing or being turned off. That meant power management suddenly worked. The firmware could power stuff off, or slow it down, and the OS kept working unperturbed.

          In real terms, NT 3.x was the first release, NT 4.x was v2, with a much needed UI overhaul and the deeply misguided move of putting GDI inside the kernel, and NT 5 was v3, which made it all work nicely on modern PC hardware and dropped all the 20th century RISC kit.

          Which is why I picked it. It was a classic. With modern drivers and some modern Internet software, it did all I needed.

          1. Anonymous Coward
            Anonymous Coward

            Re: NT4...you want USB/FAT32 no problem..PnP and other stuff

            I'd have to check in the source code, NT4 v W2K, but from what I remember PnP was purely a BIOS / WDM device-driver thing, not OS. Now the OS could go through a different power-on bootstrap sequence knowing the BIOS had some new features, but Win2K would fall back to the NT4 bootstrap sequence if running on an old version of the BIOS.

            And when driver open / init was called it expected WDM format, but NT4 drivers it just ran as legacy, and you did not get all the new WDM features. From what I remember, power management for laptops was the big one touted at the time. I remember the WDM DDK was a total mess. I never wrote any WDM drivers, just VxD and NT (plus MacOS and Unix drivers), so the exact details are fuzzy. I know that due to WDM "features", hibernation / wake up in W2K/XP and later was something never to be fully trusted. Especially with block drivers. It is something I have always turned off on every new Win32 OS install.

            As for versioning: Portable OS/2 was the first two releases, from what I remember. Don't think it was ever officially released, but I did see it once running on a machine at a third party ISV to check OS/2 compatibility. The same team was also doing the Win 3.0 port. Now there was a nightmare story.

            The only version of NT I totally trusted and was bulletproof was NT3.51. It only crashed when you did something very very stupid and tried very hard. Not QNX bulletproof, but close. But once MS put video drivers etc inside Ring 0 with NT4 and after, it crashed just as often as every other version of Windows. Because the tech leader of the team responsible for optimizing video driver support in NT4 never read the relevant Intel Pentium hardware manuals carefully. "I can't work out how to get the kernel in Ring 0 to talk to code running in other Rings and do DMA block moves etc efficiently, so let's just stick everything in Ring 0." What could possibly go wrong...

            Well, when a driver crashed it now often brought the whole kernel down, rather than just a silent restart like it did in NT3.51. Blame Jim Allchin. He was the one who signed off on this.

            I remember when I did the first test install of Win2K in the summer of 2000, to see if it could replace the NT4 I was using on dev machines at the time for Win32 dev. One guy on the team was crowing about the fact that his Win2K machine had not crashed once since he installed it months before. Told him: you know it now has silent reboot on BSOD by default. Turn off silent reboot and get back to me. Of course his machine had been crashing regularly, as he found when he actually dug out the log files.

            A tradition which MS OS's continue to this day. Crashing regularly and doing silent restarts.

  19. martinusher Silver badge

    Then what happened?

    In a word, "Marketing".

    Both NT and Doom functioned and were fit for purpose. Doom itself could be, and was, improved with later versions and programs like Quake, but the game ran on just about any hardware and was relatively easy to modify, and so grew a huge base of add-on worlds created by people from all over the world. NT was a more than adequate business and development platform.

    The cracks started appearing with things like MS's own browser and their "ActiveX" technologies. IE wasn't a significant improvement over other browsers but it brought in notions like browser lock-in for keeping users captive, 'push' technology to send unsolicited information to users (a precursor to the modern practice of renting out people's desktops for advertising) and what's now called analytics -- de facto spyware. ActiveX itself was designed as a tool to promote lock-in but wasn't popular because it effectively downloaded and executed native code on a system, which made huge assumptions about the target systems and opened up all sorts of security vulnerabilities. Other companies followed the lead, all trying to dominate by user capture, and the inevitable security problems and the overall unreliability and inconvenience they caused resulted in a sort of user/provider arms race. Sure, the graphics got better but all this extra capability was wasted coping with all the tricks marketers use to capture and hold our attention. Their garbage expanded to fill the processing power and network bandwidth available -- but it never was, and still isn't, enough.

    When you run 'native' programs on a PC -- that's using the PC as just a piece of hardware -- you realize just how bad things are, just how powerful our systems are. We don't use them to compute (unless we're laying out FPGAs or something like that); it's all just waste. The problem we've got now is that entire national economies seem dependent on this, so we can't just stop because of the severe social and economic dislocation it would cause. (Worth a shot, maybe.....)

    1. david 12 Silver badge

      Re: Then what happened?

      "IE wasn't a significant improvement over other browsers"

      The rest of the world disagreed with you.

      I moved to FF because MS had a vision of an active desktop, where the browser tabs were desktop tabs. And the world then adopted the FF version of HTML. But I, like the rest of the world, could only move away from IE because I was using IE. If I hadn't been using IE, I couldn't have dropped it.

      And I was using IE 6, 5, 4, 3 because they were significant improvements over other browsers (including over each previous version).

  20. Rattus
    Megaphone

    Could it be...

    So you rightly point out that there hasn't been any big ground shaking change in the computing industry. Why is that? Is this really a surprise?

    For my penny's worth, I think it is this:

    Computers are now able to do the job we want them to do.

    Until the late 90's most people upgraded their work PCs every few months, because they were only just about fast enough to do the job we wanted of them in the office - Word processing, data entry (and reporting), and counting beans in a spreadsheet.

    By the early 2000's there really wasn't a need to upgrade the desktop any more - it pretty much did everything we needed it to do (note I say "needed" not "wanted").

    The next boom came with laptops, and again these are largely good enough now (although better battery life would still be top of my wish list)

    Then came tablets, as a stop-gap until mobile phones did everything people at home wanted.

    Now the mobile phone has reached a similar plateau, it does what we need it to do, anything else is just "messing around the edges".

    Games sorta push hardware still, but even that has slowed. The content of the game / the idea is what I am interested in not ever more realistic graphics - perhaps that's why retro gaming is such a big thing. And for sure, there are the hyper-scalers, cloud computes and big AI use cases, but they are the domain of a few big businesses (ok trying to sell to us mere mortals as SaaS).

    Most of today's computing "innovation" is speculative solutions looking for a problem to solve. The mass market is no longer driving the pace of change, because [most] of its needs have been met...

    /Rattus

    1. Yankee Doodle Doofus Bronze badge

      Re: Could it be...

      I agree with most of what you said, but...

      "Until the late 90's most people upgraded their work PCs every few months"

      Where did you spend the 1990's? It sounds like an interesting place.

      1. Zolko Silver badge

        Re: Could it be...

        With add-on cards that might be true. Like a new SoundBlaster, a new DVD drive, some new RAM, a new HD... not upgrading the entire PC but bits of it.

      2. Rattus

        Re: Could it be...

        Upgrade != Replace

  21. Howard Sway Silver badge

    Innovation happens in other areas beside the OS though

    Look at the increasing variety of computing devices and you'll see a lot more change. In the last 30 years, we've seen the introduction of portable computers (laptops), ultra-portables (smartphones), even wearable computers. Top prediction for the next decade? Edible computers.

    1. CGBS

      Re: Innovation happens in other areas beside the OS though

      But will the chips be sea salt and vinegar, sour cream and onion, or, bleh, BBQ?

  22. Jou (Mxyzptlk) Silver badge

    The NT compatibility is amazing.

    If you have a pure Windows NT 3.5(1) program, it may just work on Windows 11. How do I know?

    The Windows calculator had that annoying behaviour of displaying "e-X" instead of just the number. Try it with 1416/2948234. There is no way around it, so I checked every calc.exe going back until I found one which displays 0.000480278etc instead of that annoying e-X notation. It is calc.exe from Windows NT 3.51 (running happily in Hyper-V, including TCP/IP networking). So I copied it over to Windows 11. It still shows the right icon. I double-click it. It works.

    Calc.exe from Windows NT 3.51 (German and English) works on Windows 11. Others are cardfile, freecell, sol, clock, winmine, and the screensavers (like logon.scr, which weirds some people out when they see it on Windows 11). Terminal and telnet seem to work, but behave weirdly.

    What else "just works"? Pbrush.exe. You can run the old original paintbrush from NT 3.51 on Windows 11 64 bit. ntbackup.exe works too, though it complains about a missing tape drive and I don't dare to test it.

    1. yetanotheraoc Silver badge

      Re: The NT compatibility is amazing.

      I think cardfile was only ever 16-bit. How is it working on Windows 11?

      1. Jou (Mxyzptlk) Silver badge

        Re: The NT compatibility is amazing.

        > I think cardfile was only ever 16-bit.

        That is why I specifically mentioned Windows NT 3.51 as source :D, which is true 32 bit.

        Just tested: Seems to work well! Added several cards. Could add a pic drawn in Windows 11 Paint via select-copy-paste into that old program. Long file names (albeit below the 250-character limit) work. It is even Unicode capable! I just copied some Japanese, Russian and Korean characters and they show up correctly, even after save-close-reopen. But it crashed when I tried to print :D.

        Event 1000, Application Error, cardfile.exe 3.51.1016.1, gdi32full.dll 10.0.22000.2538, exception c0000005 at offset 000750c7.

    2. Liam Proven (Written by Reg staff) Silver badge

      Re: The NT compatibility is amazing.

      [Author here]

      > If you have a pure Windows NT 3.5(1) program, it may just work on Windows 11.

      It goes back further.

      The Freecell game included in Win32s for Windows 3.1 still works on Win11 today. The modern Freecell is a bloated mess but the 30 year old one still works fine.

      1. Jou (Mxyzptlk) Silver badge

        Re: The NT compatibility is amazing.

        > The Freecell game included in Win32s for Windows 3.1 still works on Win11 today.

        But was Win32s not added to Win 3.11 AFTER Windows NT 3.5 was available? AFAIK Win32s came after Windows 95 release, so the date may be newer. My (German) freecell says: 1995-06-03, version 3.51.1016.1.

        --- off downloading the NT 3.1 ISO from winworldpc, unpacking and using expand.exe ---

        Freecell from Windows NT 3.1 (German) works too; it is from 1993-08-13, version 3.10.497.1.

        Cannot unpack the first beta from 1991 yet, but it does not have Freecell anyway. I do see sol.exe, winmine.exe and so on in the non-conforming .ISO.

        I would have Win 3.1 as a VM, but DOS and Win9x don't like Hyper-V. I have the last "does not need virtualization CPU" version of Virtual PC in a Windows 7 VM with Windows 3.1, but only for testing, and it is packed away somewhere. But currently I have no need or interest/drive to invest the time. Maybe later :D. But maybe you have a working Win9x or Win3.x Hyper-V VM?

        1. Jou (Mxyzptlk) Silver badge

          Re: The NT compatibility is amazing.

          > Cannot unpack the first NT beta from 1991

          Unpacked now. Funny, I still had a working DVD-RW disk around for the USB DVD/BR drive; I cannot remember the last time I burned a CD.

          And: Nope, they don't work in Windows 11; those .exe are probably 16 bit or some weird in-between format.
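
          For anyone who wants to check rather than guess: the executable type is visible in the first few header bytes. A minimal C++ sketch of a checker (the MZ/NE/PE/LE signatures are standard; everything else here, including the lack of real error handling and the raw little-endian 4-byte read, is purely illustrative):

            // exetype.cpp - guess the format of an old Windows/DOS executable.
            // MZ only = plain DOS, NE = 16-bit Windows, PE = 32/64-bit Windows,
            // LE/LX = VxD / OS/2: the "weird in-between" formats.
            #include <cstdint>
            #include <fstream>
            #include <iostream>

            int main(int argc, char** argv) {
                if (argc < 2) { std::cerr << "usage: exetype <file>\n"; return 1; }
                std::ifstream f(argv[1], std::ios::binary);
                char mz[2] = {};
                f.read(mz, 2);
                if (!f || mz[0] != 'M' || mz[1] != 'Z') {
                    std::cout << "not a DOS/Windows executable\n";
                    return 0;
                }
                std::uint32_t e_lfanew = 0;  // offset of the "new" header, if any
                f.seekg(0x3C);               // fixed location in the MZ header
                f.read(reinterpret_cast<char*>(&e_lfanew), 4);
                char sig[2] = {};
                f.seekg(e_lfanew);
                f.read(sig, 2);
                if (!f)                              std::cout << "plain DOS (MZ only)\n";
                else if (sig[0]=='P' && sig[1]=='E') std::cout << "PE: 32/64-bit Windows\n";
                else if (sig[0]=='N' && sig[1]=='E') std::cout << "NE: 16-bit Windows\n";
                else if (sig[0]=='L')                std::cout << "LE/LX: VxD or OS/2\n";
                else                                 std::cout << "plain DOS (MZ only)\n";
            }

          64-bit Windows 11 dropped the 16-bit NTVDM/WOW layer entirely, so anything reporting NE (or an unrecognised pre-release format) has no chance of running there.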

  23. Abominator

    I blame Chrome, and in particular Electron, for much that has gone wrong in the last decade.

    20 years ago, I had a fully functional and free massager app that was quick, worked reliably, took up a small amount of screen space and ran in 10MB

    Today we have Teams and similar bloated products that need half a 4k, 27 inch screen and consume north of 1GB for worse functionality and user experience.

    Web 2.0 looking back was a fucking disaster that made everyone want to embed the web in native apps and ultimately try and replace native apps.

    1. Doctor Syntax Silver badge
      Gimp

      "free massager app"

      Tell us more.

    2. ecofeco Silver badge
      Angel

      Did it also give you a happy ending?

      Jokes aside, I got what you meant. Cheers.

    3. Kurgan

      About Teams..

      I have never used it. A friend of mine has added me to their org Teams. I'm running Linux, so I have accessed it via web. I downloaded more than 80 MB of "stuff" just for it to be able to show me a message that read "test". This is our much valued "progress".

    4. 9Rune5

      I do not recall many messenger apps, 20 years ago, that would let you gather hundreds of virtual participants all streaming video and audio.

  24. Anonymous Coward
    Anonymous Coward

    Disagree about the timing when the "revolutionary" changed to "evolutionary" - it wasn't 1995. The difference between 1995 dialup and 2005 broadband was absolutely revolutionary. The processing power explosion still continued on Moore's law path, especially the newly arrived GPUs, though the slow (but still doubling in size on a regular cadence) hard drives made for diminishing returns until around 2008-2010 when the SSDs made for another performance revolution.

    Actually *using* a 1995 PC with a 500 megabyte hard drive, a CD-ROM drive that could maybe - maybe - play postcard size pixelated video, and a separate dialup modem that could be used to download perhaps one GIF per minute at great cost and effort, was a completely different paradigm compared to the always-online 2005 PC with maybe 500 *gigabytes* of MP3s, movies, and, ahem, other forms of entertainment material, with much of the world almost magically discoverable via Google when it was still *good* and apps (pthui!) had not yet taken over.

    The difference from 2005 to 2023 seems much smaller in comparison in most respects. It's just the sluggish startup times that are lost in time (fortunately).

    1. The commentard formerly known as Mister_C Silver badge

      Probably '95 or '96

      I left my PC downloading the newest Netscape while I went and watched the X files, safe in the knowledge that I wouldn't be disturbed by phone calls coz the line was in use.

      Sooo many ways that can't happen today...

      1. Doctor Syntax Silver badge

        Re: Probably '95 or '96

        You were lucky. My experience was that there'd be a call-waiting bleep that took out the link. And it was always a double glazing company. I think there was a collective sigh from the whole of the Huddersfield area when we heard those bastards had - deservedly - gone down the tubes.

      2. CGBS

        Re: Probably '95 or '96

        Trust no one. I had not planned on that being the guiding principle of life. Yet here we all are, beggars to our own demise.

  25. Boris the Cockroach Silver badge
    Windows

    Its time

    to take today's model of 'computing' and shoot it... kindest thing for it really; after all, it's old and has lived beyond its years.

    Radical.... possibly, but its needed.

    I've been in computering since the ZX81. Today's computers are light years ahead in comparison but, as has been pointed out, are used for web browsing, communicating (email, Teams etc) and work (MS Office). There is nothing I can do on the laptops at work that I couldn't do on the work PC 20 years ago, and my reward is a slightly slower boot time and an OS that needs huge processing power / memory / telemetry.

    It's almost as if Intel et al came up with ever faster hardware, and then the OS manufacturer sat down and said "How can we make the boot/operating speed slower than before?" (In my line of business, it's all about making stuff faster: if a supplier gave us a drill that could make a hole 4 seconds quicker, we'd snatch it off him and use it to make 40,000 holes faster than we could before.)

    And that's why the current model needs shooting. We need to go back to basics with system design and say "What do we need the OS to do?", and then make the OS do that as fast as it can, so that you're not sitting there watching a spinny circle going 'updating' or, worse yet, having the desktop come up but the computer remain unusable because for some reason it's thrashing the HDD to death for 30 mins (yeah, I'm looking at you, Windows 10). Anyway... what's wrong with having a shadow copy of the OS on the SSD, updating that and then just switching? (And then updating the original OS ready for the next switch.)

    As for having a 96 core processor and applications slowing down because they're stuck using 1 thread on one core: you need to take the programmer out and have them shot too. Multi-threading has been around for ages (I should know... part of my degree course was in making multi-threaded applications and how to avoid races/deadlocks/synchronisation issues).
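
    (For what it's worth, the basic pattern has been in the C++ standard library since 2011. A minimal sketch with a made-up workload, nothing more: split the data across however many cores the machine reports, and give each thread its own slice and its own output slot, so there are no locks and nothing to race on.)

      #include <algorithm>
      #include <cstddef>
      #include <iostream>
      #include <numeric>
      #include <thread>
      #include <vector>

      int main() {
          const std::size_t n = 100'000'000;
          std::vector<double> data(n, 1.0);        // stand-in workload

          // One worker per core; hardware_concurrency() may return 0, hence the max().
          const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
          std::vector<double> partial(cores, 0.0); // one output slot per thread
          std::vector<std::thread> workers;

          const std::size_t chunk = n / cores;
          for (unsigned i = 0; i < cores; ++i) {
              const std::size_t begin = i * chunk;
              const std::size_t end = (i + 1 == cores) ? n : begin + chunk;
              // Each thread reads only its own slice and writes only its own
              // slot: no shared mutable state, so no synchronisation needed.
              workers.emplace_back([&, i, begin, end] {
                  partial[i] = std::accumulate(data.begin() + begin,
                                               data.begin() + end, 0.0);
              });
          }
          for (auto& w : workers) w.join();

          std::cout << std::accumulate(partial.begin(), partial.end(), 0.0) << '\n';
      }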

    And as a final note, I remember Elite... a galaxy in 22K, then finding out how they did it using one number and an algorithm (it's bloody obvious to us programmers of that era... except we never thought of it).
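
    (The principle, for anyone who missed that era: don't store the galaxy, regenerate it on demand from the seed. The sketch below is not Elite's actual algorithm - that used a specific bit-twiddled sequence - just an illustration of the idea with made-up syllables and stats.)

      #include <cstdint>
      #include <iostream>
      #include <random>
      #include <string>

      struct StarSystem {
          std::string name;
          int economy;     // 0..7
          int tech_level;  // 1..15
      };

      // Same seed + index always produces the same system, so the whole
      // galaxy "exists" in a few bytes: the seed and this function.
      StarSystem generate(std::uint32_t galaxy_seed, std::uint32_t index) {
          std::mt19937 rng(galaxy_seed ^ (index * 2654435761u));  // cheap hash mix
          static const char* syl[] = {"la", "ve", "ti", "en", "ra", "so", "di", "qu"};
          std::uniform_int_distribution<int> pick(0, 7);
          std::string name;
          for (int i = 0; i < 3; ++i) name += syl[pick(rng)];
          return {name, pick(rng), std::uniform_int_distribution<int>(1, 15)(rng)};
      }

      int main() {
          // Regenerate the first five systems of "galaxy 0x5A4A" on demand.
          for (std::uint32_t i = 0; i < 5; ++i) {
              const StarSystem s = generate(0x5A4A, i);
              std::cout << s.name << "  econ=" << s.economy
                        << "  tech=" << s.tech_level << '\n';
          }
      }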

    1. Jou (Mxyzptlk) Silver badge

      Re: Its time

      > "How can we make the boot/operating speed slower than before?"

      Well, the ACTUAL OS is booting faster and has got more efficient in using the resources. Then the UI guys come along and ruin all the advancements. Especially noticeable with Windows 11, even with the widgets feature completely removed. When it comes to number crunching the performance is still the same, in some cases even a bit faster on Windows 11. But the UI is sluggish to the extent that using "CMD.exe" + "dir c:\*filename* /b /s /a" is faster than the Explorer search with its "intelligent index to speed up the search".

      Edit: Not to mention that the dir command does find a newly created file, whereas Explorer search often does not until the search index is updated.
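
      The same brute-force walk is only a few lines in portable C++17, for anyone who wants it outside CMD; a sketch only, using a plain substring match rather than dir's full wildcard syntax:

        #include <filesystem>
        #include <iostream>
        #include <string>

        namespace fs = std::filesystem;

        int main(int argc, char** argv) {
            if (argc < 3) { std::cerr << "usage: findfile <root> <substring>\n"; return 1; }
            const std::string needle = argv[2];
            // Walk the live file system, like `dir /b /s /a`: nothing here can go
            // stale, so a file created a second ago is found immediately.
            for (auto it = fs::recursive_directory_iterator(
                     argv[1], fs::directory_options::skip_permission_denied);
                 it != fs::recursive_directory_iterator(); ++it) {
                if (it->path().filename().string().find(needle) != std::string::npos)
                    std::cout << it->path().string() << '\n';
            }
        }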

  26. Bump in the night
    Holmes

    " Windows 95 modernized the desktop interface, and most still work like it, despite Microsoft's concerted efforts"

    Ha, good one. I would add that Apple is still the leader in a consistent interface, but isn't getting much better at it either.

  27. bertkaye

    still stuck with primitive navigation and data organization tools

    The Windows operating system user interface has been prettified but the functionality is still low end crudity. A proper operating system UI would give users great power to set up customized views of file data, but even Win11 is still primitive in that regard. The VR world is developing better ways to see data.

  28. Zolko Silver badge

    like in motocross

    I remember that when I was younger I read a lot of motocross magazines, and at each year's end they tested the new kit for the next year, and each time they said that the new machines blew the old ones away. Then they decided to run a test between the current best machine and its predecessor from 10 years earlier... and they found that they were almost identical: a 125cm3 liquid-cooled 2-stroke can't evolve that much once it has disc brakes.

  29. Doctor Syntax Silver badge

    You can imagine the editor tearing a strip off some young reporter for having the temerity to do that. What did he think they were going to do for next year's end-of-year article?

  30. Old Man Ted

    I'm Getting Too Old

    Just looked into my desk drawer and spied 3 boxes of disks: DR-DOS 1 to 5 plus DR-DOS installation & utilities, MS-DOS installation disks, Windows 3 disks, Windows 3.11 disks, Windows 95 disks, Microsoft Works disks, 14 MS Windows disks, and a Windows 95 Plus CD. There is a whole pile of Novell disks, assorted 5" floppies, and 2 USB floppy drives which are in a far corner. An assortment of CDs for Novell and NT.

    I will be 80 soon and my little boy is in his 50's.

    I use a DOS box to play all the old games he kindly left for me when he completed his 2nd BSc degree.

    There has been no improvement in the service from MS, whilst Apple is able to read and use the 1916/7 programs on an M2 unit.

    The sooner I learn to use the newest mobile the happier I will be. Till then I am happy with my speaker phone, which is loud enough for me to hear without my hearing aids.

    I regret losing the old units which were stored in my garage, which was flooded in the last cyclone a few years back.

  31. I miss PL/1

    Literally what I have been saying for years

    The last line of the piece is a direct quote of what I have been saying for years. Windows is 120 times bigger than it was 30 years ago, and yet where are the 120-times gains?

  32. sketharaman

    Great post! I've often thought about this topic myself. I've been using Windows, Word and Excel since ca. 1993. I haven't seen any sensational improvement in their functionality in the last 30 years. I once asked people on Twitter how many new features they could name in Excel from the last 30 years. Followers cited availability on mobile devices and other areas of improvement that I wouldn't call functional enhancements. Then there were some people who pointed to live links to external data sources, but I used that feature to import inventory data from my company's mainframe in ca. 1994, so it's not new to me. If anything, I find it takes a few more steps to plot charts in Excel today than it did in 1993. For example, if there are two datasets, it's obvious that, in a vast majority of cases, one is X-axis data and the other is Y-axis data. The default line chart in Excel 1993 plotted them that way, but the default line chart in Excel 2023 does not have an X-axis at all and plots both datasets on the Y-axis, and I have to jump through a few hoops to change it to an X-Y chart.

    1. keithpeter Silver badge
      Windows

      @sketharaman

      "I haven't seen any sensational improvement in their functionality in the last 30 years."

      Perhaps not a sensational improvement, but you can have more than 256 columns and 16384 rows in Excel these days. Handy for quick and dirty Monte Carlo simulations.
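
      For anyone who hasn't met the technique: a Monte Carlo simulation just draws a large number of random samples and averages them, so more rows means more samples. The classic toy version (estimating pi) in C++ rather than a spreadsheet; the seed and sample count are arbitrary, though 1,048,576 happens to be exactly one modern Excel column's worth of rows:

        #include <iostream>
        #include <random>

        int main() {
            std::mt19937 rng(42);                    // fixed seed, reproducible run
            std::uniform_real_distribution<double> u(0.0, 1.0);

            const int n = 1'048'576;                 // one Excel column of samples
            int inside = 0;
            for (int i = 0; i < n; ++i) {
                const double x = u(rng), y = u(rng);
                if (x * x + y * y <= 1.0) ++inside;  // landed inside the quarter circle
            }
            // Area of quarter circle / area of unit square = pi/4.
            std::cout << "pi ~ " << 4.0 * inside / n << '\n';
        }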

  33. CGBS

    Change has been enormous

    I don't know, I think we have come quite a long way. We use water and energy on a par with entire nations just to steal and sell people's data. The creation of that sort of enterprise takes a lot of effort. "Read the EULA or ToS", the ones complicit in the scheme always like to scream. As if any of it is actually a real choice. You can always choose not to, and be unemployed on the street, after all.

    At least friends don't let friends VIM. There has to be something better at this point yeah?

  34. jaffy2

    So much change, and now more security

    The joys of creating an IPX/SPX boot disk with Doom, ready for lunchtime. The game spread around the local government office I worked in at the time - one day the traffic saturated the link between two buildings! The network manager at the time deployed a shadow network of coax cable in one room. I remember showing up and connecting that pop-out PCMCIA card to the BNC connector. Felt like a brave new world and the ultimate camaraderie.

    NT came along replacing WFWG and revolutionised how our network was secured, along with the bootstrap! Logging in with a ctrl + alt + delete and moving to Active Directory feels normal now, but was odd when we were used to a mega login script. Great times, generating passion for technology in the workplace in the 90s.

  35. the future is back!

    Gestures coming up - at last...

    Apple's 2024 rollout of its #VisionPro headset will bring with it a true gesture GUI. I mean there is no keyboard/mouse/pad in the "room" there. Voice and gestures will be it. About time.

  36. anonymous boring coward Silver badge

    "an analog TV set for a display was all that most home users could afford"

    Any computer monitor of that era would also have been analogue, BTW.

  37. Jumbotron64

    What about NeXT ?

    Sorry if this shows up twice (first attempt seemed to fail).

    But what about Steve Jobs and NeXT computers? DOOM was developed on NeXT workstations, and the first app store was developed on NeXT workstations and demoed to Steve Jobs himself in 1993.

    Via Wikipedia...

    " The NeXT platform was used by Jesse Tayler at Paget Press to develop the first electronic app store, called the Electronic AppWrapper, in the early 1990s. Issue #3 was first demonstrated to Steve Jobs at NeXTWorld Expo 1993.

    Pioneering PC games Doom, Doom II, and Quake (with respective level editors) were developed by id Software on NeXT machines. Doom engine games such as Heretic, Hexen, and Strife were also developed on NeXT hardware using id's tools. "

    And to add to the notion that nothing much has changed since 1993, at least on the Apple side of things (to compare and contrast with the Windows side of the aisle), there is the fact that the NeXT workstations morphed into OSX, now called MacOS. Except for no small amount of candy-coated spit and polish, the MacOS experience is a shiny happy version of using a NeXT workstation.

    1. Anonymous Coward
      Anonymous Coward

      Re: What about NeXT ? eh, no

      They were shipping DOS / MacOS games, and the NeXT Cube / pizzaboxes were very 68k. And even though MacOS was 68k, there was no earthly reason why you would develop on a NeXT box with no MacOS API support, dodgy n.x NeXTSTEP builds and flaky tools when you could get deeply discounted dev Macs with the same specs from MacDTS. With great stable dev tools available. Or just build your own x86 box for the price of one of the original NeXT cube optical disks.

      Maybe a few demos, level editors etc were thrown together in Interface Builder to impress investors, publishers etc. But apart from simple stuff you really don't want to try to build anything complex in the IB swamp.

      Even though I was not able to blag a ticket for the launch in 1988 (that was a funny phone call), you just never saw the NeXT boxes in the wild unless at a trade show, in a big university like UC Berkeley or Stanford, or at some developer who NeXT were desperate enough to pay big $ to port their product to NeXTSTEP. Not that it made any difference. If it had not been for Amelio's massive lapse of judgement, NeXTSTEP would be just a minor historical footnote, considering just how few seats they actually shipped in seven years. 70K tops. Boy, Fujitsu lost a lot of money on the bad deal they signed. Even GO's PenPoint OS shipped more copies to end users than NeXTSTEP. That bad.

      MacOS X is nothing like using NeXTSTEP 3.3, the last version. MacOS X, both system software and UI, was always a car-crash. The NeXTSTEP 2.x / 3.x UI was pretty nice to use. But dig too deep and bad things happened, due to OS bugs. At least it was a lot nicer to use than anything MS shipped before Win7. And a lot nicer than every version of MacOS X so far. Some of the 3.3 NS frameworks were fairly well laid out. But others almost reached TaligentOS device driver levels of deranged insanity. Which shows up later on in MacOS X with, for example, five different process APIs. Three of which eventually map onto pthreads. So yeah, as big a mess internally as Win32.

      1. Jumbotron64

        Re: What about NeXT ? eh, no

        Uhh...you went a little too far down the rabbit hole there. All I meant was that along the same lines as how Windows NT in 1993 is not significantly different from Windows of today, there is a rough analog in the Mac World as well. I used a NeXT workstation in the early 90's and when OSX came out shortly after Saint Jobs returned as iMessiah of Apple I immediately thought to myself....yeah....I've seen this before.

        1. Anonymous Coward
          Anonymous Coward

          Re: What about NeXT ? eh, no..not at all

          You can (pretty much) use the same APIs you used in NT in 1993 with Windows 10/11, due to the code all still being in there, but the UI and user experience are a totally different universe. MacOS X effectively killed all Carbon compatibility with MacOS X 10.8. The NeXTies wanted it dead. After that your legacy MacOS Classic/Carbon applications may or may not work. My copy of CodeWarrior for Win 98/NT4 still runs on Win 10. It's installed on this laptop. My copy of CodeWarrior for MacOS 8 stopped working a long time ago on MacOS X. Kind of important when you need to fire up old dev project codebases from long ago. So you have to keep around old PowerMacs or PowerBooks to look at old code / applications for MacOS X.

          I did commercial end user application specs for NeXT 1/2 projects (abandoned as noncommercial) and shipped MacOS X applications (plus shipping MacOS software since 1984), and Rhapsody and later were a very different UI beast from NeXTSTEP. Very different. NeXTSTEP / OpenSTEP were very much in the Motif/CUA universe tradition, whereas Rhapsody and later were some weird bastard child where, despite decades of using GUIs (dozens), one kept running into WTF-what-were-they-thinking-here moments. To those of us who actually designed and shipped UIs for big complex end user applications. Which kind of fitted in with the ultimate WTF language, Objective C.

          Oddly enough, the more recent releases of MacOS X have back-tracked on most of the horrible Rhapsody-and-later WTF UI crap in the previous MacOS X versions' UI, and for the first time it's actually not horrible to use. I remember the first time I powered up MacOS 11 on my dev laptop: wow, it doesn't totally suck with default UI settings. And 12 was even better. Whereas whenever I fired up NeXT boxes back in the early 1990's it was all very familiar and easy to use for anyone familiar with both MacOS Classic and the various Motif/X Windows GUIs kicking around at the time.

          So no "rabbit hole" here. For those of us in the trenches shipping products for many decades. And having to spend and lot of time and effort designing and implementing complex easy to learn / navigate UI's for shrink-wrap applications.

  38. This post has been deleted by its author

  39. ldo

    Microsoft still seems to believe that “26 drive letters ought to be enough for anybody”.

    “Windows NT redefined PC operating systems”, indeed ...

    1. Jou (Mxyzptlk) Silver badge

      Since Windows 2000 you can mount a drive into a directory too, just use the GUI.

      Whether that is possible with Windows NT 3.51 and/or 4.0 I cannot say; at least the option is not exposed in the GUI and there is no diskpart.exe or mountvol.exe.
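
      On the versions that do support it, the GUI is a thin wrapper over a single Win32 call. A minimal sketch - the folder path and the volume GUID below are placeholders (list real volume names with mountvol), and it needs an elevated prompt:

        // Graft a volume onto an empty NTFS folder instead of using up
        // one of the 26 drive letters. Requires administrator rights.
        #include <windows.h>
        #include <iostream>

        int main() {
            // Both strings must end with a backslash, per the API contract.
            const wchar_t* folder = L"C:\\Mount\\Data\\";  // existing empty dir on NTFS
            const wchar_t* volume =
                L"\\\\?\\Volume{00000000-0000-0000-0000-000000000000}\\";  // placeholder

            if (SetVolumeMountPointW(folder, volume))
                std::wcout << L"mounted\n";
            else
                std::wcout << L"failed, error " << GetLastError() << L'\n';
        }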

      1. ldo

        26 Drive Letters

        I don’t think that works for network volumes or hot-pluggable drives. You still need drive letters for those.

  40. TimRyan

    And then came optical connectivity...

    This article is spot on except for one critical trend. In '73 we had the beginnings of Ethernet, which escalated from a shared 2Mb multi-user tapped cable and by the mid eighties was pushing 10 to 100 Mb switched connections. By the mid 90's, 1G switched Ethernet began to crawl out of the labs with a 100 meter reach.

    The pivot point was when the signal stream became optical instead of electrical. Network reach became massively extended in buildings with multi-mode fiber, and to tens and hundreds of kilometers with single mode, and network capacities grew in tenfold jumps in the late 90's and early 00's. The next roll will be optical processing embedded on the substrate, connected to Pb backbones.

  41. Roo
    Windows

    Workstations, remember them?

    Fair enough: from the point of view of the PC, innovation may have stopped in the mid 90s, but from the PoV of folks using workstations, mid-90s PCs were copying machines that had been sitting on their desks in the mid-80s. That said, it's pretty cool how the bandwidth and capacity of PCs have ramped up - no complaints from me about that (and kudos for only being ~8 years behind workstations for 64 bit CPUs).

  42. mpi Silver badge

    > Since then… well, what big advances can you name?

    How about the fact that I can ask a computer to create an oil painting of a really classy penguin wearing a top hat, or instruct it to write a sea shanty about a sad C++ programmer, and it will do that? Or feed it a 4 page article and ask it to summarize that in a list of bulletpoints, but in JSON format? Or that I can use that same system to rubber-duck my code with, while writing the code, via a direct binding in my editor? And the fact that I can do that on consumer hardware that is sitting on my desk and cost less than a month's pay to procure, using code and data I can freely download from the internet?

    I'd call that a pretty revolutionary advancement.

  43. Anonymous Coward
    Holmes

    More of the same ..

    The tech of three decades ago .. how much has changed: it's how fast it was, and is, changing.

    The devices have gotten smaller... but not any more secure. An email attachment or clicking on a malicious URL is still dangerous.

  44. AbortRetryFail

    "We tried so hard and got so far, but in the end, it doesn't even matter"

    (To misquote Linkin Park)

  45. Anonymous Coward
    Anonymous Coward

    You are missing Linux (unless you are talking about games in the PC industry only). Even this site is not running on an NT derivative, let alone phones and smart watches.

    Virtualization brought to the masses - Doom can run even on a refrigerator.

    Containers

    Cloud

    How about SSDs - that is huge.

    How about LCD screens 4K (good luck with your crappy CRT or low res TFT back then).

    How about 10Gbit ethernet at home - compared to token ring and 10 Mb coaxes.

    Heck we have 1 Gbit internet now - hence streaming everything

    10-15-20 TB hard disks (not MB).

    See how far USB sticks and SD card flash drives have come.

    Photo cameras are dead.

    Video calls everywhere

    Thanks to all - we work from home.

    Nobody needs newspapers, cinemas, even paper books.

    I don't have a land line anymore.

    I have internet in my car and I watch movies there, and I don't go to gas stations anymore (I have a gas station at home). My car computer has more power and storage than a data centre had in 1993.

    Software defined networks. Software defined cars.

    Crypto

    AI

    If you think only about Microsoft, yes indeed we are nowhere - even Notepad is broken on W11.

    1. Mimisss

      But..., but... it has a status bar to show the current row and column now!

      1. Jou (Mxyzptlk) Silver badge

        You forgot to mention tabs and Unicode, and that it can handle CR/CRLF/LF correctly now and convert between them. And Paint can do layers now! But my pet bug about broken snapshot/shadowcopy access for local drives is not yet fixed in Win 11. I hope for Q1 2024.

  46. Roo
    Coat

    UNIX as a cut down MULTICS...

    As you allude to in the article, the approaches and goals of the teams were polar opposites: case in point, the MULTICS team designed an OS and then had some hardware built to support it, whereas the UNIX team scrounged up some low-end hardware and cobbled together some tools & an OS to run on it. :)

    Therefore I don't think it can honestly be claimed that UNIX is a cut-down MULTICS. IMO it would be more correct (and fairer) to say that UNIX was a text-processing orientated OS that borrowed some concepts from MULTICS (see https://multicians.org for a fabulous MULTICS resource). I do agree that UNIXen have accumulated a lot of bloat + warts over the years, pushing way beyond the boundaries of good taste and good design, but I'd argue that's a (perverse) consequence of a sound bunch of high-level abstractions being chosen in the first place (ie: very good design). At the end of the day I have been able to get work done in pretty much every application domain I've tackled using a UNIX-like OS (if not an actual UNIX) - some of which predated me attending secondary school - and I'm very grateful that I haven't had to relearn basic stuff like how to construct a path to a file every time I switched hardware or OS (case in point: VMS was "unique", and Windows 3.x -> 95 -> NT had kinks for each shift in exec/kernel).

    Running everything from inside a LISP interpreter clearly rocked some folks' boats, but it clearly didn't rock enough people's boats to be a "thing"; personally I think that's a fair result. You think decoding some rando's C++ is bad? Try decoding LISP crafted in a heavily customized and continually changing environment for a laugh...

    I'll get my coat, it's the one with a copy of "Transputer Instruction Set (C) 1988" in the pocket. :)

  47. NorthwestEagle

    With all due respect, I feel like this article does a disservice to the vast array of innovation that's happened over the course of the decades.

    As a child of the 80s, I was there to experience firsthand the world evolving and turning into something unimaginable outside of sci-fi.

    In 1993, I was a kid. The Super Nintendo was the peak of entertainment and a decent TV was around 25 inches, weighed fifty pounds and still cost hundreds. Hardly anyone had a cell phone, and the ones that were out there were damned expensive and did very little except play a mean game of Snake.

    The Internet was a thing the vast majority of us were completely unfamiliar with. Those of us that had an online service like AOL and CompuServe did it from dedicated machines plugged into a landline phone. If you had a 56k modem you were at the top of the heap.

    Futuristic shows like Star Trek had us controlling our devices with our voices.

    Today we live in a world where we can control every device in our homes with a word. We carry around the Internet in our *pocket*, on devices more powerful than the supercomputers of 93 could dream about. We hang 75 inch televisions on our wall, and we explore virtual game worlds vast and innumerable. With even inexpensive hardware we can create and share our works and explore our world. We have constantly updating maps of wherever we might want to go right at our fingertips.

    Think about how many people are reading this article at this very moment. How are they doing it?
