You can run Windows 11 on just 200MB of RAM – but should you?

Stripping down operating systems to run on old, forgotten, or low-end hardware is nothing new, and has spawned a whole community of Linux aficionados. But what about full-fat Windows 11? For that, Microsoft says you'll need a 1GHz processor with at least two cores, 4GB of RAM and at least 64GB of storage. Or do you? Last week …

  1. 45RPM Silver badge

    I have something like six 3.2GHz Xeon cores and 16GB RAM (on my never-used-except-for-testing HP Z800, hence the uncertainty about the exact spec) - and Windows 11 won’t run. So Microsoft’s minimum specs are something of a fantasy anyway.

    1. NoneSuch Silver badge

      Microsoft does not care if you can run their software... as long as they get their monthly subscription payment. Very much like Goodfellas.

      Business bad, pay me.

      On vacation, pay me.

      House burned down, pay me.

      Electric grid hit by lightning and no power for weeks, f*** you pay me.

      1. Benegesserict Cumbersomberbatch Silver badge

        Re: Microsoft does not care if you can run their software...

        MS: the rent you pay on something you already own.

      2. Anonymous Coward
        Anonymous Coward

        Re: Microsoft does not care if you can run their software...

        Sounds like we need Rom to form a union.

        1. Roger 11

          Re: Microsoft does not care if you can run their software...

          Excel for Ferengi cooks the books by itself!

      3. Stuart Castle Silver badge

        Re: Microsoft does not care if you can run their software...

        I don't like to defend Microsoft, and I don't like subscription software, but they haven't (yet) moved Windows to a subscription model, and that argument can be made about any company offering subscriptions. They don't give a rat's whether you use their software or not, as long as you pay up.

        That's why I don't like the subscription model. Too easy to forget what you have subscriptions for, and to find you are paying a subscription on a piece of software you stopped using years ago.

        1. tin 2

          Re: Microsoft does not care if you can run their software...

          Windows has always been a subscription. That's why every few years the OS you paid for is suddenly absolutely definitely full of holes, unsupportable and you need the new one.

    2. karlkarl Silver badge

      It does run on a Z800 (I also use an HP Z420 and an HP Z400).

      But you need to bypass the artificial restrictions that OEM vendors pay Microsoft to put in.

    3. katrinab Silver badge

      I'm guessing it is missing a TPM2 chip?

      That is relatively easy to bypass by changing some registry keys during the install process.

      Likewise if your Xeon is older than Coffee Lake. Ivy Bridge will run with absolutely no problems once you do this.
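      (For reference, the widely documented way to do this is to press Shift+F10 at the first Setup screen, run regedit, and create a LabConfig key under HKLM\SYSTEM\Setup. A sketch of the usual bypass values, assuming the installer build still honours them:)

```reg
Windows Registry Editor Version 5.00

; These values are read only by the Windows 11 Setup environment.
[HKEY_LOCAL_MACHINE\SYSTEM\Setup\LabConfig]
"BypassTPMCheck"=dword:00000001
"BypassSecureBootCheck"=dword:00000001
"BypassRAMCheck"=dword:00000001
```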

    4. Hubert Cumberdale Silver badge

      "You can run Windows 11 on just 200MB of RAM – but should you?"

      FTFY, and, no.

    5. Roland6 Silver badge

      But does W10 x64 run on it from a clean install?

      I had a dual-Xeon system of similar vintage (Fujitsu CELSIUS R650) which ran XP x64, but because of some motherboard chipset and driver issues it couldn't be upgraded to W7 x64, and thus a W10 x64 upgrade was not an option.

      This hassle, the fan noise and the weight and size of the cabinet, were the reasons I finally scrapped it just before Christmas 2022.

      1. Jou (Mxyzptlk) Silver badge

        > Fujitsu CELSIUS R650

        The Datasheet says: Vista 64 supported. So Windows 7 should have worked.

        But with two Core 2 Duo- or Core 2 Quad-level Xeons, those machines were already slow by ~2012, when Sandy Bridge Xeons showed up with their big step in computing power. Today a slightly-over-100€ CPU, AMD or Intel, runs circles around it. Even the smallest available Nvidia or AMD GPU (bought new, not used) does the same compared to the old graphics cards which were available then.

        1. Roland6 Silver badge

          >The Datasheet says: Vista 64 supported. So Windows 7 should have worked.

          Yes W7 32-bit...

          There was some information on the Fujitsu website which detailed the compatibility issues.

          If I remember correctly my main problem was the SAS RAID controller driver.

          Given I had built a (quiet) system based on an MSI X570 motherboard and Ryzen 5 5600x nearly two years back and hadn't touched the R650 since, it seemed best to accept that it was a project I wasn't going to complete.

          1. Jou (Mxyzptlk) Silver badge

            Ah, that problem again... It is due to the weird design of combining an LSI chip for the SATA/SAS connection with a half-LSI RAID logic plugged on top of it, which is actually fully implemented in the driver. The BIOS routine is just enough to load the boot files and the RAID driver; after that it is actually software RAID, but not before. Tricky combination. Newer versions of that trick even manage storage tiering + RAID5 for the boot drive that way in software, quite a feat actually. If I remember right you could disable all that RAID stuff completely and have pure access to the drives. But in the end you'd have a machine with a third the speed of your Ryzen, or a sixth, depending on which CPUs you had in there. With a huge footprint on your electricity bill. And with slower PCI Express. And with slower drives, if you run on NVMe now. And louder, as you mentioned.

            1. Roland6 Silver badge

              Yes, it would have been about a third slower CPU-wise. Which, before the AMD Zen 3 CPUs, was reasonable, given the cost of significantly improving performance with new Intel parts. However, as you note, the system still wasn't really up to modern video editing (i.e. it could do HD but not 4K), where a reasonable AMD CPU coupled with full-speed DDR4-3200 memory, PCIe 4, an NVMe drive and a modern DaVinci-compatible graphics card shows its worth.

              However, it really was the noise - everyone in the house knew the workstation was running. The new whisper-quiet workstation permits me to work when everyone else is in bed... :)

  2. Pascal Monett Silver badge

    16GB of RAM is the minimum

    Sure, you can run on less, but if you want to run Windows comfortably, you need at least 16GB.

    And RAM isn't that expensive any more, so there's really no excuse.

    1. wolfetone Silver badge

      Re: 16GB of RAM is the minimum

      Can confirm. We've a few 8GB laptops that struggle to do a day's work on Windows 10. Put Windows 11 on one of them and it was basically bricked.

      Yet here I am, typing this on a laptop with 32GB RAM running Linux Mint, and I don't even go above 10GB.

      1. werdsmith Silver badge

        Re: 16GB of RAM is the minimum

        Win11 on an 8GB laptop works just fine here.

        1. Mark #255

          Re: 16GB of RAM is the minimum

          I guess it depends what else is installed.

          I got an upgrade last year from a dual-core 7th-gen i5 with 8GB of RAM, which had become unusable due to the amount of enterprise crud that the IT department had larded onto it.

          Once it had been released to me, a fresh install of Windows 10 without 14 different enterprise-grade pRoTeCt_ThE_CoMpUtEr applications running means it's still surprisingly usable.

        2. Roland6 Silver badge

          Re: 16GB of RAM is the minimum

          But have you tried the same laptop with 16GB (preferably 2x8GB)?

          Yes, W10/W11 does run in 8GB, but I have found that when running MS Office (desktop), Chrome, Teams, Zoom etc., things seem a little smoother and snappier with 16GB.

        3. 43300 Silver badge

          Re: 16GB of RAM is the minimum

          Indeed - depends what they are used for. If it's just basic office use with no massive spreadsheets then 8GB is adequate on both W10 and W11.

          The bigger factor is the drive - we put SSDs in some of the older machines, replacing spinning-rust, and it had a massive impact on speed.

      2. Totally not a Cylon

        Re: 16GB of RAM is the minimum

        And yet my 2013 MacBook with only 8GB comfortably runs the latest version of MacOS......

        Takes a bit longer to boot due to having to load OpenCore first, but once booted it runs pretty quick.....

        Penguin icon because no Apple icon and mac is almost unix......

        1. Anonymous Coward
          Anonymous Coward

          Re: 16GB of RAM is the minimum

          Strangely enough, my son's 2014 MacBook with only 8GB also comfortably runs Windows 10 (don't ask me why, but I already know why: something to do with some game that won't run on macOS. Probably.)

          P.S. no, it's not porn, ANY OS can run porn!

        2. Roland6 Silver badge

          Re: 16GB of RAM is the minimum

          >And yet my 2013 MacBook with only 8GB comfortably runs the latest version of MacOS


          A friend has been unable to perform the upgrade due to a lack of disk space, caused by the amount of cruft macOS has tucked away in its caches - obviously the caches aren't pure temp files, as they also include installed application updates such as Garmin maps...

          1. MrBanana

            Re: 16GB of RAM is the minimum

            I had the space issue upgrading a 2015 MacBook Pro to Monterey. The first claim of needing 12GB of space is quickly upgraded to: oh, that was just the installer download, so sorry, you'll need another 20GB to do the actual upgrade. Thanks. At least that is working space; the end result is much the same as before. Just a hassle to back up user files, delete, upgrade, then restore. Lesson learned: don't buy a MacBook with only 128GB.

            On the other hand (another 2015 Macbook) upgrading to Kubuntu 22.04 required hardly any extra space at all.

        3. katrinab Silver badge

          Re: 16GB of RAM is the minimum

          We do have an Apple icon. I am using it here.

        4. cookieMonster Silver badge
          Thumb Up

          Re: 16GB of RAM is the minimum

          Er, it is unix

      3. David-M

        Re: 16GB of RAM is the minimum

        I use a very old laptop with 4GB RAM, and W10 (64-bit) runs very smoothly with 2.5GB free to start.

        The killers on it are browsers and LibreOffice.

        Older versions of MS Office, up to 2003, all of which I use, are ultra-light on resources and very fast; they were engineered very well in that respect.

        For other programs you just choose things like foobar2000 and other functional but low-resource options, and memory usage is all kept low.


      4. phuzz Silver badge

        Re: 16GB of RAM is the minimum

        Depends what you're doing with the OS.

        Mint runs and boots fine on 4GB of RAM, but as soon as you start using a modern web browser you'll want at least 8GB.

        1. Prst. V.Jeltz Silver badge

          Re: 16GB of RAM is the minimum

          My "behind the TV" PC runs fine on 4GB

          (cheap mini thing the size of a cig packet)

          Seems to run YouTube and play Amazon videos just fine through Chrome.

          Also HD video.

    2. Anonymous Coward
      Anonymous Coward

      Re: 16GB of RAM is the minimum

      Just checked, my work PC has 8GB and has been fine. I didn't even notice it wasn't 16.

      Storage is NVME though which may help.

      So basically calling this post nonsense.

      1. AMBxx Silver badge

        Re: 16GB of RAM is the minimum

        Another me too. Old Dell laptop with 8GB RAM, runs fine.

        I've found the biggest problem is if you only have a single CPU core. The default when creating a VM is one core, hence finding the problem. With a second core it's fine.

    3. big_D Silver badge

      Re: 16GB of RAM is the minimum

      We have mainly 4GB desktop PCs and 8GB laptops. Generally they work very well. Only when doing 5-way+ Teams video do they struggle. But Teams is a bloated mess anyway.

      1. Captain Scarlet

        Re: 16GB of RAM is the minimum

        Can confirm: before having Teams slung on our machines, 4GB with Symantec AV and Dell Red Cloak was fine for business use.

    4. 9Rune5

      Re: 16GB of RAM is the minimum

      Clearly YMMV.

      For all I know you could be running 3rd party AV in addition to Windows Defender. Or you're running one or more VMs (or WSL2). Perhaps you have thousands of browser tabs active?

      "Comfortably" can mean so many things.

      The average user is perhaps content with a single web page, one or two MSOffice apps and Solitaire. For that I suspect 8GB is still enough.

    5. CommonBloke

      Just because it's cheap

      Doesn't mean you should be using up and rolling over all of the available resources.

      1. Anonymous Coward
        Anonymous Coward

        Re: Just because it's cheap

        If you have free RAM, Windows will try and use it to speed things up. Pull up task manager and look for "Cached" on the memory tab. This is memory that isn't being actively used, so stuff gets cached there. If required for applications, it will be released.
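        This isn't the Windows memory manager itself, just a toy Python sketch of the accounting idea (all names and numbers here are purely illustrative): otherwise-free pages show up as "Cached", and the cache shrinks transparently when applications demand memory.

```python
class MemoryManager:
    """Toy model: pages not committed to applications are used as a
    file cache, and the cache is evicted on demand for allocations."""

    def __init__(self, total_mb):
        self.total = total_mb
        self.in_use = 0   # committed to applications
        self.cached = 0   # reclaimable file-cache pages

    def fill_cache(self):
        # On an idle system, nearly all otherwise-free RAM shows as "Cached".
        self.cached = self.total - self.in_use

    def allocate(self, mb):
        free = self.total - self.in_use - self.cached
        if mb > free + self.cached:
            raise MemoryError("out of memory")
        if mb > free:
            # Release just enough cached pages to satisfy the request.
            self.cached -= mb - free
        self.in_use += mb

mm = MemoryManager(total_mb=8192)
mm.fill_cache()      # idle: 0MB in use, 8192MB showing as cached
mm.allocate(6000)    # cache is released on demand; the allocation succeeds
print(mm.in_use, mm.cached)   # 6000 2192
```

        So "Cached" being large on an idle box isn't memory pressure - it's the OS refusing to let RAM sit idle.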

    6. Ken Hagan Gold badge

      Re: 16GB of RAM is the minimum

      It really depends what you are running. I've run various versions of Windows in VMs over the years and found that they are perfectly usable in about half the minimum recommended memory. Then again, the heaviest workload I tend to run on those is compiling software. Open up a modern web browser and they start to slow down. Open up Windows Update and they just fall over.

      1. FirstTangoInParis Bronze badge

        Re: 16GB of RAM is the minimum

        Yeah, that’s my experience too. The point at which I upgraded a friend's desktop to 16GB was when it was taking forever to do Windows Updates. Office is definitely bloated but WU says “hold my beer”.

      2. David Hicklin Bronze badge

        Re: 16GB of RAM is the minimum

        It also depends what crudware a corporate entity puts on the machine: my Windows 10 work laptop is at 6GB in use by the time I've logged into it. It started with 8GB, so an extra 8GB helped no end.

        Win7 before it was never that greedy!

    7. An_Old_Dog Silver badge

      It's Not About New PCs

      It's not about buying a new PC/laptop with small amounts of RAM.

      It's about continuing to use older PCs/laptops which cannot accept large amounts of RAM due to RAM-slot limits, and/or BIOS limits, and/or CPU limits.

      (Icon for my old kit)

      1. Roland6 Silver badge

        Re: It's Not About New PCs

        >It's not about buying a new PC/laptop with small amounts of RAM.

        However, given MS and Windows Starter edition...

        This looks like a nice way for MS to enlarge the W11 machine base, if they were to purchase Tiny11 off NTDEV, without actually changing the currently shipping W11 build.

        Mind you, I suspect many would prefer a smaller and more streamlined OS footprint, provided they could also install .NET Framework, AD domains, Hyper-V ...

  3. chivo243 Silver badge

    I ran this Tiny11 in a VM with the 2GB; it ran, but without activation ya can't change the flippin' background, and I couldn't look at that default 11 wallpaper. So, there it sits, installed and abandoned.

    1. that one in the corner Silver badge

      > I couldn't look at that default 11 wallpaper. So, there it sits, installed and abandoned.

      Dare I suggest opening an application and using that to cover up the wallpaper? How much time does anyone spend with the wallpaper visible anyway?

      Okay, your 2GB won't be enough for anything that really requires some oomph, like The Modern Web Experience, but you can fire up a sensible code editor!

      1. chivo243 Silver badge

        after seeing this...

        I was very afraid to open any apps.

    2. Jou (Mxyzptlk) Silver badge

      Oh, just change the registry entries. You can change it all, just not using the GUI methods.
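      (For the wallpaper specifically, the value lives under HKCU\Control Panel\Desktop - a sketch, where the image path is purely illustrative and you may need to log off and back on for it to take effect:)

```reg
Windows Registry Editor Version 5.00

; Backslashes in .reg string values must be doubled.
[HKEY_CURRENT_USER\Control Panel\Desktop]
"Wallpaper"="C:\\Users\\Public\\Pictures\\my-wallpaper.jpg"
```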

  4. MatthewSt

    WinSxS != Windows Store

    WinSxS is the Windows _Component_ Store, where various versions of DLLs are kept after system updates (the SxS stands for Side-by-Side). It's been around since Windows Vista.

  5. big_D Silver badge
    Thumb Up

    Now, there's...

    a Betteridge's Law headline, if ever I saw one! :-D

  6. Dan 55 Silver badge

    How did we get here?

    What has caused so much bloat that we've gone from operating systems which need 16MB of memory to be usable (I'm thinking of Windows NT here so it's a sort of like-for-like comparison with the Windows of today, other operating systems comfortably ran with less) to operating systems which need 16GB of memory to be usable?

    I can't point at one specific feature and I can't say that we're doing much more complicated stuff than we were doing 25 years ago either.

    1. Anonymous Coward
      Anonymous Coward

      Re: How did we get here?

      It's only my perception of course, but I find that the factor that mostly accounts for deterioration in performance over time is the amount of memory and cpu time that is required just to display stuff.

      Browsers are getting more demanding and desktop environments seem to need more and more to get a smooth experience (I know many can be tweaked, but that shouldn't be necessary).

      The presumption that memory is cheap, therefore there's no excuse to run with < 8GB (eg) doesn't help.

      The above is all from the Linux perspective. How Microsoft have managed to make it so bad with Windows 11 versus what Windows 10 used to demand, I don't know.

      1. Andy The Hat Silver badge

        Re: How did we get here?

        "Browsers are getting more demanding"

        And when was a browser part of the OS? I know it isn't because under competition law MS allow alternative browsers to be installed - don't they ...?

        I do think you are correct - there are developer- or bean-counter-perceived basic "requirements" for services that actively build the bloat: prettiness, helpful features, telemetry, more telemetry, update checking, licence checks, mandatory connections to remote services, etc., none of which are generally driven by the user.

        There are two types of OS: quick, small and modular, calling in services as required by the user; and fat, bloated, monolithic systems that have everything built in. The former allows easy development of third-party modules, customisation and extended longevity of the system. The latter gives the developer more (or total) control over the system and its longevity, based on updates applied to any part of it, and thus control of the income stream it generates and of pushing the development of hardware on which it will run. Guess which direction the monopolists have decided to go?

        1. Richard 12 Silver badge

          Re: How did we get here?

          Almost everything in that post is wrong.

          Monolithic is actually faster for real workloads, and doesn't preclude third party modules either.

          Both Linux and Windows are monolithic and support third party modules. Linux even has explicit options as to which modules to compile into the monolith and which to keep separate.

          I'm reasonably sure there aren't any general-purpose microkernels left, actually.

    2. mark l 2 Silver badge

      Re: How did we get here?

      It might sound impressive to run Windows 11 with those RAM requirements, but you could run Windows 2000 in 256MB of RAM, taking up less than 1GB of disk space. And I would argue that the Windows user experience hasn't vastly improved since then. In fact it's worse in some aspects, such as data slurping and forced feature updates.

    3. tony72

      Re: How did we get here?

      Just like how no matter how many cupboards and shelves you have at home, you never run out of stuff to put in/on them, an OS will expand according to the resources available. All the abstraction layers and frameworks and libraries that come with a modern OS are arguably unnecessary, and you could build a much smaller OS without them, but they make things much easier for developers, and it doesn't make sense to not have them when the resources needed for them are there in every PC.

      1. Dan 55 Silver badge

        Re: How did we get here?

        This is the chicken and egg problem, which expanded first?

        We also saw it happening with mobile phones - Symbian was pretty resource-efficient and hardware specs were quite low yet good enough. Then Android and iOS came along and hardware specs went up as mobile OSes bloated. It took a while to find a new happy medium, and there were plenty of grim, sluggish Androids around for a while. Android started out requiring 32MB of storage; now it needs 15-20GB or more (60GB or more if you're Samsung).

    4. Filippo Silver badge

      Re: How did we get here?

      You can't point at one specific feature because it's death by a thousand cuts.

      Better graphics. Alpha channel, color management, even just a higher resolution. It all has a footprint.

      More layers of abstractions, making things easier for devs at the cost of a performance hit. Yeah, we all should make efficient code, but the truth is that dev time is a constrained resource, just like RAM and CPU. I have some features in my programs these days that I simply could not have economically implemented back then. You should not go overboard, but making tradeoffs on that front isn't unreasonable in itself.

      The need to retain backwards compatibility while adding support for newer hardware - these days I can print via Bluetooth... but I can also still print via a parallel port. That has a footprint. Sure, you can load modules on-demand... but module management also has a footprint.

      Integration of features that used to be outside the OS. WinNT didn't have an antivirus, or full-disk encryption, or backups, or... much more stuff. Okay, we could argue that there was no good reason to bake these things into the OS, but I can't help remembering with dread how many problems I had getting these sorts of low-level third-party tools working properly and reliably. There may be some merit there. And, again, you can load things dynamically, but the hooks have to be there, and they have a footprint.

      Handling of corner cases. You find that some bit of the OS craps itself in some exotic conditions, and it's probably not even your fault; it's a third party driver, or the user has fucked up something, whatever. Back then, you didn't care - there weren't that many different bits of hardware, and there weren't that many users; even if corner cases existed, they'd only pop up very rarely. Now? Now, if you have a case that happens once in a billion, it means that it's happening all the time. You need to add the logic to handle it. That has a footprint.

      Automated configuration of stuff. Back then, I had to tell the OS exactly what IRQ they had to use for a bit of hardware. Not today. And the same goes for all kinds of stuff. This is awesome, but it has a footprint.

      Security. OSes these days have to validate and revalidate all kinds of crap, just because someone might have slipped a nasty through god-knows-what side channel. WinNT didn't do that, because there weren't that many nasties, nor that many channels. But that has a footprint.

      And then, and I suspect this is by far the main culprit... thousands upon thousands of tiny things that individually occupy very little. If I add a feature that requires 70kb of RAM, but I could make it run in 7kb if I spent a month optimizing it... well, my employer pays my wage, but the customer pays the RAM I use. They won't notice that amount of extra usage, and even if they did, they could never prove I could have avoided it. What do you think is going to happen? Multiply this by thousands and thousands, for decades.

      1. ChrisC Silver badge

        Re: How did we get here?

        "Automated configuration of stuff. Back then, I had to tell the OS exactly what IRQ they had to use for a bit of hardware. Not today. And the same goes for all kinds of stuff. This is awesome, but it has a footprint."

        Back then (as in mid-late 80's, long before I got my first Wintel box), I didn't have to tell the OS any of that stuff, because the underlying hardware was elegantly designed to deal with it all for me. Back then I also didn't have to coerce applications to play nicely with one another within a co-operative multitasking environment, because the underlying OS was elegantly designed to provide pre-emptive multitasking. When you design the system correctly in the first place, you can do a lot of really neat stuff with barely any resources.

        Wintel's dominance of the personal computer market has a LOT to answer for, given how much genuinely ground-breaking tech it managed to kill off, only to then, badly, reinvent a decade or so later...

        1. Dan 55 Silver badge

          Re: How did we get here?

          It's at this point I think that a link to The Thirty Million Line Problem is appropriate because it does argue that theory, that badly designed systems made software bloat.

    5. Charlie Clark Silver badge

      Re: How did we get here?

      Modern screen resolutions require more RAM than good old VGA, the registry keeps getting bigger but I suspect the biggest culprit will be the .NET runtime. When things are done properly, this should allow applications to do more with less code and do it more safely. But all that managed code imposes its own overhead.

      On the disk, drivers and the ever-expanding list of sundry utilities and languages all demand more space.

    6. Anonymous Coward
      Anonymous Coward

      Re: How did we get here?

      How did we get here?

      Simple: it's the "tick" and "tock" between Intell (sic) and Microshaft (sic). One boosts CPU performance with newer tech/faster speeds/more cache, and the other brings out a new OS version that says "thanks, I'll take all that extra processing power" and leaves nothing for applications to run... which then forces you to buy more RAM, only for the new OS to say "ah, I just needed that to help my OWN performance", and again we're back to square one, with all the hardware improvements absorbed by the crappy OS "upgrades".

      Of course it's always been like this, from when we went from a simple CLI OS (like DOS 3.3 or DOS 5) to a graphical interface (Win 3.x, 95, 98, XP, etc.), with we consumers having to keep adhering to the typical PC upgrade cycle to keep all these hardware and software manufacturers in business.

      PS: I still have a really nice PC running XP and another two running Win 7 - all work perfectly well and there's no need to upgrade them (they are not connected to the net, so quite safe). And they do what they need to do, as some ancillary hardware connected to each of them does not have newer drivers available for a newer OS.

    7. MrBanana

      Re: How did we get here?

      I can't point to a specific feature, but my finger is wagging in the direction of developers and the love of packaged bloatware incorporated into their applications. Need to compare two character strings? Let's grab 200MB of third-party code, from a random GitHub repository, that can compare anything to anything, just because. Testing? Sure, it works fine on my local environment with a 32GB machine, high-speed SSD and zero network latency. Developers should be forced to use the bare minimum processor, RAM, disk, and network resources. Eh, I remember the days when a 2MHz 6502 and 32K of memory, with a 5 1/4" floppy drive, was considered extravagant.

      1. Jou (Mxyzptlk) Silver badge

        Re: How did we get here?

        2MHz? Who gave you such a fast 6502 CPU? Faster than a C64...

        1. that one in the corner Silver badge

          Re: How did we get here?

          The BBC Micro was (still is!) running a 6502 at 2 MHz.

          The BBC Master Turbo pushed its 6502-compatible CPU up to 4 MHz: made you feel like King of The World, running spreadsheets in 128 MB of RAM.

          1. druck Silver badge

            Re: How did we get here?

            That would be Kilobytes not Megabytes.

            I managed to write a full windowing system, with desktop, file manager, calculator, clock and palette changer, in BBC BASIC running in MODE 1 (320x256, 4 colours), although it did need a BBC Master with its extravagant extra 20K of shadow memory for the hi-res screen. That was in the summer of 1987 while I was saving for my Archimedes 310, and once I had grasped that profligate 1MB of RAM it was downhill all the way as far as memory-efficient programming was concerned.


            For anyone using RISC OS, it still runs in a graphic task window, and is one of the examples supplied with !GraphTask.

      2. Charlie Clark Silver badge

        Re: How did we get here?

        What you're suggesting still does not apply to most desktop applications which are, thankfully, not full of untestable external dependencies.

        Any programmer who knows their stuff will tell you it's okay to use as much memory as is available and one of things OSes do is manage memory allocation so that not every application has to implement its own (shitty) system of virtual memory.

    8. FirstTangoInParis Bronze badge

      Re: How did we get here?

      I’ll see your Windows NT on 16MB of RAM and raise you Solaris 1 on 4MB of memory. It ran a 2.5D CAD system with programmable mouse gestures.

      1. Jou (Mxyzptlk) Silver badge

        Re: How did we get here?

        That comparison is unfair: see the price tag when those computers and OSes were in their prime. You could buy a lot of NT 4.0 boxes with 16MB RAM for what that single Solaris box cost.

        1. Charlie Clark Silver badge

          Re: How did we get here?

          The comparison is more than reasonable: NT was a memory hog with awful context switching, as BeOS on the same hardware illustrated.

          1. Sandtitz Silver badge

            Re: How did we get here?

            "NT was a memory hog with awful context switching, as BeOS on the same hardware illustrated"

            BeOS had zero backwards compatibility and it was a single user OS anyway without security features.

            1. Charlie Clark Silver badge

              Re: How did we get here?

              What kind of backwards compatibility should it have? I never made claims about its security features, just an illustration that it was possible, even on x86, to have a much more responsive system than Windows NT.

              1. Sandtitz Silver badge

                Re: How did we get here?

                "What kind of backwards compatibility should it have?"

                None. It is of course possible to have a much more responsive system than your standard Linux distro (or Windows) if you trim all the necessities and focus on response time.

                DOS with DESQview was even snappier than BeOS, and that's with just a 386 and 1 meg of RAM. Whoop-de-doo.

    9. An_Old_Dog Silver badge

      Re: How did we get here?

      Enough guppies can eat a whale.

      The number of craplications and craplets which gather way-too-much-data and metaphorically phone it back home have grown immensely. All that gathering-and-sending takes RAM and CPU cycles.

    10. LateAgain

      Re: How did we get here?

      There's also the disc use on start-up.

      Turn a windows laptop on and leave it.

      Wait for the disc activity light (wonder why you don't get them anymore) to stop being constantly ON.

      Half an hour if you are lucky.

      1. david 12 Silver badge

        Re: How did we get here?

        There's also the disc use on start-up. ... Half an hour if you are lucky.

        I found that Win10 was unusable without an SSD boot drive. Using a spinning drive, my wife thought it was broken. She'd click on things and nothing would happen, so she'd click some more, and more nothing would happen. Perhaps if we'd left it for half an hour at startup it might have become more responsive: certainly 10-15 minutes wasn't enough.

        And maybe it could have been better with a different software profile, but she wasn't running third party AV, corporate bloat or unwanted downloads: just a fairly normal work machine.

    11. cookieMonster Silver badge

      Re: How did we get here?

      Basically, in a nutshell, windows is shit. As simple as that.

      640k should be enough for anybody . . .

      Grumpy old man icon, because that’s what I am now.

    12. BPontius

      Re: How did we get here?

      I am running Windows 11 Ent, and only using 5.7 Gigs out of 32 Gigs of RAM. Minimum RAM for Windows 11 is 4 Gigs, and most PCs come with at least 8-16 Gigs. Where do you get the idea that 16 Gigs of RAM is needed for an O/S to be useful today?

      1. Roland6 Silver badge

        Re: How did we get here?

        > Where do you get 16 Gigs of RAM needed for an O/S today to be useful?

        Suspect there is a little confusion: there is the space the OS requires for itself, and then there is the additional space you need to run applications in. For many the latter is more meaningful, hence they will say "Windows needs 16GB of memory to be usable".

        On my W10 Pro laptop with 16GB, I note on startup Windows typically reserves ~3.6GB (compressed) for itself and commits ~5GB. Open a typical mix of common business applications: Browser, Outlook, Word, Excel, Zoom and I'm up to 7.3GB. Open a few tabs, files and perhaps a PDF and you trip over 8GB...

        Yes, you could install 12GB of RAM, but 16GB is so much simpler and 2x8GB will often give a (small) performance boost.

        Hence why I regard 16GB as being reasonable.
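
        As a back-of-the-envelope check, the figures above can be tallied in a few lines (a sketch only: the per-application working-set sizes are illustrative assumptions, not measurements):

```python
# Rough RAM budget for a Windows laptop, using illustrative figures
# similar to those quoted above (all values in GB; assumptions, not measurements).
os_commit = 5.0            # Windows commit shortly after startup
apps = {
    "browser (few tabs)": 1.5,
    "Outlook": 0.4,
    "Word": 0.3,
    "Excel": 0.3,
    "Zoom": 0.5,
    "PDF viewer": 0.3,
}

total = os_commit + sum(apps.values())
print(f"estimated commit: {total:.1f} GB")          # ~8.3 GB
for size_gb in (8, 12, 16):
    headroom = size_gb - total
    print(f"{size_gb} GB installed -> {headroom:+.1f} GB headroom")
```

        On these assumed figures an 8GB machine is already over budget before any paging, while 16GB leaves several gigabytes of headroom.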

  7. Alumoi Silver badge

    You can run Windows 11 on anything

    if you really want.

    But the most important question is: why would you run Windows 11 in the first place? OK, I can understand running Win7/10 for those impossible to avoid programs, but Windows 11? Really?

  8. Anonymous Coward
    Anonymous Coward

    What disk?

    What type of disk was used in those 200MB experiments? I see an HDD in the picture, but was it a fast SSD?

    1. Kobblestown

      Re: What disk?

      Should be a RAM disk for best experience

  9. Anonymous South African Coward Bronze badge

    I can remember the days of running NT4 SP6 on 32MB of RAM for better performance... and it did it quite well, although using IDE HDDs was a serious bottleneck.

    NT4 SP6 on 32MB RAM and on an NVMe and an i7 should give some impressive speeds, if you can get drivers to compile for WinNT...

    1. Paul Crawford Silver badge

      I remember how for NT or W2K having a SCSI disk really increased system performance. I think SATA delivered a bit of the asynchronous access / out-of-order gain, but it was always poorer (though cheaper) than SCSI, and had stupid "let's increase possible space by a factor of 2, again" sort of compatibility issues.

      1. Kobblestown

        Yep, SCSI offered better queueing capabilities, and SATA 1.0 didn't really have any. Sure, it had them in theory and was more SCSI-like with TCQ, but that required controller support, and such controllers were few and far between. Also, I think WD's first Raptor was the only drive that had it. NCQ only came with SATA 2.0, IIRC.

        In any case, SCSI had lower CPU overhead. And faster spinning drives. I think only the (Veloci)Raptors ever ventured above 7200 rpm with SATA, whereas SCSI went as far as 15K! That could be more than twice as fast as a 7200 rpm drive for random IO - not an insignificant difference.

      2. that one in the corner Silver badge

        After Something Horrid happened to my Amiga A1000 (sob), I got a SCSI card for the Win2k box and moved the chain of hard drives over. That gave Windows a massive boost: it no longer seemed to come to a complete standstill when doing something disk-intensive, compared to the IDE drives.

        Plus, at the time there were so few different types of SCSI drives on the market that when one of them decided to let the magic smoke out (literally blew a hole in its main IC), I was able to go to a Computer Faire down in Temple Meads that same weekend, easily buy a cheap second-hand one (cheap 'cos no-one else wanted SCSI, go figure), and just swap over the drives' controller boards.

  10. Tubz Silver badge

    With Windows 10/11 Microsoft should have done what Apple did when they switched from PowerPC to Intel to ARM: scrap legacy support and reduce bloat. You either keep up with the architecture, pay us lots of $$ to support your old systems, or accept the risks involved. Yes, consumers would be left to fend for themselves, but they have the same options: pay for support if feasible, upgrade software/hardware, or switch to Linux.

  11. Captain Scarlet

    DDR5 on machines

    I hate DDR5 having to be soldered onto the board; no upgrade path makes this a pain (especially if caught out by ordering a machine without enough memory a few years down the line).

    8GB is pretty much standard now for a business machine, but when it's not upgradeable you have to pay the added expense up front rather than do a mid-life upgrade like we used to.

  12. Anonymous Coward
    Anonymous Coward

    You can run Windows 11 – but should you?

    simple English.

  13. karlkarl Silver badge

    200MB, jeez!

    Windows NT 4.0 + Office 97 + Visual Studio 6 + Borland C++Builder 5 would run in less than that.

    Plus the UI had more functionality.

    What have people been doing all these years? Sleeping?

    1. Captain Scarlet

      200MB for NT4 would have been a luxury for a client machine (on work experience, I think machines had either 32MB or 64MB of RAM). As for Windows ME, it wouldn't run properly on 64MB/128MB of RAM, but was stable when given the full whack of 512MB.

  14. sawatts

    <cough> No one will ever need more than 640kb and two floppies </cough>

  16. Jou (Mxyzptlk) Silver badge

    Just uninstall the Widgets crap...

    Total removal, not just for the current user:

    (Get-AppxProvisionedPackage -Online).Where({$_.DisplayName -eq "MicrosoftWindows.Client.WebExperience"})[0] | Remove-AppxProvisionedPackage -Online -AllUsers

    and *bam* less RAM usage. And you can bind Windows+W to something useful.

  17. martinusher Silver badge

    Virtual Memory....of course

    It's normal to have swap space on a filesystem where virtual memory can be paged to, extending the process space. On an 'ix' system it uses a dedicated disk partition; on Windows it's a hulking great file in the main filesystem. All normal stuff. If the system has too many processes for the amount of RAM then it's going to spend a lot of time paging memory in and out, which makes the system crawl. The 'fix' would be to rationalize what's running at any one time, but in this world of cheap hardware and multicore processors the temptation is to just keep adding hardware resources.

    This explains why my humble computer runs Linux perfectly but tends to be a bit anemic running Win10. It's not 'the user experience', it's that Windows is running far too much unnecessary crap, and everything is in such a tangle that it's easier to just say "it needs x gigabytes to run" than to sort the pile of spaghetti out. (Performance hits are also likely because of the continual network traffic, which, being web-based, is likely to include a lot of busy/wait loops or frequent process swaps. Hence multicores -- the ideal solution to running sloppy code fast!)
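
    On the Linux side, the swap arrangement described above is easy to inspect; a minimal sketch, assuming a Linux system exposing /proc/meminfo (on Windows the equivalent pagefile figures are visible in Task Manager's Performance tab):

```python
# Read total and free swap from /proc/meminfo (Linux only).
# Returns sizes in kB as a (total, free) tuple, or None if the
# file is unavailable (e.g. on a non-Linux system).
def swap_info(path="/proc/meminfo"):
    try:
        with open(path) as f:
            fields = {}
            for line in f:
                key, _, rest = line.partition(":")
                fields[key.strip()] = int(rest.split()[0])  # value in kB
        return fields.get("SwapTotal"), fields.get("SwapFree")
    except OSError:
        return None

info = swap_info()
if info is None:
    print("no /proc/meminfo here (not Linux?)")
else:
    total_kb, free_kb = info
    print(f"swap: {total_kb} kB total, {free_kb} kB free")
```

    If SwapFree is consistently far below SwapTotal, the box is paging heavily and will feel exactly as sluggish as described above.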

  18. cmdrklarg

    Sounds more like crawling, or possibly shuffling.

  19. Paul Hovnanian Silver badge

    But should you?

    Nothing to do with hardware resources. There are just better uses for them than running Windows.

  20. Stuart Castle Silver badge

    200 Meg RAM? When I were a lad, we had 16K if we were lucky..

  21. ComicalEngineer

    Win10 on a 4GB Celeron

    Yes, I have an old Acer Revo RL80 with a 1.5GHz Celeron and 4GB of RAM. OK, it does have an SSD.

    4GB because I've been using it to run some 32 bit legacy software dating from 1998 - which it runs pretty well with the odd minor hiccup (which I can cope with). It's fine running the legacy program with Word 2010 and Excel 2010 running at the same time, or in my normal usage, Libre Office. The only thing that slows it down is RAM hungry Firefox.

    I had a spare copy of Win10, and the Acer was a spare machine, so I wanted to try Win10 before moving it to a daily driver. Having de-crapped Win10, it does a decent job for me and I'm quite happy with it.

    My normal Windoze desktop machine is an AMD Quad core 3.6GHz with 20GB of RAM driving two 4K monitors, but I generally prefer my Fujitsu i7 with just 8GB running Linux Mint.

    SWMBO has a Win11 laptop which she is OK with but I detest Win11, partly because of the *£%&ing taskbar abortion.

  22. Anonymous Coward
    Anonymous Coward

    There once was a man with a dream,

    To run Windows 11, it seemed extreme,

    With just 384MB of RAM,

    He thought he'd never give a damn.

    He clicked and typed with glee,

    As he watched the OS come to be,

    It loaded up in just a snap,

    He felt like he'd won the tech race.

    He opened programs left and right,

    No lag, no crashes, such a delight!

    But then he tried to play a game,

    And that's when things got rather lame.

    The graphics were choppy, slow, and rough,

    He'd have to upgrade, that was enough,

    But even with more RAM in tow,

    He still chuckled at the memory show.

    So remember, folks, with a grin,

    You don't need much to win,

    Just a little tech know-how,

    And a dash of luck, that's the wow!
