I've got some ideas
Involving trepanning tools like the ancient Egyptians used to use and various AI CEO orifices but that would get me on a watch list.
Ballooning memory prices are forecast to kill off entry-level PCs, leading to a decline in global shipments this year - and a similar effect is going to hit smartphones. Analyst biz Gartner is projecting a drop in PC shipments of more than 10 percent during 2026, and a decline of around 8 percent for smartphones, all due to …
The 2026 smartphones, as far as I can see, won't be better, just more expensive. And since the battery of my Xiaomi Mi 9 (the model with Qi) is down to "about a day standby" instead of five to seven days, I switched to the still-available Xiaomi 15 (normal, the pro/ultra are not worth the extra money).
The "new" Samsungs are showing the trend: Not much better inside, but new price tag for the same speed / RAM / Storage of the previous model. Similar for Pixels etc...
"(normal, the pro/ultra are not worth the extra money)"
They can often be pretty aggressively marked down.
I just got myself a Redmi Note 15 Pro Plus (daft name). On my contract it was €99 for a 24-month renewal (and note, the price didn't change if I didn't renew). Orange gave me an immediate €20 reduction, and by taking photos of the paperwork, Xiaomi dropped €50 into my bank account. So in all I got the mobile contract I would have had anyway, and a new phone that isn't bad, for €29...
If you need a smartphone, stick with what you have in tandem with a pocket wifi unit. If you just need a phone, use a featurephone. If you need both, get a jacket with an extra pocket. Don't buy anything new until the AI bubble has burst.
Much like the situation with modern PCs has been for some time, they don't actually need to be better, as far as I can tell. The only tangible improvements of late have been in the cameras. For pretty much everything most people actually do with them, something at the entry level will probably be fine. Anything else is just bells and whistles. Or screens so big you can't fit them in your pocket.
Surely in 2026 there is an abundance of devices that could be used just a bit longer? You know, like the concept of re-use.
The main thing killing this off is batteries that can't be (easily) replaced. The RAM, storage, and processing power of a 4-5-year-old phone aren't the issue.
I debated Rufus for getting it into a VM, decided it was smarter to CANCEL MSDN or whatever they call it now (saving $800/yr), wait until I HAD to get 11, THEN buy the cheapest low-end piece of CCP junk I could find [which turned out to be a rather nice 8GB Dell mini-PC with 11 Pro for around $130, plus the HDMI-to-VGA adapter I needed to use it].
Microsoft considers it "piracy" to run software in an unauthorized manner, even though piracy by definition requires a boat and if there's no boat involved, it's not piracy.
It would be trivial for Microsoft to make Windows 11 require modified software to run on non-TPM 2.0 hardware, and they could then pursue any such cases as copyright infringement (permanently modifying software without the copyright holder's permission is copyright infringement).
Piracy is a different issue, I think. I don't see how piracy is tied to unsupported hardware, it should be simply tied to a lack of a licence.
My HP Z210 worked fine with the initial W11 releases, and I quite liked it, especially the Android support.
Later Android support was removed, and then it stopped updating automatically, and started getting bloated.
I installed Linux and put it in a VM side by side for a while, but I've not used it for months.
The effort required to use it is now in no way worth the reward (at home).
I suspect that is the case for many home users with supported systems too (but most of them don't realise it).
They could consider a special edition - as a new purchase of course - for older H/W. OTOH if the extended support wheeze brings in enough money they wouldn't want to kill that, and, of course, it establishes the practice of Windows as an annual subscription, preparing the way for Windows 12.
Idly wondering if MS will suddenly realise W11 WILL run on those ‘legacy’ PCs after all.
Windows Vista all over again. Because of it, computers were released with XP on them again in a short-lived reboot.
Seriously enough, I just want Mozilla and Google Chromium and WebKit-based browsers to OPTIMIZE THEIR BOAT-ANCHOR INEFFICIENT BROWSER ENGINES, clean up that RIDICULOUS MEMORY CREEPAGE (especially YOU, Mozilla) *CAUSED* by GARBAGE COLLECTION with LAZY CODING [and a fried egg on top and SPAM], and stop adding "Features" UNTIL YOU FINISH FIXING WHAT IS THERE!!!
Otherwise, I will CONTINUE to BERATE YOU while HOLDING DOWN the SHIFT KEY! [and FART in your general direction]
After which we won't NEED 64GB of RAM just to read the El Reg Web site!!!
Recently needed to replace hardware and get an 11 box to do taxes. All 3 are Amazon reconditioned units. One is an HP, two are Dells. All 3 have 8GB; the one for Win 11 has an SSD system disk. Only problem is the HP box does not like to boot drives over 4TB, so I left an "Ultimate Boot CD" in the drive.
All were significantly under $200 each. One is my FreeBSD 15 server+gateway+dns+web+mail+storage+firewall+NAT, multi-homed on a /29. One is a Linux development box. One [the smallest] has no CD but boots OK from USB [Windows 11]. I'm happy with all 3 doing what I need, at least through 2030. All 3 for a total of less than ~$500.
.. the old "RAM Doubler" utilities from back in the day? I was about to make a joke about them making a comeback, but Google informs me that modern OSs actually use memory compression already. Apparently CPUs got so powerful that it became something worth doing from a performance standpoint, and of course the modern implementation is a long way advanced from those sometimes highly questionable utilities from back in the noughties. You learn something new every day.
Ah, Javascript. I must say the first time I saw it I was really impressed. It looked to me like someone had spent a long time looking at all past and present computer languages and had carefully combined the very worst aspects of all of them into one easy to screw up package that made VBA look benign.
Even on weak, RAM-starved mobile CPUs, if a reasonably fast compression algorithm is used there is a significant speedup, as paging to swap is very slow.
Even compressed swap is usually faster, as reading fewer bytes from slow storage is faster.
Modern implementations are little different, besides improved compression algorithms and there no longer being an assumption that memory always compresses at a 2:1 ratio or better (meaning there is no wraparound that wipes out the memory).
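A minimal sketch of that arithmetic, using Python's stdlib zlib (real zram/zswap implementations use much faster codecs like LZ4 or zstd) and an assumed 100MB/s storage device - illustrative, not a benchmark:

```python
import time
import zlib

# Roughly text-like, compressible data standing in for idle app memory.
page = (b"some fairly repetitive in-memory data " * 120)[:4096]
data = page * 25_600  # ~100 MiB

t0 = time.perf_counter()
blob = zlib.compress(data, level=1)  # fast setting; real implementations use LZ4/zstd
t1 = time.perf_counter()
assert zlib.decompress(blob) == data
t2 = time.perf_counter()

disk_mb_s = 100  # assumed throughput of a slow eMMC/HDD swap device
plain_read = len(data) / (disk_mb_s * 2**20)
packed_read = len(blob) / (disk_mb_s * 2**20) + (t2 - t1)

print(f"ratio {len(data) / len(blob):.1f}:1, "
      f"compress {t1 - t0:.2f}s, decompress {t2 - t1:.2f}s")
print(f"swap-in, uncompressed: ~{plain_read:.2f}s")
print(f"swap-in, compressed:   ~{packed_read:.2f}s (smaller read + decompress)")
```

On anything with a halfway modern CPU the decompress time is dwarfed by the saved I/O, which is exactly why compressed swap is on by default on phones and many distros.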
"Meanwhile Zorin or Mint will keep that old laptop running with an up-to-date OS for quite some time longer."
Whilst I agree with you, the problem with this standpoint is that it's best suited to the technically minded / average Reg readers.
The moment you give someone an old laptop with an unfamiliar OS installed they might use it... up until the point where they can't do something that they were used to on their old Windows or macOS machine. As soon as that happens, their stance becomes "yeah, I need to go and buy a new (Windows or Mac) laptop".
To put it bluntly: the average person doesn't care about their OS, unless or until it changes to one they're unfamiliar with!
The OS's job is to let you run the apps you want/need/have/are forced to run and get out of the way.
If said apps are not available in said OS, the OS is useless. This is not a statement about the quality and technical merits of said OS; this is just the way the world works.
If you want/need/have/are forced to use Final Cut Pro (say, because the customer requests it), then Linux or Windows will do you no good.
If you want/need/have/are forced to use Excel (say, because there are some macros in the spreadsheets your employer uses that do not play nice with LibreOffice), then Linux will do you no good.
I use Linux for donated laptops, and it is very good at that. But it is no panacea. And for some people, free hardware with Linux on top is of no use.
The OS's function is to run applications. The applications' function is to allow the user to do stuff. As you, williamyf, know quite well - you've told us of your excellent work in using Linux to make computing available to those who would not be able to afford it - it is the being able to do stuff that's important.
I'd also tell you about the time I was a contractor for Huawei (2012-2016). The spreadsheets with timesheets for students and exam results, as well as my expense reports, had macros that did not play nice with LibreOffice.
Word documents (exams and technical documentation) and PPTs that Huawei provided, which I had to edit or translate to Spanish, had their formatting borked by LO.
Can I do a presentation in LO? As a matter of fact, during that time I also taught at the uni (in Venezuela, so the salary was junk), and all the paperwork and presentations I did in LO + yEd, so yes, I could "do stuff". But I needed Office to operate on the documents Huawei insisted I use.
My options at the time: refuse to work with Huawei for not being open enough and not put food on my table (salaries in Venezuela were crap at the time, so the sporadic work with Huawei in Mexico, Colombia, and Brazil was what really paid the bills); try to convince the whole Huawei Training division to migrate to LO (good luck with that); use LO to maintain my FOSS purity, do a lot of extra, unpaid work reformatting stuff, and take extra risk by re-implementing the macros; or, you know, use Windows/macOS + Office.
Well, macOS + Office it was!
I thought I gave clear enough examples in the OG comment. But I overestimated a part of the audience.
"To put it bluntly: the average person doesn't care about their OS, unless or until it changes to one they're unfamiliar with!"
OK, let's assume this average person has a W10 laptop that can't be upgraded.
Would the Zorin GUI on their old laptop be less familiar than that on a new W11 laptop?
And which would cost them money?
"To put it bluntly: the average person doesn't care about their OS, unless or until it changes to one they're unfamiliar with!"
Well, let them spend the thousands while you buy a new car instead.
Clients don't pay me to run the latest shiny. They pay me for work delivered. The antiques in my office get the job done and the new car is getting much closer (used, actually).
Given that my (literal) technophobe Mum (in her 80s) is coping fine with Mint (after her Mac lost support), I think the only people that applies to are those that have too much money. (Not as many now, with budget devices not being so cheap any more.)
I replaced Vista on a laptop quite some time ago with Devuan Linux when the system had slowed to a crawl so much and was so virus prone it just had to go.
The user (a relative who was not all that computer savvy) learned it in around 30 minutes and happily went back to the usual browsing, e-mail, watching videos, etc. She still uses that laptop to this day. Recently updated it with Devuan 'D' something 'cause of the recent (irritating, cert-related) need to update Firefox.
"happily went back to the usual browsing, e-mail, watching videos, etc.. "
My mum is the same. Her computing needs are mainly a comm interface rather than work related things. She has no need to integrate into a workplace environment with M$ crap. LibreOffice is more than sufficient on the occasion she needs an office app. The rest of the time it's Web and eMail.
> up until the point where they can't do something that they were used to on their old Windows or macOS machine.
I have started the migration to Linux when Micro$lop decided on a whim that my Windows 10 was no longer activated and no code input I tried would be accepted, so I gave them the middle finger at that point.
I have been pleasantly surprised how much of the stuff I use (my electronics hobby, for one) has Linux versions or alternatives available, and for the edge cases I have some Windows VMs available if needed.
I was always going to make the switch when stuff like Firefox and Thunderbird stopped being updated on W10 (they are there by default on Linux Mint, which is my choice), and my Micro$lop Office stuff is on a W7 VM with Office 2010 anyway, but this has just accelerated the process. As a bonus, Linux is less of a resource hog, so I don't need any new hardware for a while!
That's where the migration from Windows to Linux broke, upgrading my spouse's PC last December (2025) :-) Evolution had a bug: it could not handle importing Office 2013 PST files whose folder names contained special characters it considered invalid, even though the PST just used the default Office folders. That was the only thing that stopped the migration; it crashed and Linux needed reinstalling. They fixed the bug in a line or two, but the only solution was to compile Evolution from source, because there are no snaps, debs, or other installers, nor git-tagged binary builds, for the latest version, and that build is a lot of time-consuming, complex work.
Zorin and Mint are the best distros for "Windows Refugees".
People who DO NOT want to leave Windows, but circumstances force them to leave Windows.
People who DO NOT want to use Linux, but circumstances force them to use Linux.
Asking those people to use Fedora or Ubuntu will lead to frustration and lower productivity. I've seen it with my own eyes.
That's why, whenever I prepare a machine for donation, I go with Zorin CE if possible, Mint otherwise, and Mageia or antiX for 32-bit-only machines, depending on power.
"But virtually all other Linux Distros will do the job just as well."
Zorin & Mint are supposed to be the most Windows victim friendly so that's what I quoted. Personally I think anything running KDE would be preferable as there are plenty of cosmetics to make it look like W10. The unfamiliarity would be sensible menus, the fact that updates appear to be broken because they install so quickly and the lack of advertising.
107% agree, but with a caveat. Give a Windows refugee the distro as-is, OotB. Do not customize the distro. Otherwise, as soon as a reinstall is needed, or an update/upgrade changes stuff, they will either be up shift creek without a paddle or come back to you for re-customization.
If they do customize it on their own, that's up to them.
That's why Zorin and Mint rule.
Yup, recently ditched my last MS device, a 2-year-old laptop, and installed Ubuntu on it.
Replacing Windows was perhaps the easiest part (but still not simple for the non-tech minded).
Getting your slew of apps (or foss alternatives) to run - that required getting a little deeper into the nerdpool.
Hopefully the fabs will turn their capacity over to the kind of RAM that consumers can use, because that's not what they are currently making for the AI companies.
I would guess that the winner eats all and the remains of the AI companies that fall will be scooped up by the behemoths that are still standing.
Another visitor from a nearby galaxy, I see !!! <jk>
Welcome to planet Earth ... us earthlings have long 'understood' that prices go up BUT never (or very very slowly) come down.
(Especially, in the UK AKA Brexitland !!!)
As far as 'hope to see devices getting far cheaper' is concerned ... keep praying !!!
I don't understand how the 'AI' Tech Bros are thinking, most people are finding it difficult to get kit at a reasonable price BUT these are the same people who will need the 'NEW' kit to take advantage of the 'wonderful' 'AI' !!!
'AI' impacts us all whether we USE it or NOT !!!
It is a perfect opportunity for someone to create a well-written, bloatware-free OS that looks like Windows and runs well on low-spec kit BECAUSE it does not include all the spyware/adware/'AI'/tracking etc. extras !!!
I wonder what could be available that is almost there !!!???
It mostly comes down to presentation and a will to actually 'do it' !!!
:)
us earthlings have long 'understood' that prices go up BUT never (or very very slowly) come down
That's not true for DRAM at all. Unless you have recently arrived on Earth you know that DRAM prices have been falling massively for decades, interrupted by brief spikes for a year or two when "something happens" that causes a spike in demand. That spike causes them to shoot up, then prices fall like a rock when the shortages end and memory OEMs are left with more production capacity than there is demand.
This is the mother of all memory bubbles, so prices are going to crash harder than ever before when it ends. But not all the way back to where they were 6-9 months ago before this all started; that will take time, because people who want a new PC or a memory upgrade or whatever are starting to sit on the sidelines. They'll move back into the market when prices crash, so that will act as a support for a while, but it is 100% guaranteed that DRAM will someday cost less than it did at the bottom of the market last summer. We just don't know when that someday is.
AI uses HBM memory. That CANNOT be socketed, so it is always soldered. So it goes to the scrap pile.
HBM production lines can be retooled to make GDDR, DDR, or LP-DDR, but that costs money AND the line is out for weeks or months, so sometimes manufacturers opt to mothball the lines instead.
Datacenters use registered (buffered) ECC DIMMs. Even if the memory controller of your processor supports them, you need special mobos to use them.
Also, there are modules that allow you to take your older registered ECC DIMMs and use them as CXL memory in more advanced servers that would not take the older modules directly.
Those two factors combined mean: do not expect a flood of registered ECC DIMMs on the used market either.
Datacenters do not use normal DIMMs, SO-DIMMs, or CAMM2. So again, nothing of that sort flooding the used market.
Come 2028, the only thing we may expect to see in the second-hand consumer market coming out of the datacenters is the future (but imminent) SO-CAMM2 LP-DDR modules...
> AI uses HBM memory. That CAN NOT be socketed.
Indeed, but why exactly? Is it because of the very wide (1024-bit per stack) data bus, or because of impedance changes at connectors causing reflections in high-speed signals? If so, could it work at a lower speed, or with a fancier connector?
Would it be impossible to take some post-bubble surplus HBM chips and solder them onto a pluggable module (not necessarily compatible with DIMM) and invent a new PC form-factor?
Probably the impedance thing, combined with a need for low power dissipation per pin (driving a socketed HBM bus produces lots of heat because of the sheer size of the bus concentrated in such a small space).
Also, there is no memory controller prepared to use socketed HBM.
Something similar happened with LP-DDR: from its inception it was designed to be soldered. It took YEARS for some bright sparks at Dell to come up with CAMM, which opened the door to socketed LP-DDR.
The timings are too aggressive and the signalling(?) voltage is too low (0.7V-0.9V VDDQ for HBM4) to reasonably make it work over a connector - but maybe it could be done with a fancy connector and sub-optimal clocks.
If you have a huge pile of the same HBM chips, then maybe it would be worthwhile to prepare a PCB for a SoC that uses HBM (that may be a PCIe RAMdisk for example).
With FPGAs and the right voltage converters, and hundreds to thousands of hours wasted, maybe you could design a DIMM-compatible module, with heavy latency from the translation.
I can see a market for this stuff on PCIe cards as super-fast SSDs - if there is enough stock floating about unloved, the cost of an ASIC to run the show would be covered by the volume - but more likely there would be more profit margin in sticking it in consumer graphics cards (like they used to).
HBM production lines can be retooled to make GDDR, DDR, or LP-DDR, but that costs money AND the line is out for weeks or months, so sometimes manufacturers opt to mothball the lines instead.
Zero HBM production lines will be mothballed. They only do that to lines using older lithography that can't compete economically anymore, even when fully depreciated. They are ONLY using leading-edge technology to produce HBM dies; those lines will be converted to DDR/LPDDR production when the AI bubble bursts. It is lines producing stuff like DDR4, LPDDR4, and older that would potentially get mothballed (or upgraded).
The line is only out of production a very short time - a day or two, if that - to switch what's being made, but it takes a few months to start outputting the new stuff, because blank wafers at the start of the process take a couple of months to turn into fully finished dies at the end. They can't turn half-finished HBM wafers into LPDDR wafers; those WIP wafers would have to be either finished as HBM wafers or scrapped.
Question for the smart bods around here: has anyone actually found any desktop software that uses the NPU yet?
We've got a lot of PCs bought in the last year that have NPUs - not bought for AI, but just because they come built in to the standard business Intel Ultra 5/Ultra 7 spec. I've never once seen a single percentage of NPU use in Task Manager. I'm really surprised I've not seen apps using it as a co-processor or for hardware acceleration, like GPU compute units.
Here in 2026, are there any practical uses for the NPU, especially outside of AI model work?
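For what it's worth, a minimal sketch of how an app can probe for and target the NPU, using OpenVINO's device enumeration - assuming the openvino Python package and Intel's NPU driver are installed; the model filename is a placeholder:

```python
import openvino as ov

core = ov.Core()
print(core.available_devices)  # e.g. ['CPU', 'GPU', 'NPU'] on a Core Ultra box

if "NPU" in core.available_devices:
    # "some_model.onnx" is a placeholder - any ONNX or OpenVINO IR model will do.
    model = core.read_model("some_model.onnx")
    compiled = core.compile_model(model, "NPU")  # inference now targets the NPU
    print("Compiled for NPU:", compiled)
else:
    print("No NPU exposed; a real app would fall back to 'GPU' or 'CPU' here.")
```

Whether anything ever shows up in Task Manager depends on apps actually shipping code like this, which so far very few do.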
Not really desktop, but transcoding video will use the GPU for hardware encoding if possible. Local inference for things like speech-to-text has been using the same feature on the more customisable SoCs of mobile phones for years now, and that's really what's in most of the "AI" shit on phones.
LLMs really are about size, so the real benefit only comes when you've got enough memory to run the models locally. But basically, most of the time most of those units are going to be doing nothing.
And that's the core problem of "general purpose "AI"".
It's either gonna suck at everything, need the whole world's energy output to function (while completely ignoring the actual hardware required for it to run, not to mention the hundreds, if not thousands, of data-scientist-years spent carefully crafting the software), or a cocktail of both.
Which once again loops back to the fact that when it's just called an LLM and is carefully crafted for a hyper-specific purpose, it's actually great and a boon to society.
But the current hype-train named "AI" which has only a bunch of techbro grifters on-board as engineers, is the exact opposite of that.
Icon for said grifters yelling "ALL ABOARD" to the smoothbrain investors.
...because it would seem to me that their behaviour is going to be the thing to watch here. Are they increasing capacity? Clearly demand is up, so production capacity should go up to meet it if this is a long term trend - who doesn't want to sell more product to more customers and make more money after all.
If they're not, and as far as I can tell they aren't, it says something about how long they expect this surge in demand to last. It's going to take quite a few years to get a new chip fab up and off the ground, and cost a huge amount of money. If the memory vendors aren't investing in those I can only assume that they don't think that the demand will still be there by the time those new fabs would come online.
We all know that cutting edge IT kit has quite a short life cycle - especially under the kind of extreme stress you find in big data centres - so it would seem to me that if no one is investing in massive amounts more manufacturing capacity then they don't see a refresh cycle coming in 5 years or so... which is odd, because that's been a given for decades. Kit lasts 5 years. You replace it with newer kit that's more powerful / more efficient / both delete as appropriate.
"It's going to take quite a few years to get a new chip fab up and off the ground, and cost a huge amount of money. If the memory vendors aren't investing in those I can only assume that they don't think that the demand will still be there by the time those new fabs would come online."
TFA states that the "surge may last until late 2027", which would make your assumption seem pretty spot on. None of this RAM is actually being used, of course; it's all sat on motherboards waiting on GPUs that will never be plugged in.
They are investing, but not massive amounts.
Actually, some of them are disguising investments decided and/or made before this crunch as a reaction to the crunch, and commingling (is that a proper verb?) them with investment directly related to this bubble.
So yes, the actions of the big three memory manufacturers point to a 2-3 year bubble of memory demand.
But this isn't like any of the others. Every time this has happened before it's been due to a sudden unexpected cut in supply - natural disasters, COVID, that sort of thing. The supply here has stayed pretty constant, it's just been totally redirected away from things people can actually buy into something very few people actually seem to want; namely massive rack servers to go in "AI" data centres.
So if the demand for memory from those data centres is the "new normal" so to speak, vendors should be investing in extra capacity. But they're not. So clearly they don't think that these "AI" bit-barns will still be generating demand in 5 years time. Why invest in the production capacity if you think the people who are currently driving up demand are all going to go bust and not be able to afford your product by the time all that extra capacity comes on line?
Feels like someone has made a very calculated decision on the near-term future of "AI" and decided they don't want to bet billions in additional business costs on something that may never see a return - which is refreshing, because it seems like everyone else can't throw money at this shit fast enough.
There's a LOT of ignorance/complete stupidity in these comments.
Take TSMC for example. They spent $197bn on capex and research from 2020-2024 to get from 5nm to 2nm and are spending $55bn in capex this year alone. By the end of the decade they'll have spent over $400bn, possibly even nudging $500bn on capex.
Source? El Reg's sister site: https://www.nextplatform.com/compute/2026/01/16/tsmc-has-no-choice-but-to-trust-the-sunny-ai-forecasts-of-its-customers/4092173
That's ONE company. Add the big three memory manufacturers' capex together and it'll be well over a trillion dollars capex from 2020-2030.
But of course the commentards think "vendors should be investing in extra capacity. But they're not." - ignorance or stupidity, the readers can decide....
> So clearly they don't think that these "AI" bit-barns will still be generating demand in 5 years time
They do invest continuously otherwise they would be left behind as technology advances, however as it has already been said it takes *years* to plan and build a new fab and that means that there is currently a lag in supply.
The AI bubble will burst eventually and things will stabilise. The AI companies are not making any profits, and power supply constraints are going to put a big crimp on these planned mega-barns. I suspect most will gather dust.
I suspect that apart from making hay while the sun shines, they'll be waiting for the bubble to burst just like the rest of us. This is an AI arms race and not all competitors are expected to survive. Once the bubble bursts, those that do will be freer to set less aggressive timetables.
I wouldn't rule out state intervention in the form of export controls. And we might see a rise in new suppliers from China as their manufacturing capabilities continue to improve.
To answer the question in the post Title:
The Memory manufacturers are doing what anyone would do when seeing prices massively increase on 'hard to get' product & you are the one making the 'hard to get' product !!!
Sell everything you have in-stock and yet to be produced in the short to mid-term for the most you can get !!!
Other customers will wait because they have no choice and no one manufacturer will suffer from the 'bad press' because everyone is doing the same.
Once the bubble is burst the stock that has been sold BUT not used will find itself being re-sold elsewhere for whatever the customer can get.
Short term, you should NOT overpay for what you need, as the supply chain will return to normal fairly soon.
Perhaps it will encourage some 'old' ideas to be rediscovered regarding maximising the usage of scarce computing resources.
i.e. debloating the OS and the attendant s/w it runs to use the available capacity/capabilities better ... less adware/'AI'/spyware/telemetry & general computing that is NOT in support of the actual primary function of the company that makes the 'widgets or doohickeys' whose sale pays everyone's wages.
:)
I see this as a good thing.
If nobody is buying the stupidly price-hiked new Google Pixel/Samsung/Apple shiny phone, then Google/Samsung/Apple might rethink their AI demand (as it's them that have demanded these AI models and data centres...).
Maybe this is the start of the AI bubble burst.
Capitalism, the most efficient way to ensure that limited resources go to the most effective use - is forcing companies to build expensive ram for an AI that nobody wants and causing shortages for products that people are clamouring to buy.
Are we sure we read the right textbooks ?
Last August I purchased a NUC-style mini PC with a Ryzen 9 processor, 1TB of storage, and 32GB of DDR5 RAM for $399. The equivalent machine now costs $640. Looking on Amazon, the memory alone is close to what I paid for the entire machine. Also, many of the newer models are listed with 24GB instead of 32.
I guess if it dies I can recover my cost by just selling the memory sticks.
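Rough arithmetic on those figures, as a sketch (the numbers are just the prices quoted above, nothing more):

```python
# Same mini PC, a few months apart, per the figures above.
old_price, new_price = 399, 640
increase = (new_price - old_price) / old_price
print(f"{increase:.0%} increase")  # -> 60% increase
```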
Budget PC manufacturers would do well to simply put less expensive RAM* in the box and ship it with a less gluttonous operating system. I seem to remember this was tried a few years ago with sufficient success that Microsoft had to stay the execution of Windows XP rather than risk consumers realising that Linux could do the job perfectly well at zero cost.
-A.
* So long as it is user-upgradeable, of course.
> stay the execution of Windows XP
My XP machine runs happily with 1GB, and doesn't use all of that. W7 has more frills but doesn't really DO more, and there is non-MS software (editors, file managers....) that does more than MS's hacks.
10/11 are not as bad as past history suggests.
MS could solve this "RAM shortage" with a comprehensive patch refactoring most of the post-2010 code.... but clearly MS doesn't have the personnel or inclination to do that.
Ah, but I've lived through 4 or 5 "RAM shortages" in my career. Not buying another 256K for my XT until prices came down. Pushing Win7 on 4GB until the thrashing got bad (and I'm working in 4.3GB most of the day, most days).
Less RAM is one thing. Less expensive RAM is another.
Each (LP-)DDR4 chip is less expensive than a same-size (LP-)DDR5 one, and (LP-)DDR3 chips of the same size are even cheaper, but you cannot go from one to the other without changing the mobo (and sometimes the processor too).
Meanwhile, each DDR5 chip costs the same no matter whether you put it in an 8GB (SO-)DIMM or a 16GB one, but buying a machine with 8GB will cost less. And you can go from 8 to 16GB of DDR5 without changing the mobo or processor.
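A back-of-envelope sketch of that point - the per-die price below is made up purely for illustration:

```python
# The dies are identical; the smaller module simply carries fewer of them.
chip_price_usd = 4.00            # assumed price of one 16Gbit (2GB) DDR5 die

dimm_8gb = 4 * chip_price_usd    # 4 dies x 2GB = 8GB module
dimm_16gb = 8 * chip_price_usd   # 8 dies x 2GB = 16GB module

print(f"8GB module silicon cost:  ${dimm_8gb:.2f}")   # -> $16.00
print(f"16GB module silicon cost: ${dimm_16gb:.2f}")  # -> $32.00
```

Whereas moving down to cheaper DDR4 dies changes the module, the mobo, and possibly the CPU.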
Us engineers use generic terms like 'memory' in conversation, but when we're actually working there are all sorts of memory types, part sizes, speeds, and package styles. The memory used by high-performance computing is a large and lucrative segment, but it still only represents one slice of the overall market. Although the demand for these parts is huge at the moment thanks to AI etc., it's just not possible to ramp up production using the equipment used for more generic parts.
So, I rather suspect that what we're seeing here is less about excess demand from AI vendors and more about a good old-fashioned cornering of the market. It's been done many times before, but it's usually in some commodity like a precious metal.
FWIW -- We had a similar memory shortage problem in the mid to late 1980s after PC manufacturing ramped up. Early DRAM parts were typically 4K*1 or 16K*1 DIP packages which were in such short supply at one point that there was quite a black market for them, new or used.
Maybe it's time for software developers to focus on writing native compiled apps, instead of these modern huge inefficient apps.
I'm looking at Teams, written in JavaScript/Electron, which requires the Edge WebView just to run. It does the same as old chat/talk apps that used to work in 8MB of RAM, not 1000MB.
Mobile apps aren't much better. Why TF is the Gmail app for Android 300MB?
The beginning of the end of the personal computing era, PCs of all kinds are becoming luxuries out of the reach of the average consumer, with the only alternative now being cloud computing via glorified terminals. It's no coincidence that both Microslop and Nvidia are heavily invested in cloud gaming, and centralized AI cloud dependency.
The writing is on the wall and the elephant is in the room. Microsoft CPC (Cloud PC) / Windows 365 is coming, and local memory won't be needed - nor SSDs, nor any add-in cards. Such features will be reserved for cloud streaming servers. 99% of users will stream Windows just like an STB. GAME OVER for the PC or any high-performance hardware at home. I don't like it, but it will happen. Linux will have no hardware to run on. I was happy to experience the birth of the PC, and I will likely experience its death. Call Gen X the PC generation.