
If they were so concerned about e-waste, perhaps they'd like to whisper in M$'s ear about the requirements for Windows 11, which will lead to the dumping of lots of PCs...
Intel claims a more modular approach to PC design could make systems easier to repair and reduce electronic waste – and it has some proposals for you. In a blog posting authored by a trio of Intel execs, the x86 behemoth makes the case that the current monolithic approach to building devices leads to premature disposal, …
Or look at their own ever-changing socket requirements for CPUs. This whole thing reeks of Intel's regret at pulling out of the system board market, and of looking for a way that makes it more palatable for users to upgrade the CPU module without having to replace the entire system board due to CPU socket and RAM requirements. I'm willing to bet that the "centre" board with CPU and RAM will cost just as much as a full system board does now, but be marketed as "cheaper upgrades", with Intel changing the specs even more often. After all, it's Intel. They aren't doing this to reduce sales.
> Or look at their own ever-changing socket requirements for CPUs.
Absolutely. AMD have set a standard there with AM4 - came to market in 2017 and, with a BIOS update, many chipsets will support Zen 3 processors released as recently as October 2024 (e.g. the Ryzen 5 5600T). For businesses cascading old machines, it's a nice little life-extension upgrade which reuses most other components.
Meanwhile Intel make a socket last two generations and businesses are keeping parts for a fleet of incompatible systems.
Absolutely this. The AM4 system I built back in 2019* was recently upgraded with a processor several generations on from the original, and it's happily flying along. Not top of the range, but completely respectable and more than enough performance for my daily work machine. I have absolutely no need to replace it. Modular PCs have been a thing since well before I swapped out my 286-based motherboard for a DX2-66 around 1994, and my own PC has never stopped being modular.
*Well, I say built, I suppose you could say it's the same PC of Theseus I've had for about three decades: this was just the last time I changed the motherboard.
Same here. Also built an AM4 system back in Aug 2019, with a 3800X (it was on offer, so the same price as the 3700X at the time, so why not), plus a hand-me-down RTX 2080 which had been in an old i7 system (from the days before i9 was a thing).
Popped an RX 6900 XT into this (again, it was on offer) in July 2022, then a 5800X3D into it back in Sep 2022. So for CPU, it's basically as fast as it's going to get for gaming, and I don't need 16 cores for productivity on this machine.
I'd like to swap the GFX out at some point, but I'm not in a rush, and have no currently plans to switch over to AM5, as I can still play at Ultra or High settings, and maintain typically 100+ FPS anyway for the games I play.
It's so much easier to unplug the CPU and EPROMs, occasionally replacing a tantalum capacitor too, on S100 boards ... the quick fix is normally to swap the EPROM and replace the board, then repair the non-working board with a little soldering. The only complex issue with the old boards is making sure you plug the components in the right way around. Originally everything was designed to be kept working easily, not just to work when first built.
Years ago I was told that a PDP-11 shipped to Brazil was not working, so I was flown from the US to Salvador in Brazil with a new CPU board to fix it. When I pulled the original board out I saw that the CPU had one pin loose, so I pulled the CPU out, straightened the pins and plugged it back in, and everything worked. And the people in the lab took me out to dinner every day for a week while we verified it was working well.
Lovely eating Brazilian local food, a big dish of fish soup which enabled us to see the fish at the bottom of the dish when we drank all the soup together - it was great!
My first PC was an Elonex 386SX 16; the motherboard plugged into another board which held the IO and expansion cards. Sounds good, except when I came to upgrade it there were no upgrades available, and as VESA for the video cards was starting to become available, probably not so good. I replaced it with an Elonex 486DX 33 with the CPU on a daughter card, later replacing the CPU with a 486DX 66. The 66 came with a fan, and my recollection is that it was a bit tight where the card went, because of the fan. (I think the cooler on my Pi 5 is bigger.)
I think it shows the issue with considering "upgradability": things change in ways you don't expect, making the upgrades not worth it.
But that was at the start, things were changing rapidly. It's all calmed down a bit now.
Nowadays it's common to be using PCs that are many years old, and standards such as PCI-E and USB have matured in such a way they're fairly backwards compatible.
My home server is a 2017 vintage machine, and it would take both a 2004 PCI-E GPU and a 2025 GPU without any issues. A 1977 vintage machine would look at an expansion card from 1985 like it was from outer space and hardware from 1964 like it was from the stone age.
I bought myself a new work machine recently; it was a second-hand HP affair from a charity shop. Has a 10th gen i5 something-or-other in it. Popped some more memory in and a bit better storage, and what was a castoff is doing great service as my work PC, and, barring hardware failure, will continue to do so for many years to come.
I suppose the only real tragedy was that a 4-year-old perfectly serviceable machine was in a charity shop. I mean, it's even W11 compatible, so that's not even an 'excuse'.
I'm not convinced making the machine any more 'modular' would mitigate that though? It's an HP business machine so is already easy to work on.
But have things slowed down? If anything I think they have sped up. What has changed is that some time ago hardware reached the point where it is no longer the source of performance frustration in everyday use, which is what used to trigger the desire to upgrade.
Yes, definitely. An Ivy Bridge i7-3770 from 12 years ago is still a very usable computer for many use cases, except maybe games and video editing.
10 years ago, a 12 year-old computer would be utterly obsolete.
We are now at the point where most people replace their computers when they break rather than because they want new features. Phones are probably at that point as well now.
Yeah, I was about to invoke the name Elonex. They did modular 286 machines, and ejectable hard disk cartridges as well IIRC, but as you say, the upgrades were eye-wateringly expensive and were limited as expansion bus technology changed, because the backplane was ISA.
My desktop PC is already modular, with PSU, MB, CPU, graphics, storage, RAM all easily replaced (and so was the one before, and the one before, et al). Even if a subsystem or socket went bad on the MB there's PCI add in options that would rectify many possible failings.
So what exactly is it that the brains of Intel think they've invented?
I have to assume they're talking about everything-on-the-motherboard (USB, SATA, Ethernet, WiFi) and that they've reinvented the card slot (again).
Personally, I think they're right. Why does my MB have 3 nice SATA ports and 3 shitty ones? AFAICT I can't disable the 3 shitty ones and put in a SATA card w/o disabling the 3 good ones as well. Fuck you, ASUS.
> 5 liter desktop chassis.
WTAF? Since when are PC sizes given in LITERS (or GALLONS this side of the pond...)?
> WTAF? Since when are PC sizes given in LITERS
Usually it's when talking about very small cases (and 5 litres seems to be the threshold for 'very small'). You could list off height/width/length, but volume is just a single number that says the same thing.
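To put a number on it, the conversion is just width x depth x height; here's a minimal sketch (the dimensions are made-up examples for illustration, not any particular case):

```python
# Case volume in litres: 1 litre = 1000 cm^3.
def case_volume_litres(width_cm: float, depth_cm: float, height_cm: float) -> float:
    return (width_cm * depth_cm * height_cm) / 1000.0

# A hypothetical 25 x 20 x 10 cm box works out to exactly 5 litres,
# the sort of size a "5 liter desktop chassis" implies.
print(case_volume_litres(25, 20, 10))  # 5.0
```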
And as an American, I'm aware of enough history to be aware that the metric system was invented in 1799, roughly 2000 years after Archimedes' brilliant discovery. And PCs were invented roughly 150-175 years after that, depending on your definition of PC. So your assertion that PCs have been measured in liters since the days of Archimedes seems a bit absurd. Nice try, though.
I can remember seeing PC sizes given as LxHxW but never in litres - I need to know if it will fit in the space available on my desk, not how long it would take to pour it through a pipe.
And what's with insulting America because someone asked a perfectly reasonable question - if anything, suggesting PCs have been around since the days of Ancient Greece, and that Archimedes worked in litres, suggests the problem might not lie with the Left-Pondians in this instance... (or was that supposed to be funny?)
> with PSU, MB, CPU, graphics, storage, RAM
A few paragraphs in, the article discusses breaking the MB into separate IO boards. Maybe then you need only replace the central board for a CPU or RAM-type upgrade, but keep the separate IO boards for the USB3/4 and Ethernet ports they already have, perhaps another board for the SATA controller and ports, etc.
Mostly we'll have to wait and see.
Back when CPU speeds doubled every couple of years it would have been nice to replace the main board of a laptop and retain the display, keyboard, battery and power brick. These days a cheap CPU is fine and if I need something fast I use ssh. I replace the keyboard every year or three and the display about half as often. Eventually you cannot get a keyboard to fit or a display with the right shaped connector. By that time I found the case had become brittle and cracked.
The obvious weak point of a laptop is the external connectors. Being able to replace those independently while retaining the main board would be useful. Standards for battery shape, keyboards and displays would be nice too. I am exactly the wrong type of customer for Intel: I use Linux so I do not have to replace the CPU at Microsoft's command.
One of my desktop PCs is pretty modular: a 2GB Raspberry Pi 5; passive metal case (Flirc); Apple power supply; micro-HDMI/HDMI cable; and a 64GB microSD card. The most expensive thing to replace would be the Pi at £39/AU$85. For light web use it works surprisingly well.
The "other" desktop is not so modular - an iMac-M3.
Well, the big OEMs are already shifting away from standards on their desktop computers. The power supply is a proprietary design, not a common standard. I had a friend whose Lenovo gaming desktop motherboard went bad. It was cheaper to buy an ASRock motherboard, but the Lenovo motherboard used a non-standard cooling fan and non-standard connectors for the power button - fortunately it did use a standard ATX power supply and microATX motherboard design.
I think Intel could go a long way toward reducing e-waste simply by requiring OEMs to conform to standards instead of proprietary components, and by supporting a socket for more than two generations. (Look to AMD for that: AM4 lasted as long as DDR4 was being used. I don't expect AM5 to be replaced until DDR6 comes out.)
> Well, the big OEMs are already shifting away from standards on their desktop computers. The power supply is a proprietary design, not a common standard.
Actually, on the contrary! The energy efficiency requirements pushed to have the power supply deliver ONLY 12V, with the mainboard generating 5V for SATA/NVMe and so on. It saves a few parts by not duplicating power regulators, and improves efficiency. It's called ATX12VO. But it is indeed mostly used by OEMs; Fujitsu was the first to adopt it more than 10 years ago. Consumer boards with ATX12VO are rare and target a specific niche.
Forgive me for stating the obvious here.
Many PCs are already modular: you unplug bits and replace them. Yes, there are issues where things are soldered, but that has been driven by the laptop market, where everything had to be the thickness of a CD case. Sockets take up vertical space (thickness) and board space, as the carrier is larger than the chip or SMD RAM.
Is this not just reinventing the wheel with a load of marketing bull, forcing yet another set of standards? As people have said, PCIe has been stable and backwards compatible for a long time now.
Memory is an issue for Intel (no idea about AMD), as each new chipset appears to change the requirement and yet another new DIMM socket format is needed.
If you look at HP's commercial products, they are pretty much modular already. I suppose you could have a backplane that everything plugs into, but I am not sure it will save that much over replacing a system board once all the other costs have gone up to make everything pluggable.
Typical office computers, whether desktop or laptop, only need the modularity they already have: Increase RAM and storage.
Typical workstation computers add GFX card, more storage (options), maybe a second LAN and one other more special card. A rare possibility is to add a second CPU, but I've never seen one doing that upgrade after buying. Apart from that: Upgrade RAM, storage and GPU already possible.
Typical home computers are the same as the office computers mentioned in the first line.
Typical enthusiast computers are those which actually make full use of the modularity they already have; the Intel module concept would be a step backward.
Typical servers also make full use of the modularity they already have. Adding a second CPU happens more often there, but is still rare.
Intel is making something different for the sake of making it different - you cannot upgrade a "module" and still expect the full increased performance of that new module if the backplane cannot keep up with it.
Instead of the module concept Intel should go the AMD way: Making a socket last several CPU generations.
Personally, I upgrade I/O and storage. I occasionally need to replace a MB because the USB has failed, I often have desktop boxes I can't use because they don't have enough slots, and I've thrown out hardware over WiFi incompatibility.
I would welcome a return to designs with more modularity.
Stop buying crap made by Apple or their wannabes.
A desktop PC is perfectly modular and has been since 1981. A laptop PC should have upgradeable memory & storage.
But if you obsess over thin, light and shiny then you get a doorstop with no ports that can't be upgraded. But it has a nice Apple logo huh.
To be fair to the previous commenter, more and more laptop brands now have at least partially soldered RAM. Lenovo, for example, now mostly have only one upgradeable RAM slot on their ThinkPad line, with 4/8GB soldered to the motherboard in place of the second slot.
I dread to think what their consumer lines look like in terms of upgradeability.
That less than 10% of PCs are EVER opened up. I can only imagine that percentage is lower today.
The average person is not going to open their PC's case and repair it. Theoretically it could be repaired by someone else, but if you have to pay for it, it is almost never going to be worth it versus just buying a new one.
Intel seems to be solving a problem that doesn't exist, or certainly one that the average consumer doesn't care about. That's like if Ford said they're going to make replacing the transmission easier.
I completely agree, the average consumer is extremely unlikely to open their PC to make changes. It may kickstart a small industry of "specialists" who will upgrade it for them instead of their buying a new one. Parts + labour will be on par with buying a new computer, of course. Quite possibly a lucrative extra sideline for the mobile phone repair shops that infest our city and town centres.
Are they still around? Locally, the vast majority have gone now with none at all in my immediate area where there used to be four. My reference is a major conurbation in the UK. I'm guessing your reference to "mom and pop" means you are in the USA so might have a different experience. From what I gather, price of renting retail space is wildly different, being far more expensive in the UK.
That is why I switched to a suitcase with a Raspberry Pi, display, keyboard and battery/charger attached. It is getting easier to do, as you do not have to fiddle so much to get the display powered and converted to HDMI. The other big advantage is that instead of being thin and fragile, my laptop is a sturdy mugger-bashing tool.
I still remember the hobbyist who brought his new PC back under warranty because the CD writer wasn't working (for some reason the requested CD writer wasn't installed/included on the order, so we shipped him the desired HP one with cables & screws).
When I opened it up he'd used 1" wood screws to mount the drive.
The twats who came up with this idea probably weren’t even born then... those who are ignorant of history are doomed to repeat it.
As everyone here is saying, there is nothing new or innovative about repairable or upgradable computers. The problems are the reluctance of the mass market to pay for it, and the pace of change of the technologies, which means upgrade standards need a much longer lifespan than companies like Intel are willing to support.
The only exceptions in the desktop and laptop markets (apart from the gaming niche) have always been the RAM and storage because it has always been economically feasible to extend the useful life of PCs by upgrading either/both of them - until recently.
Then along came Asus with the Eee and Apple with the Air, with soldered RAM - one for cheapness, the other for thinness. Customers were too short-sighted, or more likely ignorant, to see the downsides, so more manufacturers followed suit.
I worked at a repair shop from 2007 to 2014 and watched laptops go from:
> socketed CPUs and RAM
> removable batteries
> add-in GPUs
> removable storage
> removable wireless cards
to none of the above.
Motherboard replacements thus went up drastically in cost once all these were suddenly included in the price and customers would instead opt for a replacement device.
But that mostly helps repair rates. Sure, you could eke out some extra life with a bit more RAM and an HDD-to-SSD swap, but that would only take you so far. I can't see how even a modular system would help bridge a DDR or PCIe generation leap, without even considering the ever-growing need for BIOS/CPU updates to plug vulns or meet OS requirements etc.
I must have missed the part where desktop PCs stopped being modular though, at least the ones that are bigger than a CD case anyway.
It would be nice to see laptops go back to how they were, although adding an extra 2-3mm on my phone to accommodate a removable battery would be more welcome.
Socketing the CPU does not make much sense for a laptop, especially when you have to match the cooling to the CPU type for efficiency. My argument is not very strong here, since quite a few laptops use the same board and cooling for a range of CPUs, but it is not invalid either, since soldered CPUs use the board they are soldered to for cooling as well.
Making the add-in GPU, if not included in the CPU itself, replaceable is expensive. The power delivery must then be able to support that GPU, and the cooling comes into question too. In the end it's not worth it, from either a manufacturing or a consumer point of view, so it vanished, or got transferred to external boxes.
For the rest, it is a matter of which device you choose: mine have socketed RAM, replaceable batteries (some even without opening the device; otherwise only a Phillips screwdriver is needed), replaceable storage, and a replaceable wireless card (which I even swapped in one, upgrading to the next WiFi standard). Luckily there is a pushback, both in the EU and US, to widen the range of choices again if you have that on your priority list.
>does not make much sense for a laptop, especially when you have to match the cooling to the CPU type for efficiency.
That's very trivial to solve: just don't put garbage cooling in laptops, polish the contact surface and don't use the worst possible thermal paste.
If you go take apart a laptop, scrape off the insulation paste, polish the contact surface and put on the cheapest thermal paste, suddenly temps substantially drop.
You can do a hardware mod and put a Core 2 Extreme QX9300 in a GNUbooted T400, which was designed for 35W processors, and with the above cooling improvement the temps will be fine (although there's a further cooling mod).
>Making the add-in GPU, if not included in the CPU itself, replaceable is expensive
It really doesn't cost that much to put in a slot instead - it can actually cost less, as the manufacturer only needs to design and make one board and can choose which CPU, GPU and RAM go in each model - but of course instead there was intentional MXM module incompatibility.
>replaceable batteries (some even without opening the device; otherwise only a Phillips screwdriver is needed)
Needing a screwdriver to swap a battery is insane.
>replaceable wireless card
That's the default on most laptops; too bad some BIOSes are sabotaged and refuse to boot with a non-whitelisted card plugged in.
For real?
Q4 2018: New mainboard: ASRock X470 Taichi Ultimate. New DDR4-2400 RAM with ECC, 4 x 16 GB. Ryzen 2700X. The RAM could be overclocked to DDR4-2800 before ECC errors appeared (> 1 month real-world usage). GFX card was still my "old" Titan X (Maxwell).
Q3 2019: Ryzen 3900X. The same RAM could be overclocked to 2933 now. I think I got my Titan RTX (aka RTX 2090 Ti) back then too; having 24 GB of video RAM was the reason. I ran Port Royal at 8K 120 fps on that card, and 3DMark 2006 in 16K too. I have never had a GFX card with such high overclocking reserves.
Q4 2020: Ryzen 5950X. The same RAM could go up to DDR4-3200, but to stay rock solid (months of usage) I settled on 3066 - otherwise I got ECC errors. I was planning on a Ryzen 5900X, but that was not available. So I had the "first world problem within the first world" and ordered the available 5950X - which I don't regret.
Christmas time 2020: Played Cyberpunk 2077 in 4K. On a 65" curved screen. The first release version. It worked well, with fewer bugs than the usual Bethesda release. Play style was "Corpo, female, sneaky sneaky. Main quest? Never heard of it...". Only really annoying bug: I was never able to find Trevor; I think he fell out of the level. The ending was the "Panam big flying machine" variant.
Christmas time 2023: Replayed with Phantom Liberty. Playing style: "street kid, female, fists and guns and blades blazing. There is no main quest". Got the best ending, aka the "secret solo suicide run through Arasaka without anybody helping". Managed a higher completeness level too, except I left Panam standing at the train mission, since "WTF, why should I do that stupid stuff she says?". Newer drivers and AMD's framegen feature allowed a higher quality setting.
I was planning on replacing it with a newer AMD, but since neither the 7950X3D nor the 9950X3D has the extra cache on BOTH chiplets, which I could fully utilize with AV1 encoding, I haven't upgraded since 2020. Currently it smells like "OK, then we go from DDR4 directly to DDR6", since this machine is still fast enough to make even Windows 11 24H2 feel fast, albeit not stable :D. Still the Titan RTX, fast enough for everything in 4K with raytracing - not always ultra, but definitely good enough.
Removable batteries and storage are still the norm.
The problem with socketed CPUs and GPUs is that they bring extra height to laptops, which wasn't an issue until Apple came up with the sleek MacBook Air.
Suddenly the PC companies had to come up with something similar to keep up.
I suppose the PC companies found out that extremely few people actually replaced the CPU, so a socket wasn't needed anyway, and soldering the processor onto the motherboard is both cheaper and has fewer defects.
Business doesn't want you buying a base spec machine and then fitting lots of RAM from a 3rd party - that will cost them profit!
Most business desktop PCs these days are just a laptop motherboard in a case. aka mini-PCs
You don't get a modular, desktop PC until you go to "Workstation" class machines
Just think if PCs went back to socketed CPU and RAM and peripherals that plug in - nobody would be buying a PC with TPUs fitted, and we all know how much users need a TPU on their machines :-)
/Rattus
The other issue is that with increasing RAM speeds comes the need for very stringent signal routing required between the CPU and RAM. That means the RAM has to be physically close to the CPU, which means SO-DIMM slots have to occupy the same volume as the CPU cooler. At the same time, CPU cooling is the biggest performance bottleneck with modern laptop CPUs, the same CPU will perform significantly differently in two different laptops, with different cooling solutions.
So while manufacturers can still have upgradable RAM, it means a lot of compromises for the 10% (or less) of their customers who actually do upgrade their RAM.
CAMM is a different way of attaching memory which is designed to mitigate some of the issues with SO-DIMM slots, but it doesn't seem to have caught on yet.
Back to the days of S100. The motherboard was just a row of sockets with traces between them, and a place at the end to attach the PSU, which had a whacking great electrolytic balanced on its screw-on terminals - I'm sure it wasn't designed to be fixed like that.
Everything was plugged in.
That was even before Elonex.
Wouldn't it have much more impact if they could design PCBs and other electronics to be easier to recycle? These days it's all manual labor, mostly done in Third-World nations under dreadful conditions.
I don't have any suggestions on how to do this. This is a really difficult problem.
I'm not in the recycling business, but there's probably not much of value to recycle on a PCB anyway: the chips are mostly plastic and silicon, which are worthless. You'd have to strip down an enormous mountain of PCBs for the copper/gold/tin to become somewhat valuable. Maybe all the chips need to be socketed so that the working ones can at least be recovered and reused easily.
If you grind up circuit boards the resulting crud has a higher percentage of gold than the ore that's mined for gold extraction. The copper is just a bonus.
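For a rough sanity check of that claim, here's a minimal sketch; the grade figures are ballpark numbers assumed for illustration, not data from the commenter or the article:

```python
# Assumed, ballpark figures (for illustration only, not sourced data):
# mined gold ore is commonly quoted at a few grams of gold per tonne, while
# shredded PCB scrap is often quoted in the low hundreds of grams per tonne.
ORE_GOLD_G_PER_TONNE = 3.0        # assumed mid-range ore grade
PCB_SCRAP_GOLD_G_PER_TONNE = 200  # assumed typical e-waste grade

ratio = PCB_SCRAP_GOLD_G_PER_TONNE / ORE_GOLD_G_PER_TONNE
print(f"PCB scrap is ~{ratio:.0f}x richer in gold than typical ore")
```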
There's plenty of value in electronic waste; the hard bit is separating it down to useful materials without making an even bigger mess. You could use the same methods the gold mining industry uses, but they are rather messy, involving either mercury or cyanide compounds.
Another problem is that manufacturers don't make it easy to dismantle stuff so along with the valuable boards there are worthless casings and other components like batteries and displays which need to be dealt with separately if you want an efficient process.
As for pulling ICs, testing them and reselling the good ones: forget it. The cost would be prohibitive and nobody would want to buy them anyway. Better to grind them up for the gold on the internal contacts.
> Wouldn't it have much more impact if they could design PCBs and other electronics to be easier to recycle? These days it's all manual labor, mostly done in Third-World nations under dreadful conditions.
The current fashion of gluing laptops together is notionally to make recycling easier - run it through a warm oven and the components more or less fall apart, with very little manual labour involved.
Of course, this has two issues:
1. It kneecaps the reduce-reuse-recycle process by preventing midlife upgrades, going straight to recycling instead (upgrades are bad for business: my Mum is using a 2012 MacBook, with an SSD and RAM upgrade that I did about 5 years ago. Basically any CPU from the last 15 years is easily sufficient for basic email/browsing; they're all "good enough". I suspect Apple would much prefer that she upgraded on a 5-year cycle).
2. Not many laptops actually make it into a recycling channel to start with.
Recycling the PCBs once you've disassembled the device is of course a tricky problem. In theory you could probably get a lot of stuff off by running the board through a reflow oven and popping the components off. They in turn need sorting based on the chemistry (e.g. silicon vs capacitors).
I wonder if it'd be easier to just grind the whole board up into powder and chemically reclaim the constituents.
> putting together their first ever PC from components ....disk controllers, I/O, and even the display circuitry all came on separate plug-in cards.
Even before that. I know the 5150 PC team sat apart from the Mainframe gang, but IBM was *always* aware of service costs, repair and upgrade. Plug it! The 5-slot PC had several video, parallel and serial card choices, with some really significant price steps so you didn't just get all options. The 8-slot XT had more disk and later video choices. And yes many of us hacked PCs into XTs and XTs into pseudo ATs.
Intel must be really worried about the PC market to try and "invent" this again.
For a previous "exercise" I still have a couple of Intel Pentium II CPU SL2HDs kicking about. Totally useless of course, as we found at the time.
The problem is that all-in-one motherboards are cheap to make: a run through the surface-mount machines and a ready-to-install motherboard is ready to ship. Once you add daughterboards, connectors and linking cables you add a lot to the cost. If AMD decides against this folly then it would be game over for Intel until they saw sense again.
I am reading this on a Framework laptop, which arrived as a kit much the same as part-baked bread from the supermarket, insofar as creating the final product was quick and painless. Quality is easily as good as the Dell XPS it replaced, and the documentation is better.
Yeah, Intel doesn't seem to be terribly innovative here. I don't have a Framework, but I'm fairly sure I'll get one once my current laptop dies.
If Intel really wanted to do something I'd cheer them for, they'd make Framework modules. And hasn't Intel just started getting serious about GPUs? Would be nice to have more Framework options there.
You should be able to just swap out each and every major component (CPU, RAM, chipset, NIC etc) when it fails and do a quick solder job replacement of minor components, but it doesn't seem that Intel wants that.
Intel is known for not only rotating a pinout but then swapping 2 pins to stop you from using an "LGA771" CPU in an "LGA775" motherboard, for bizarre chipset limitations, for no standard ECC when it was clearly needed, and for changing CPU sockets very often (you would think by now they would have standardized on a pin layout). They even made sound cards that run proprietary software (so you have no sound) with a digital signature on that software to stop you from fixing the sound card yourself (Intel now handcuffs pretty much all of their hardware, from WiFi cards to dedicated GPUs, CPUs, sound cards and SBCs). So it would really be foolish to believe their claims that they care about e-waste, when most stuff they offer is e-waste out of the factory (as you cannot fix the software when it breaks).
It seems that Intel rather wants to split computers into 3 parts, each of which only works when paired with 2 other matching parts.
Intel obviously think this will lead to higher revenues for them. I'm not sure it will benefit users much, except a very small percentage of DIYers. When you want to upgrade, you generally need to upgrade a lot of components to get a decent performance improvement. It may take off for a while as people determine whether it works for them. Personally, I would hope the answer is fewer upgrades required. The only real reason for the general user to upgrade is to run AI locally, and everything will be done to discourage that.
So, why are Intel pushing this? I don't believe altruism.
We've tried 'modular' before: Anyone remember ISA, VESA, PCI...? Back then, it was done because the MB didn't have I/O and so on embedded - but we kept changing the interface to keep up with technology and I imagine the same would happen today.
Simple solution: world-wide, you oblige the manufacturer to take back their product(s) and penalise them for failing to meet set standards in recycling the material.
While that would focus their minds immediately, problems created include the massive additional costs and impact of returning items, as well as the not trivial problem of getting the world to agree on the policy.
How many people actually read the article before rushing here to comment?
Yes, Framework are in this space. Yes PCs have been modular since day 1. However, the vast majority of laptops (which is the main thrust of the piece) are not modular and there is next to no ecosystem around that (bar the one high profile vendor). As I'm currently waiting on a replacement motherboard for a laptop, I'd be more than happy to see an industry move towards standardised replacement subsystems, particularly when the refresh cycle currently means that fixing a year old computer is stonewalled by a complete lack of spare parts ("but we can sell you this new model!").
I don't think most people haven't read the article. I think most of the comments are well justified.
First of all, Intel is not just talking laptops, they are also talking desktops. This is clear in the article. Claiming desktops with replaceable parts as an innovative idea is ridiculous. Pointing and laughing at that is a reasonable reaction.
Secondly, the existence of Framework is not something that should simply be ignored. They have been doing modular laptops for a while, and they are niche but not unknown. Intel's claim that modular laptops are some kind of groundbreaking innovation is weak at best. Commenters are right to point that out.
Also, the fact that Intel is utterly ignoring Framework, apparently pretending they don't even exist, is a bit of a problem. If they go ahead like that, we're going to get two distinct and incompatible ecosystems for modular laptops. That's worthy of discussion.
Desktop, tablet, phone. Laptops are a "one size fits nothing" because of the compromises.
Desktops are pretty much infinitely expandable, and much better, bigger displays, keyboards, and mice.
Tablets are lighter, have better battery life, and are cheaper. And they can also use cheap Bluetooth mice and keyboards. And if you spill something on the keyboard, just toss it.
Phones for the ultimate in portability, and as a hotspot for a tablet.
Laptops need to go away.
Except that laptops can run the more capable software that also runs on desktops, and they can be used with the same peripherals as desktops can. Tablets are usually encumbered with mobile operating systems and don't transition well or at all to larger peripherals. A tablet connected to a Bluetooth keyboard is not as useful as a laptop, but it is similarly sized and comes with a bonus that you have multiple batteries, either of which can cause difficulty if it runs flat.
>> Except that laptops can run the more capable software that also runs on desktops
Both my desktops have 128GB of RAM. Try to find a laptop that can do that.
One has 6 4TB drives, the other has 5, plus a 2TB NVMe. Try to find a laptop with that.
The main desktop has 2 x 16GB video cards connected to 6 x 4096x2160@60Hz big screens (2x 65", 4x 50"), and runs flight simulator at 8192x2250. Try to find a laptop that can do that.
It's great for multitasking - every window can have its own screen. Try doing that with a laptop.
Email and web browsing are handled by a pair of tablets. Serious work (and serious play) is where the desktops come in, along with proper ergonomic seating and full-size keyboards for 2 users on one or both computers at the same time via KVM switches.
Wannabes with their laptops in coffee shops wouldn't know what to do with that much compute.
And back on topic: I built those desktops (2022 and 2024) to last between 15 and 20 years. Maybe update the hard drives, maybe a pair of 32GB video cards in 5 years... because unlike laptops, they're modular.
My laptop has 32 GB of RAM. Try to find a tablet that can do that.
I can have multiple drives installed in something with a 13-inch screen. Find a tablet that can do that.
"It's great for multitasking - every window can have its own screen. Try doing that with a laptop.": Yes, my laptop actually does have ports through which I can connect it to multiple screens. It's got a GPU that can drive those screens.
Are you recognizing why laptops and tablets are very different, with the laptops often being better?
Incidentally, 128 GB of RAM or multiple high-end GPUs aren't impossible to put into a laptop. They're just expensive, but your desktop probably wasn't too cheap either. The laptops that can do that are intentionally modular. They are not what I use because I don't need that much graphics on the go. What I do need there is the ability to connect to many types of peripherals, run programs intended to run on desktop operating systems, and run virtual machines, all with the ability to pick it up and move to a different place. A laptop does it. A tablet doesn't.
This doesn't make desktops or tablets bad. A tablet might not do what I need, but there are plenty of cases where a big phone with all the software limitations of that phone is just fine for a user. Just as a tablet is the right device for something you do, a laptop is the right device for things I do and I'm far from unique in that.
I wonder if this is more to do with de-integrating the chipsets; great big slabs of silicon or multi-die packages that have all the functions a system board offers can't be cheap to manufacture, and yields drop the more complex a design gets.
It also makes me wonder if Intel might be preparing to offload or licence a bunch of IP without giving away the farm, so they can concentrate on the core (semi-unintended pun) business of CPU/GPU and, maybe, programmable logic?
For all the crowing about reduce-reuse-recycle, pretty much everything still goes into a WEEE skip, often while it's still fully functional and usable. Repairability won't really have a big impact on the waste volume, because things end up as 'waste' for other reasons.
If something is considered worth repairing it's usually still new enough to have the OEM spares (or it's worth finding/scavenging some) and another new standardised modular system won't change anything. Also bearing in mind that some things aren't repairable because the complexity & labour of *any* repair would be more than the cost of a whole new one.
I might hate the disposal of stuff that could be useful to someone but I don't really see this idea having any real impact on it.
Tons and tons of perfectly working power supplies, power cables, fans, and I/O boards get sent to the scrap because of server updates.
Why is it not possible to open a 1U HP ProLiant G8 and replace a few boards to upgrade it to a ProLiant G10?
I hope such ideas like Intel published will become mainstream one day, current practices are unnecessarily wasteful.
OK, every part has been replaced at least twice, but only one part at a time, so it's always been the same PC. PCs are modular: applications, OS, case, PSU, storage, monitor, keyboard/mouse, GPU and motherboard/CPU/memory.
The last three don't always, but normally do, get upgraded together: whilst I have upgraded just the CPU, it is seldom cost-effective. Similarly, I have only once or twice added extra RAM. Normally, memory technology advances mean it's also time to change the motherboard and CPU.
My last upgrade, almost a year ago, was the GPU, and it will get a CPU/motherboard/memory upgrade by the end of summer. The case is the oldest bit: it's my second since 2001 and I see no need to change it anytime soon.
If this doesn't demonstrate that PCs are modular, I don't know what would.
In our family, computer cases have names, because the outsides remain the same while the insides change. I recently replaced the guts of Marmot, a rather nice Lian-Li aluminum case - the only visible difference is a USB-C jack on one of the 5 1/4" panels.
So I've always loved the idea of modular Lego-brick-style computers as a concept. Slab for compute, slab for storage, slab for specialised I/O, slab for graphics acceleration; click-click-clunk together a computer you've built.
Unfortunately, everyone who tries to figure this out runs into the old problem that the interconnects need to be fast, and that gets difficult, compounded by some bright spark figuring out a faster interconnect that is more sensitive, so it doesn't work well with something that can be easily plugged and unplugged. There's one I saw being advocated by some tech YouTubers again recently; I can't recall who. But it looked like it was going to hit that problem darn quickly again.
However, some incentive to force computer manufacturers to have less on one board, with more daughterboards that can be replaced/repaired, would be nice.
I read Dell are putting USB-C connectors on removable boards in the latest gen of laptops. Oh my, won't that be good. The number of perfectly good laptop motherboards that have to be replaced for want of a broken USB-C charging port is silly. Port borked? Just pop the case and a couple of screws to fit a nice little daughterboard. Sweet.
However, that does remind me of the hell I used to have as a field engineer working on the Toshiba laptops that a large UK government department had a large fleet of in the early 2000s. Very repairable, and often failing. Lots of daughterboards, and so much that could be component-replaced. But oh my, so many screws of different types, and a special order of parts tear-down. It used to take ages, and you'd better hope nobody knocked the table you were working on so that some of the screws got lost or the careful layout of what went where was disrupted. Fun times with little screwdrivers.
> remember the DEC Rainbow?
I don't see the Apple ][ mentioned. For most of us that was THE introduction to slot cards, because they got to be common before the IBM 5150, and because the lid just popped off: no screws (heavy Velcro(tm)). I know IBM did not copy the Apple slot connector, but it was the next best thing to it. https://apple-history.com/images/models/aII_open.jpg
If they gave a shit they wouldn't have caved to Microsoft when they stopped supporting Windows 7 with their newer chips. And as mentioned, have words with Microsoft about the forcing of TPM, which is going to create a massive amount of e-waste. Because although people could replace Windows with Linux Mint, the average user isn't going to know how to do that.
"with the estimated annual economic monetary cost of e-waste reaching $37 billion"
On a trade of multiple trillions; or about the same as the British government spaffed all over Dido Harding.
The simple reason it's not given high priority is that there's not enough money at stake (and paradoxically, "mining e-waste" frequently has lower concentrations of desirable commodities than raw ore).
Reducing e-waste? Yes!
Extending lifetimes of equipment purchases? Yes! Yes!
But what is needed is better repairability.
How much stuff gets discarded because the capacitors or micro-switches can't be replaced?
What Intel should do is stand up a custom repair service that can fix the things a normal human with a soldering iron can't.
I have a ThinkPad T43 and a T60 that I keep because the screen, keyboard and case are still viable. But putting modern PC components around that is very involved with today's tech.
Serious, guys? "Innovative"? Kaypro Computers was using completely modular backplane designs 40 years ago!! CPU, I/O, graphics and memory were all on separate daughter cards, allowing you to upgrade seamlessly as much as you wanted. Granted, backplane technology has gotten a lot better since then, but it's not new!!