
"We'll know more about Intel's future when it gets a new CEO."
I think I can see Intel's future already.
Slowly circling the drain...
In the dystopian world of Blade Runner, the killing of rogue superhuman replicants wasn't called that. Rather, they were "retired." Pat Gelsinger is far more human than many of the current crop of cyborg-lite tech execs, and the Intel board is no Harrison Ford, but they "retired" him just the same. When a CEO leaves their …
«I think I can see Intel's future already.
Slowly circling the drain...»
Angular momentum being what it is, the drain-circling will speed up, getting faster and faster until that final gargling sound as the last drop goes down the gurgler.
And at that point, we are going to have some problems. Sure, right now it doesn't seem like we're starved for good processors. AMD makes some pretty nice ones, and the ones from Intel are, while more power-hungry, still getting us some good benchmark figures at prices we can stand. But without meaningful competition, we have a lot to lose. The possibility that's often described is that ARM competes with AMD for desktop, RISC-V competes with ARM for mobile, and nobody needs Intel to be competitive. Maybe that will happen. I think it is more likely that we won't switch away from AMD64 fast enough for ARM and AMD to be meaningful competitors in many areas. I also don't have the same hopes for RISC-V's meteoric ascendancy that some believe is around the corner. I think Intel's suffering will be ours too some day.
I've seen things you people wouldn't believe
Microcode bugs that will brick your CPU
I've seen CPU designs nanometers behind the competition
All things flogged by private equity, like tears in the rain
Time to end
That said, mild objection on the GPUs: it looks like, if they actually stay the course there (ha! yeah, right), they might have something on their hands. Gamers Nexus only just reported that Intel's latest GPU is, bang for buck - and with vastly improved drivers - the best at this moment. Even if that's just because of unadulterated greed on the part of AMD and especially nVidia.
mild objection on the GPUs
The current GPU effort is a fresh reboot that started from scratch in recent years. I suspect the point is that if they had kept the original effort going, they'd be further ahead now.
That said, I saw the GN presentation yesterday and will watch the upcoming benchmarks with interest. Intel's memory configuration supposedly helps against the 3060 (and 3080?) in the new Indiana Jones game. It is nice to finally see some competition in the "low" budget tier again.
> they'd be further ahead now
I don't think so, not from using the old designs. It always struck me as CPU designers saying "well, they told us to make some GPUs. How hard can it be?"
I'm glad they didn't follow the sunk cost fallacy for once and canned it, and are starting fresh with (apparently) actual GPU designers on staff.
We'll see if they make progress or can all the GPU designers in an investor-mandated layoff.
The original Intel integrated graphics were, literally, there because we had some unused real-estate on the CPU support chipset and our designers figured they could put a functional display engine into it.
I have happy memories of, year after year, meeting with a good friend at NVidia and having the same conversation…
Him: “so, how’s the new integrated graphics?”
Me (misplaced loyalty and even more misplaced faith in Intel’s marketing spin): “pretty good actually, competitive with nvidia and ati at the mid-range”
Him: (struggles to keep a polite straight face)
Me: “let’s have another beer and change the subject, eh?”
I lost the will to keep flogging that particular dead horse many years ago.
Anon cos I want to keep my Reg ID and my career history separate.
For a lot of manufacturers and users, if the integrated graphics can handle the basics, then that's one computer you don't have to make more expensive by putting an NVIDIA or AMD GPU in. You're probably correct about gamers, graphic designers, and all the other people who have dedicated GPUs. Still, I remember a lot more laptops coming with dedicated NVIDIA graphics chipsets in the 2000s than I see now, because the basic GPUs are sufficient. I don't know how much of the market that was for NVIDIA. Judging from the demand for their processors for LLM training and cryptocurrency mining, they're fine. If they didn't have that, would they care more?
And the Intel compiler ensured that they were great at codecs, which is what most people need. Sadly, and I've no idea quite why, my Intel Mac doesn't seem to be able to use hardware acceleration for video calls on Google, Zoom or Teams, which means the fan spins up pretty quickly and, if I'm not lucky, the network interface being used dies after about twenty minutes, especially on Teams. Still, this gives me a valid reason not to use the camera.
We always purchased desktops with add-on cards even though the machines came with integrated Intel graphics. Intel graphics always seemed to have trouble getting the correct timings for external monitors; never a problem with ATI/AMD or NVIDIA. There were other problems when we used laptops with docking stations and external monitors. Only recently has the integrated video seemed to work properly. Still, with the laptops I have available, there's no comparison to add-on video cards. Business computers may not play games, but decent performance is needed with today's higher-resolution monitors and the prevalence of video conferencing.
Cost efficiency seems to be AMD's edge, rather than performance. I think AMD will be stuck playing second fiddle to nVidia in the consumer GPU market for some time, but if AMD can get more people using ROCm, or help them port CUDA code to it, then there's a real risk of them eating nVidia's datacentre market. AMD's industrial cards are a lot cheaper than nVidia's, but nVidia has spent over a decade cultivating the software ecosystem for GPU computing.
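For anyone who hasn't looked at ROCm: the porting story is less scary than it sounds, because AMD's HIP runtime deliberately mirrors the CUDA API almost call-for-call. Here's a minimal sketch (assuming a box with the HIP toolchain and hipcc installed; kernel and variable names are just illustrative) of a CUDA-style SAXPY that builds essentially unchanged:

    #include <hip/hip_runtime.h>   // the HIP analogue of cuda_runtime.h
    #include <vector>
    #include <cstdio>

    // Same kernel source you'd write for CUDA: one thread per element.
    __global__ void saxpy(int n, float a, const float* x, float* y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        std::vector<float> hx(n, 1.0f), hy(n, 2.0f);

        // hipMalloc/hipMemcpy shadow cudaMalloc/cudaMemcpy one-for-one.
        float *dx = nullptr, *dy = nullptr;
        hipMalloc((void**)&dx, n * sizeof(float));
        hipMalloc((void**)&dy, n * sizeof(float));
        hipMemcpy(dx, hx.data(), n * sizeof(float), hipMemcpyHostToDevice);
        hipMemcpy(dy, hy.data(), n * sizeof(float), hipMemcpyHostToDevice);

        // Even the triple-chevron launch syntax works under hipcc.
        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

        hipMemcpy(hy.data(), dy, n * sizeof(float), hipMemcpyDeviceToHost);
        printf("y[0] = %f\n", hy[0]);   // expect 4.0

        hipFree(dx);
        hipFree(dy);
        return 0;
    }

The API surface is close enough that AMD ships hipify tools to translate existing CUDA sources mechanically; the harder part, as others note further down, is which cards actually get official ROCm support and for how long.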
nVidia's also in a weird personnel position in that so many of its staff were given company shares in years gone by that the place is full of millionaires now.
"nVidia's also in a weird personnel position in that so many of its staff were given company shares in years gone by that the place is full of millionaires now."
If they haven't cashed in, those millions are hypothetical until they do, so they need to keep the share price up.
> nVidia's also in a weird personnel position in that so many of its staff were given company shares in years gone by that the place is full of millionaires now.
It might be 'weird', but it's not unique.
Several '70s/'80s startups that made it big have been through the same thing. Microsoft is a prime example: there was a doco in the late '90s, I think it was, featuring a woman in her 50s who was just boxing up Windows CDs/manuals on a production line and who was a multi-millionaire, because she'd been with Microsoft for nearly 20 years and her share options were worth millions. That's a fairly menial job to have gained so many share options from; imagine what high-level staff would have.
These are the 'exception' stories that make people work for fuck-all at startups, hoping theirs is going to be one of those exceptions. Like where people say "Mark Zuckerberg dropped out of college and became a billionaire, so I'm going to drop out too". Sure, but the Zuckerbergs are one-in-ten-million dropouts; most of the rest are still waiting tables (obviously some did well, if not quite hugely rich, but again the ones who did really well are the one-in-a-hundred-thousand dropouts).
Yeah, sure would be great if there were top-of-the-line AMD consumer cards with good ROCm support that could serve as a sales pitch for their datacenter offerings. Not, uh, one card that wasn't actually supported until one year in and that didn't stop crashing the system until two.
But I'm sure they'll come up with something better any day - what's that you say? They're leaving the top segment to NVidia entirely? Committed to no releases in 2025? Well, okay then! I guess they don't have to win if they don't want to.
Nvidia had at least two Pentium 4/Bulldozer moments, in the mid-2000s and the mid-to-late 2010s. In both cases, they were helped by the fact that ATI/AMD drivers were awful, and in the latter case by CUDA. Nowadays, their grip on the datacentre industry with CUDA would help them weather the fallout for a good long while.
I think past experience* shows a moment of correction will occur at some point in both tech and stock.
I am unconvinced that there is really a paying market for the AI cloudy bollox.
[*I am duty bound to state past experience is - of course - not a guide to future performance]
It's not the 1990s anymore.
I really think there's a notion that just because a company had massive success several decades ago they will always somehow continue to do so.
I doubt some of these big names will disappear for a long, long time yet. But to expect them to be dominant players in their sector forever is naive. This is the same across most (all?) sectors.
Things change, and often those who can adapt rapidly and easily are not the massive organisations who you might expect.
As the article says, Gelsinger was a teenager when he joined Intel. Sixty-something with a huge pot of money is a great position from which to retire. What happens next is up to the next generation. Nobody can go on forever, so why would he be seen as the natural choice for even the next decade, never mind two or more?
Aren't Dell doing well? Admittedly they are a financial engineering firm that does a bit of corporate IT on the side but who is going to replace them?
Hand me an HP corporate laptop and I'm handing it back without lube. Lenovo is about to get a million% tariff, and corporate aren't about to hit the local PC store with a copy of Computer Shopper.
Anyone who makes laptops/tech etc is going to have torrid issues - HP manufacture in Mexico, India, China and USA. Unsure what the mix is.
Dell are in China, Vietnam, Taiwan, Mexico, Brazil, Ireland, Malaysia, India, Poland, and Thailand so just as exposed, again unsure of the mix and how quickly they can pivot.
It’s gonna be pot luck with the lunatic Orange Jesus inbound.
Lenovo sponsors a number of sports, including:
FIFA
Lenovo is the official technology partner for the 2026 FIFA World Cup and the 2027 FIFA Women's World Cup. Lenovo will provide technology solutions to enhance the fan experience and broadcasts, including AI, devices, and data center infrastructure.
Formula 1
Lenovo is a global partner of Formula 1, providing technology devices, solutions, and services to support the delivery of Grands Prix. Lenovo is the title sponsor for two races per season, and Lenovo and Motorola have increased trackside branding at events. Lenovo also provides high-quality equipment to F1 staff, including laptops, workstations, desktop computers, monitors, tablets, and Motorola smartphones.
MotoGP
Lenovo is a co-sponsor of the Ducati Lenovo team in MotoGP. In 2020, Lenovo offered fans the chance to redesign the Ducati bike's sponsor logo through an online contest.
Carolina Hurricanes and NC State
Lenovo is the naming rights partner for the home arena of the NHL's Carolina Hurricanes and NC State.
Too true: sack the engineers and look for ways to manipulate the market to reduce competition. Intel's tick-tock approach was okay, but agreements with manufacturers meant they were able to take their eyes off the competition. And, sadly, as with Boeing et al., investors are all in favour of jam today.
Can't shed a single tear for Intel. For decades, *decades*, they've been nothing but lazy and incompetent, even actively rejecting help. They are anti-innovation; progress is an allergen to them. If it weren't for AMD and ARM, we'd all be using nitrogen-cooled 9GHz single-core 32-bit CPUs with unholy memory extension hacks. Forget smartphones, those wouldn't even exist. Not without coming with a free tube of Cinco-Fone Cooling Gel for your poor, melting face. No wonder they dominated with Microsoft for so many years; they really are the hardware to Microsoft's incompetent software.
Hope this company burns (besides whatever parts of their lab their CPUs have already burned)
"...with unholy memory extension hacks." When I rolled off a Motorola 68000 assembler project to my first 8086 asm project, I remember explaining the 8086 to someone: "What if your car had instead of a single 20-gallon gas tank, twenty 1-gallon tanks and you had to manually manage fuel flow into/out of each gallon tank while driving?"
Reading the 8086 asm manual that fateful evening...I was seriously questioning my continued efforts towards a CompSci degree.
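For younger readers, a minimal sketch of why that "twenty 1-gallon tanks" analogy fits (plain illustrative C++, the helper name is just made up): a real-mode 8086 address is built from a 16-bit segment and a 16-bit offset, so any single pointer only sees a 64 KiB window of the 1 MiB address space, and you shuffle segment registers to move the window.

    #include <cstdio>
    #include <cstdint>

    // Real-mode 8086 addressing: physical address = segment * 16 + offset,
    // giving a 20-bit (1 MiB) address space from two 16-bit values.
    static uint32_t physical(uint16_t segment, uint16_t offset) {
        return (uint32_t(segment) << 4) + offset;
    }

    int main() {
        // Many segment:offset pairs alias the same physical byte...
        printf("1234:0010 -> %05X\n", physical(0x1234, 0x0010));  // 12350
        printf("1235:0000 -> %05X\n", physical(0x1235, 0x0000));  // 12350
        // ...and no single offset reaches past 64 KiB without reloading a
        // segment register - the "manually managing fuel flow" part.
        printf("0000:FFFF -> %05X\n", physical(0x0000, 0xFFFF));  // 0FFFF
        return 0;
    }

DOS-era compilers papered over this with near/far/huge pointers and tiny/small/large memory models, which is what the replies below about Phar Lap and DJGPP finally getting a flat 32-bit address space are celebrating the end of.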
Large programs/data requirements in C on the 80386/486 weren't any better unless you were able to use a DOS extender like Phar Lap (expensive). It ran programs in protected mode, allowing for "real" 32-bit (no segment register crap) programming. DJGPP, a free variant of GCC with its own protected-mode runtime, was a godsend and I used it on many professional and private projects. Thank you, DJ Delorie, for making my programming life so much easier way back then.
I recall reading, in the documentation with Eric Isaacson's A86 shareware assembler, the proceedings of an Intel meeting discussing the 8086 programming model - the whole class/group/segment, far/near shemozzle and the weird scenarios where various groups could be overlapped to permit "near" access to code or data. My thought was: with clowns like these, who needs a circus?
I vaguely recall A86 supported a simplified model, which echoed Eric's point that Intel's classes were really groups, their groups were really segments, and their segments were really pointless.
If the Amiga hadn't been so much more expensive than PC clones in my part of the world, I would have opted for the 68000 option - the Atari ST wasn't readily available.
Random thought...
Maybe if Intel hadn't made Itanium and driven the Alpha processors out of business, thus driving many(?) Alpha engineers to AMD, they wouldn't have come up with the AMD64 instructions? There seem to be a lot of similar technologies in the AMD design that came from Alpha. I'm not an expert, maybe just coincidence....
An ex-DEC colleague made this same observation about DEC->AMD migration of engineers recently, so I think you're right.
However I see the end of Alpha as HP's doing, when they acquired Compaq (which had in turn acquired DEC) and came up with the Itanium wheeze which was supposed to be the next iteration on HP PA-RISC (in my simplified mental model).
"#1. A body remains at rest, or in motion at a constant speed in a straight line..." unless acted in by an external force.*
To be fair, Sir Isaac had to invent a force to explain why moons, planets, etc. keep circling each other. And a dodgy story about an apple.
* Corpus omne perseverare in statu suo quiescendi vel movendi uniformiter in directum, nisi quatenus a viribus impressis cogitur statum illum mutare.
Investors?
I resent that term.
An investor is someone who believes in the company, who sticks with it because he believes that the company has a vision, an idea that is useful and worthy.
Today's "investors" are nothing but but money-grubbing Scrooges who bitch as soon as they don't get the returns they think they deserve.
Investor my ass. Go make your own company and show the world just how incapable you are.