replaced ... by ... CFO
It's the end when the bean-counters are in charge, right?
#bofh
Intel has confirmed the sudden departure of chief executive Pat Gelsinger, in a move intended to restore investor confidence in the ailing Silicon Valley giant following a year of turmoil. Gelsinger is retiring from the Santa Clara biz effective December 1 after more than 40 years in the industry and will be replaced in the …
But both Gelsinger and the rest of the C-Suite will have nice fat bank balances, bonuses & share options.
What happens further down the food chain is not really of interest to them. Intel have lost their way, for whatever reason, and should have had a good clear-out of the board about 5 years ago.
What happens further down the food chain is not really of interest to them.
100%. Consider anyone in a position like this and of a similar age. Plenty of money to have a very, very comfortable retirement. It's up to the next generation to decide what happens next.
On the one hand it makes me a bit jealous but on the other I'd do exactly the same thing if I was in his shoes.
I do often wonder whether people would give a shit about their "principles" if they were offered 100 million to not have any.
I do often wonder whether people would give a shit about their "principles" if they were offered 100 million to not have any
I'll bite: I think I'd make a counter-offer at 10 million and keep my principles. 10 million is more than I could reasonably spend; I'd have to think hard about how to waste that amount of money.
Crap! It looks like we're going to keep doubling down on the strategy of firing experienced engineers, then "replacing" them with RCGs (recent college grads) and wondering why we can't build world-leading products anymore. And also looking for cheaper offshore labor, but we still won't pay competitive wages in the "low-cost geos", so all we are is a tech training org that can't retain top talent in those areas either. (Damn right, I'm posting this anonymously.)
Mwaah, don't know. They're bleeding so much cash and earning so little that they had to do something. If they sell off the foundry business they're doomed in the long run, but if they don't they're doomed in the short term.
Their stock price is so low they're not contemplating another stock issue, it seems.
Time is simply running out for Intel.
I'm not sure about you guys, but I stopped buying their CPUs when they added the Intel Management Engine with no option to turn it off. I'm not saying that's what pulled them down, but it played a (small?) part.
Pat Gelsinger failed to capture the influx of venture-capital money needed to make his vision of Intel happen over the long term (see Intel's stock market value evaporating; he couldn't pull a Tesla as Musk did).
Most of these investors and big customers are now betting on Nvidia solutions, the Nvidia ecosphere and Nvidia's vision of the future. (There is a reason for the Nvidia rush and overhype... there is none at Intel.)
The money now flowing towards Nvidia is bigger than Intel and AMD combined. (AMD just gets crumbs of the Nvidia pie, because Nvidia cannot deliver as many chips as the market wants.)
And the statement released "by" Gelsinger should actually be called the statement prepared by the corporate communications department in the name of Gelsinger, who was kicked out as the scapegoat for this quarter's bad results. Ever notice how every departure from a C-suite job is accompanied by almost exactly the same statements from the kickee and the board?
To be honest, I'd been waiting for this since the May/June time... Back then it seemed inevitable, and I was surprised he managed to hang on as long as he did; many other CEOs have been ousted much more quickly for lesser reasons. It is a shame, though: I think his vision was probably the only way Intel could survive in a meaningful way...
Asked neutrally: what's wrong with the shift to ARM?
To somebody on the software side of the industry, it looks like ARM is competitive for performance while being better for battery life and comes with the huge advantage of being customisable by the customer because of its different IP arrangement. Naive as that may be.
"less standardized, and thus more open"
There's at least one xkcd about that, and probably others equally applicable.
Less standardization sometimes ends up meaning "harder to use" (or implement); for ARM to date it has resulted in a twisty maze of ARM boards, all different. Or at least, different enough that system developers and integrators essentially have to treat each one as a separate model type.
Which is why, as others upthread have noted, you usually can't install Linux et al onto an ARM system like you can x86; instead you typically end up having to image someone else's pre-built and -configured OS onto your boot media, and hope it's fit for purpose. Or spend the time (and engineering resources, if you operate at scale) curating and maintaining your own OS images.
Similar complaints from system case builders, peripherals manufacturers, and so on.
I have a Core i5 7th Gen laptop and a Ryzen 1700 PC and an Atom based NAS, sitting in a cupboard, gathering dust. Everything else is ARM, from a few Raspi servers for PiHole, to smartphones, tablets and a Mac mini M1 (same performance as the Ryzen in Luminar Neo, but uses a fraction of the electricity to do that same work). The laptop gets used about once a month by my wife (although she is replacing it with an iPad Air M2) and I've turned the Ryzen PC on twice this year.
Because AMD makes x86-64 processors, if this accelerates the swing to ARM they are also stuck making legacy chips for legacy peripherals and software that won't run on ARM.
They'd need to pivot to ARM as well...
For the consumer, all good, whichever way, for AMD and its employees, not so good.
I think that's a serious consideration for the market's future. I've got a high-spec i7 11th-gen Win 11 laptop, and more recently acquired a second-hand base-spec M1 MacBook Air. The MacBook runs rings around the Intel device performance-wise, and is completely silent and lasts potentially days/weeks between charging. It's a no-brainer: I won't ever buy an x86-based laptop again, Intel or AMD. Same probably goes for desktop machines, TBH.
Despite their money, size, domination and being the designer of the PC CPU, Intel never managed to be a genuine innovator, like AMD or ARM, or even Motorola going far enough back. Every iteration was a bolt-on to the existing design, maintaining compatibility with software that hardly anybody cared about, or should have cared about.
It's sad to see a key figure in the industry ejected as an offering to the investor gods; he did a lot of good, even if he didn't ultimately take the correct decision about 25 years ago. I await his autobiography, it will be a fascinating read.
Agree. They were kind of a victim of the x86 instruction set and architecture. They actually did an amazing job to take it as far as they have. But the idea that a modern 2024 processor absolutely must be able to run software written in 1979 is kind of stupid. That straitjacket wasn't really of Intel's making, though: it was market forces that dictated it. But now they seem to be a victim of it. Companies such as ARM and RISC-V have had the benefit of a clean sheet; Intel didn't get that luxury. They did create other processors over the years that were not x86-based, but none of them gained traction.
As a desktop user, I am thrilled that the Intel chips of today can still run old programs, stuff written before "Hello World" was bloated into gigabytes of crud by "modern" development practices. I am pissed that Windows 10+ cannot run old code that Windows 7 could run. I do understand how most servers, with a narrow task, can be limited to recently-compiled binaries, but the desktop track benefits from all the backwards compatibility.
One could also argue that Intel could have innovated in its own garden by switching to a RISC architecture (Arm, for example) and having it emulate x86 instructions for older software. That's what Apple did with the M-series processors that beat the s*** out of x86: what prevented Intel from experimenting down that road? Comfort and monopoly.
So no pity here
And they could have called it "Itanium" and "IA-64"
Did they run native x86 software? Think not: emulation to run existing x86 applications and operating systems was particularly poor.
Intel did.
Most of the original x86 instructions are now run as microcode on a RISC-based core in Intel x86 processors, and have done since the early Pentium Pro processors.
True, this is sort-of hardware emulation, not any sort of software emulation (for that, look at the Transmeta Crusoe). Even early 8086 processors had an element of microcoded instructions, but most were hardwired into the processor.
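To illustrate the idea, here's a toy sketch in C (NOT Intel's actual decoder or micro-op format, which are proprietary; just the general concept): a memory-destination x86 instruction like add [mem], reg can't execute as a single step on a load/store core, so the front end cracks it into simpler RISC-like micro-ops.

#include <stdio.h>

/* Toy model of CISC-to-micro-op "cracking". Purely illustrative. */
typedef enum { UOP_LOAD, UOP_ADD, UOP_STORE } uop_kind;
typedef struct { uop_kind kind; const char *desc; } uop;

/* add [mem], reg  ->  three RISC-like steps on a load/store core */
static const uop add_mem_reg[] = {
    { UOP_LOAD,  "tmp   <- load [mem]" },
    { UOP_ADD,   "tmp   <- tmp + reg"  },
    { UOP_STORE, "[mem] <- store tmp"  },
};

int main(void) {
    for (unsigned i = 0; i < sizeof add_mem_reg / sizeof add_mem_reg[0]; i++)
        printf("uop %u: %s\n", i, add_mem_reg[i].desc);
    return 0;
}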
The interesting thing about the M1 and later Apple processors is actually the on-package unified memory (LPDDR mounted on the package, not HBM) that makes a huge difference in performance, not the instruction set.
I wonder whether anybody has done a T-state analysis of something like an i5 or i7 and compared it to an Apple M processor, to see whether the latter is actually faster at running a single instruction stream, or whether the difference is in multi-tasking and running several instruction streams in parallel. I must do some reading.
You can write macOS code, compile it for both ARM and x86, and run both on an Apple silicon Mac. The Intel version is automatically translated to ARM code by Rosetta 2.
The result typically runs at around 80% of native ARM speed, so that's roughly the penalty: 20% slower after translation.
Intel processors are just not very fast. If you look at performance per Watt they don't come close to Apple's ARM implementation. There are people who use Apple Silicon because it runs Intel code at the same speed for much less power.
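You can try this yourself with a trivial universal (fat) binary. A minimal sketch, assuming Apple's clang toolchain on an Apple silicon Mac with Rosetta 2 installed; build and run commands are in the comments:

/* hello.c -- build a fat binary and run each slice:
 *   clang -arch arm64 -arch x86_64 -O2 -o hello hello.c
 *   ./hello               # native arm64 slice
 *   arch -x86_64 ./hello  # x86_64 slice, translated by Rosetta 2
 */
#include <stdio.h>

int main(void) {
#if defined(__arm64__) || defined(__aarch64__)
    puts("running the arm64 slice natively");
#elif defined(__x86_64__)
    puts("running the x86_64 slice (under Rosetta 2 on Apple silicon)");
#else
    puts("some other architecture");
#endif
    return 0;
}

Time the same workload under both slices on the same machine and you measure the translation penalty directly.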
> Intel processors are just not very fast.
And that's a load of hooey. My ultrabook's Intel chip is faster (higher clock speed by half a GHz) than my friend's M3 Max. It has more cores, too. The only thing it doesn't compete on is power consumption, but oh well. Once Linux is fully ported to the ARM notebooks Apple makes, they might even be usable!
Sweeping generalizations generally don't add anything to the argument.
After IBM had their ass handed to them over Micro-Channel, very few in the industry had the guts to produce new technology that was not compatible with previous hardware/software.
This went on for years until ARM and Apple finally were brave enough to say "we can't keep doing this!" The biggest blame lies with Microsoft and their entrenched corporate customers!
Recent events have shown this cannot go on much longer! If Intel/AMD are going to survive they have to bring a new generation of processors to market that are not compatible with the x86 architecture!
"After IBM had their ass handed to them over Micro-Channel"
You mean after IBM attempted to corner the market with a proprietary and undocumented interfacing standard and discovered they were no longer the 9,000-pound gorilla in that market.
ISA took off BECAUSE it was open. MCA died because it wasn't
PCI didn't have the same kind of problem
His predecessor (Swan) had a tough ride; it wasn't his fault they had to keep reheating Skylake (lake, lake, lake, lake). Similarly, though Alder Lake isn't attributable to Gelsinger, under him they at least seemed to be on the right track. They have competitive core products (but have to use TSMC to get them). These are big changes to the model that, I'd argue, made Intel's past success.
I have to wonder if anyone other than someone like Gelsinger could have pushed them to that, or if we're now back to pumping the stock with buybacks rather than expensive fabs and development, like ten years ago, when they seemingly spent the money on buybacks instead of the investment they'd otherwise be inheriting now.
Not a good sign for Intel.
But at least they get an $8bn performance bonus from Uncle Sam (yeah, I'm sure they're 'not allowed' to buy their own stock, or sell for magic beans, like anyone will be around to answer for that). I wait to see if they now double down on financial rather than product engineering. I don't know that I'm hopeful; the only reason I can think of to kick Gelsinger out is that they didn't want to stay that course.
I reject the notion that each iteration of x86 being a bolt-on to an existing design rules out any possible innovation; that view necessarily ignores Intel's contributions to the various strides in superscalar, speculative and out-of-order execution over the decades.
You're also ignoring Intel's often-commanding lead in process and its frequent attempts to branch out — the i960 was fairly decent and reasonably popular, and the Itanium was at least a big swing. A big swing and a big miss. But they tried.
In the late 1970s, I toured HP's Corvallis, Oregon, USA facility. Among other things, they'd made a to-scale, twelve-foot-tall version of their HP-32E calculator. The giant version was fully functional, AND, had that great clicky-feeling tactile feedback when you pressed the keys.
Their HP 2000 and HP 3000 minicomputers will live on in SIMH.
"Intel never managed to be a genuine innovator, like AmD or ARM, or even Motorola going far enough back."
Really? The Pentium Pro was sufficiently innovative to make "x86" competitive with RISC (and wipe out most of the RISCs over the next decade). Itanium was also innovative, just wrong. In response, I'll credit AMD with having the guts to step up to an empty plate and deliver AMD64.
But ARM and Motorola? Haven't you got to go back to the 1980s to see their innovations?
Mind you, I'll happily concede that my examples are a quarter of a century old. Where's the innovation in modern CPU design? Should I be ignoring CPUs entirely and looking for innovation in GPU design? But if I did, would I even find anything there that isn't 20 years old?
The reason the x86 survived was because it was faster than its competitors and it was compatible.
The 286 beat the 432, the Pentium 3 and 4 beat the Itanium. I don't know about the i960, it had great floating point.
The same goes for 486 vs 68040. The DEC Alpha chips were slightly faster than Pentiums, but very expensive.
It wasn't "faster" than its competitors, it had a money advantage that allowed Intel to invest a lot more into developing it and more importantly to gain an advantage in fab technology. A single generation lead in fab technology back then meant a CPU that was twice as fast.
The competition wasn't in the PC market where the volume was, they were in the workstation/server market so their better architectures were pushed aside one by one as x86 muscled up from below.
That's exactly the same thing that's happened to Intel now with ARM, which isn't "faster" than x86 but Apple alone sells more iPhones than the PC market sells PCs, and at a higher average revenue per unit to boot, and that isn't even close to the dominant smartphone platform unit-wise. Intel can't compete with the flood of money being invested by Apple, Qualcomm, Samsung, Google, Huawei, Mediatek and other smartphone industry giants.
If they had a visionary as CEO he would have seen where things were going and made sure Intel was part of that future, even if it meant abandoning x86 to do it. Instead they had caretaker CEOs who didn't want to upset the bureaucracy that had built up underneath - and its #1 rule which was "x86 is the solution to every problem". So they tried to foist x86 cores onto the smartphone market, which utterly failed.
If you have a process advantage, it's silly not to use it. But in the early 1990s it seemed like x86 could not be scaled up any more and it would lose to those simple, scalable RISC architectures, process generations be damned. The P6 microarchitecture was pooh-poohed from the start, but its frequency scaling from 150 MHz to 1.4 GHz in six years won the 1990s RISC vs CISC wars for Intel when it came to hitting the same ballpark as the best RISC could offer in peak performance (power efficiency is another matter, of course).
Sure, the Alpha might have scaled farther in frequency if the development had continued, but looking back it would have surely hit a similar power wall as the Pentium 4. And as the process generations advanced, the x86 "penalty" in transistor budget became less and less important.
So for me, the P6 was right at the edge of what was possible to design at the time, *and* it was able to make a pig (x86) fly. And later, after the Itanium/Pentium 4 dead ends, the old dog could still be developed onward to the Pentium M/Core microarchitectures.
Yes, AFAIK NexGen also did similar things with x86 instruction set being internally compiled into RISCy micro-operations, and was out about a year earlier, so maybe it was more innovative in that sense (and was bought out by the "innovative" AMD to act as the basis of their CPUs going forward), but considering the timelines this was clearly a case of parallel evolution rather than Intel copying.
"The reason the x86 survived was because it was faster than its competitors and it was compatible."
Just the "compatible" part - and that was simply because it dominated the desktop environment
x86 was slower per clock and per watt than EVERY other competing CPU out there. The others were beaten into server or embedded equipment niches and then died when Intel went after those markets too.
MIPS might have stood more of a chance if the Loongson versions had been fully licensed sooner (they actually run emulated x86 almost as fast as real x86), but the very comfortable Windows/Intel partnership essentially destroyed everything by leveraging their effective consumer monopoly to drive competition out of all other CPU spaces.
Had Intel taken power consumption a little more seriously in the early 00's it's entirely possible that the entire ARM phone ecosystem might have been stillborn
You do know that Intel was sued by DEC for stealing design tech for the Pentium Pro? (The suit was settled in 1997, with Intel buying DEC's chip-making operations.)
New York Times 1997:
"Mr. Palmer said yesterday that Digital offered to license the Alpha chip to Intel in 1991, when Intel was looking to improve the performance of its chips. He said that Intel looked carefully at Alpha before deciding not to use the chip.
When Intel introduced its Pentium Pro chip in November 1995, Mr. Palmer said he was surprised by the new chip's substantial increase in performance. His suspicions grew last August, he said, when Andrew S. Grove, Intel's chief executive, and Craig Barrett, its chief operating officer, seemed to admit in an article in The Wall Street Journal that Intel took its chip designs from others.
''Now we're at the head of the class, and there is nothing left to copy,'' Mr. Barrett was quoted as having said."
https://www.nytimes.com/1997/05/14/business/suit-by-digital-says-intel-stole-pentium-design.html
They did innovate a few times, whilst maintaining compatibility - probably their biggest mistake. The Israeli skunkworks that came up with the "M" mobile cores with more efficiency was a fairly big redesign. But, yes, in general they have just iterated or squeezed their legacy stuff into "modern" concepts, like Lunar Lake.
Intel's core asset was software compatibility with the enormous trove of software written for x86. It's also the reason why Microsoft became the dominant software vendor, because both realized that people don't want to throw out all their hard- and software every once in a while just because vendors believe they invented a better mousetrap.
In that sense I believe Intel did the right thing: they improved the performance of their architecture enormously over the years whilst keeping the ability to run legacy software. That's a feat in itself. It's easy to improve performance with a clean-sheet design. Not so if you have to keep the ability to run 40+ year old software.
You doubt me, do ya? Well according to [1] I'm 100% right.
Apple's market share is now well below 10% and the tipping point is drawing closer where most ISVs will find it unprofitable to develop for Mac OS X.
[1]: https://appleinsider.com/articles/24/10/09/worldwide-mac-sales-dropped-in-q3-2024-while-most-pc-vendors-gained
But in which segment of the market would you say they are the leaders?
In servers, Epyc is way ahead of Xeon
In workstations, Threadripper is way ahead of Xeon-W
In gaming, it is a bit more balanced and subject to opinion, but Ryzen is certainly competitive
In mobile, Apple is way ahead. If you want to run Windows, Apple is probably still ahead, but AMD and Qualcomm are certainly competitive
In regular desktop, it is a dying market, and a 10-year-old CPU from any manufacturer is perfectly adequate, so it really comes down to price.
>>. PC Gaming is a niche.
It's one heck of a niche. PC gamers spend big bucks on high-end gear (CPU, multiple GPUs, overkill power supplies, displays, etc.), and every few years they do it all again, whereas a regular desktop or laptop lasts a decade or more.
Gaming is 80 billion a year...
Because Intel is controlled by share prices now, and short-term investors want line go up. Meanwhile, fixing Intel's CPU woes takes a long time. Clearly Pat had the experience and know-how, and did seem to be starting that transition, but line went down so off he goes. In comes the accountant, and no doubt he will ride the crest Pat made, then eject before it falls again.
Eh-eh-eh! I can remember posting this (Major shoes to fill?) almost a year ago ... the Lisa Spelman bit was wrong but, what do I win?!?!? (some swag would be so nice, right in time for x-mas!!!!)
Intel's whole model was reliant on the Wintel hegemony. They benefited immensely from this lock-in. If you are really sympathizing, do not forget all the chip makers that went bust because of this monopoly (DEC, SGI, Cyrix, Motorola's 68000, Solaris, Amiga, Atari). Intel only thrived while Moore's law was applicable. That law broke down many years ago; Intel is coasting at this point. It is too late for this behemoth to right the ship. Pat's parting sob love letter is just his nostalgia kicking in.
The lock-in is still present today, up to a point. But I think it's MS that rules the market and gives peanuts to its vassals (Intel and AMD) with the Windows 11 forced PC replacement. Probably it was not so 30 years ago, but it's been like this for at least 20 years. Businesses need to run Windows, and Windows needs (well, needed) x86 or AMD64 CPUs to run. Today Windows can run on ARM, but that's a very marginal market anyway.
Mobile is a very different story but that ship has sailed for MS and Intel (and AMD) 20 years ago.
It was also reliant on Intel having a monopoly on the chips needed to build high-margin servers. Apart from Xeon, you relied on Intel for high-performance Ethernet, SATA and south bridge chipsets; if you didn't play nicely with Intel on your entire product line you didn't get to play in the server market.
Not unlike Microsoft's - you ship DOS/Windows on every machine or you don't get any Windows OEM licenses
True, Solaris is the OS. I should have said Sun's SPARC, which was their own microprocessor. Sun Microsystems had to contend with competition on both hardware and software: SPARC against x86 and PowerPC, and Solaris against Linux and BSD (you could also argue Windows NT, which would be Wintel).
The Hitchhiker’s Guide (Douglas Adams) talked about the race that got rid of the useless one third of their population, the telephone sanitizers. These people landed on Earth and became us. They decided they needed to invent fire, so they went and formed a focus group to work out what colour it should be. Those are the people who have invaded Intel. That’s what’s gone wrong.
He did well helping Intel execute in their process roadmap, but his strategy to turn Intel into a foundry was doomed from the start. The clear winning strategy is their old biz model where they earn exceptional margins by designing and also manufacturing leading products. It fell apart when they lost their process lead. They need to get that process lead back so they can get back to winning. The foundry thing never made sense. That’s lower margin, requires more command and control management and employees willing to take a lashing to stay competitive. I don’t think western employees are built for that.
The problem for Intel is that CPUs aren’t the biggest chip market anymore, so they can’t generate the cash needed to have the leading process tech if it’s just being used to dominate CPUs. They’ll need to win in GPUs to be number one like the good old days. The best strategy for them is to make sure they take the process lead and then use that fleeting opportunity to make the best GPU for AI on the market. They pull that off and they can become a trillion dollar market cap company. Otherwise any other outcome will result in them eventually falling behind tsmc again, and they’ll eventually become irrelevant
I think there's still a market for their foundry business. The U.S. military wants a second source for its parts (which don't need the latest nodes) and is more than willing to bankroll it. They just need to get nose to nose with TSMC. That may require some work, but it's doable. The U.S. government could even pressure Taiwan into giving up its secrets and helping Intel get its foundry business off the ground.
not for much longer.
The USA believes it's still the largest market in the world but the reality is that it's now one of 5 around the same size and continuing to behave like the 900 pound gorilla with a bad attitude is increasingly alienating its friends and allies as well as its frenemies
Trump 2 may well be the last straw. We already saw a lot of logistics being rejigged away from USA owned/influenced transport chains in 2020 and whilst that got put on back burner status during Biden's presidency, I've seen a lot of ramping up since Nov 5th
Mercantilist mentality tariff/trade wars are directly what led to both WW1 and WW2 - more obviously so for WW2 as events for the former started 50 years before things started getting "hot"
If only Intel had spent the billions of dollars they incinerated on shareholder buybacks and other financial engineering on actual real engineering, they would likely not need the bailout from Uncle Sam (CHIPS Act) and might actually have a product that customers want. But Wall Street knows best (until the company is gutted, that is).
Was wondering if process node leadership can be bought back.
The latest ASML machines cost $450 million apiece, and 18A requires astonishingly complicated innovations to work.
The troubles began with 14nm well over a decade ago, on Otellini's watch.
Then they lost the smartphone market explosion.
Never a serious contender in the GPU market, they also lost the most recent NPU market explosion.
They are allegedly working with AMD to streamline AMD64. But with the proliferation of co-processors I am not sure it matters anymore.
Apple demonstrated ARM is viable for mass-market general-purpose computing, from watch to desktop workstation. Designing its own SoCs and partnering with TSMC and many others in Asia for production demonstrated incredible efficiency, both in products and in financials.
Nvidia, Amazon followed suit.
I would say the fabless model has won. What can the US government do to extricate Intel, an ailing vertically integrated silicon designer and producer, from Intel Foundry, the asset it wants to retain control of and also see thrive?