
Interesting...
El Reg... just did an article on DPUs...
AMD is said to be in advanced talks to buy Xilinx in a deal set to top $30bn. The acquisition could be confirmed as early as next week, the Wall Street Journal reported on Thursday night. The newspaper cited "people familiar with the matter," which is typically code for someone at or near the top of AMD or Xilinx confirming …
Since Intel did buy Altera, it would make a lot of sense for AMD to also have a foot in the FPGA game, if there's any chance at all that some important synergies between FPGAs and CPUs (or even GPUs, for that matter) might crop up.
And I remember that after buying Altera, Intel then went and bought another FPGA outfit... Omnitek (strictly an FPGA video-IP house rather than a chip maker).
The question should be "why did Intel buy Altera?" For the IP/product range, or as a way of keeping an eye on a competitor's (TSMC's) products, capacity, defect rates, etc.?
In a similar vein, I wonder if this is really about getting FPGAs into AMD's portfolio, or instead about taking Xilinx's share of TSMC's manufacturing capacity rather than driving up TSMC pricing via a bidding war.
At the time of the buyout Altera had gone to Intel for fabrication of their next gen devices (Stratix 10).
However, Stratix 10 had a lot of far-reaching design changes from previous generations and was severely delayed (reminds one of AMD's state back then).
But due to the size and promised speed of Stratix 10 (1 GHz!) it looked very promising for data centre applications, and prior to the buyout Intel and Altera had already been doing work on Xeon-FPGA integration.
I _think_ that's why the buyout occurred.
You make an interesting point about TSMC - but Xilinx is still the No. 1 FPGA vendor, and they make a lot of money per device. On the whole I think it's about fleshing out the portfolio, especially now that Rome is allowing AMD to take a serious crack at the data centre market.
Question for shareholders: is management spreading themselves too thin working on such diverse product lines?
NVIDIA led the GPU revolution from mere display cards to GPUs being a central part of computing today. Could/would they have done that if they were protecting a large-volume CPU franchise at the same time? Tech history has many stories of diverse companies killing off or crippling new products because they would replace/cannibalize their existing product lines. NVIDIA had one line of business, and focused on it very well.
GPU is very difficult to design, build software for, and sell. NVIDIA does all this very well. AMD/ATI does this very well too.
FPGA is very difficult to design, build software for, and sell. Xilinx does all this very well.
CPU is....etc etc etc.
Can AMD management handle all three (CPU, GPU, FPGA) at the same time?
I doubt it; they'll try, but it is a seriously difficult problem. Look at IBM, which is starting to divest unmanageable products/services. Look at Intel, selling off McAfee and, years ago, its networking lines.
When IBM bought Red Hat, the naysayers were rife, predicting the demise of the whole franchise, with good (historical) reasons. It hasn't happened yet, but it will.
AMD is following the same road.
Can AMD management handle all three (CPU, GPU, FPGA) at the same time?
I doubt it
What if they were to integrate their existing products AND their senior management in a way that leverages ALL of those technologies and the best skilled people to manage them?
Yeah, I have a bit more confidence in AMD than maybe Intel... especially THESE days!
FPGAs are interesting [I still need to get myself a dev kit of some kind to play with].
It's very likely that AMD would benefit greatly by inserting this kind of tech into their existing products, primarily video adapters and network-related stuff.
Hybrid solutions (i.e. FPGA plus CPU) might be the best ones wherever any kind of 'signal processing' or decoding happens. This goes double for WiFi and MPEG. And encryption, in general.
So AMD bought ATI and maybe now Xilinx.
nVidia bought ARM.
So that leaves one large CPU player, Intel, with... who? What graphics chip company could they buy? PowerVR? What large, well-known, billions-of-chips company would match their rivals? Some AI player nobody has ever heard of?
Intel are really going to suffer, especially if "CPU/GPU" becomes a single discrete chip, as it's already heading that way. Who wants Intel HD graphics? Hell, the only decent combination they ever had was Optimus with nVidia and even that could be a pain in the backside. I bet nVidia are far less inclined to partner with them now that they own one of the largest CPU companies in the world, with one of the most popular CPU lines ever, in a multitude of industries.
Intel went another route - hiring Raja Koduri off AMD & building a new GPU team around him. It seems like he has been given his head and allowed to produce a competitive GPU architecture.
Intel's future is so dependent on catching up on process technology. The CPU & GPU (Xe) architectures are up to scratch, but price, power, & performance competitiveness all depend on being on the same process node (or transistor scale) as the competition.
BTW, the other interesting thing that's going on is die disaggregation, i.e. multiple chips in a package, each of which uses the most appropriate process technology. AMD has been very successful here, but Intel has been at it almost as long (albeit in the Stratix 10 FPGA product line).
"So that leaves one large CPU player, Intel, with... who?"
This doesn't really cover the full breadth of Intel's product line - while CPUs make up the majority of their products, before the 10nm debacle they were looking at dominating a significant chunk of the market, from flash to CPUs to analogue (4G/5G) to machine learning.
Who wants Intel HD graphics? The vast majority of the market... Intel's onboard offerings are now roughly equivalent to mid-range discrete GPUs from 3-4 generations ago, and Intel has increased its market share over the last 10 years from ~50% to ~75%. Yes, there is a big gap between top-of-the-range discrete GPUs and Intel's offerings, but that is a small slice of the overall market, and Intel would be competing against console/AMD/nVidia for it.
The reality is that with GPUs Intel has done "enough", and is instead trying to compete against nVidia in machine learning. While Intel are losing at the moment, they are constrained by their process technology (again) and by historical choices that focussed on CPUs. Both nVidia and Intel have purchased companies to try to address those limitations, and I wouldn't count either out of the machine learning race just yet.
Original HyperTransport systems had the option to put a Xilinx V5 in a CPU socket, I seem to recall (some German system). Also, it does make sense, providing they have a plan. Intel seem to have done nothing worth noting with their Altera IP.
Chiplet-based EPYC CPUs could easily have one or two CPU chiplets replaced with PCIe/CXL-based FPGA chiplets... or maybe even FPGA-type fabric inside Radeon chips, to allow things like video decoding and ML to be accelerated and reconfigured on the fly.
Let's not forget Xilinx's experience in SerDes, IP that is needed to get to PCIe Gen 5/6, with Gen 6 moving to PAM4 encoding at stupid data rates. They already have silicon running those rates easily.
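(For a sense of scale, and assuming I have the PCIe 6.0 numbers right: PAM4 carries two bits per symbol, so 64 GT/s per lane needs only a 32 GBd symbol rate, and a x16 link works out to 64 x 16 / 8 = 128 GB/s per direction, raw, before encoding and protocol overhead.)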
Their Vitis software does compile OpenCL into FPGA logic reasonably well; compared to the old C-to-VHDL stuff that used to exist years ago, it's a massive leap.
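For anyone who hasn't played with these flows, a minimal sketch of the sort of OpenCL C kernel they consume (the kernel name and arguments here are made up for illustration, not taken from any Xilinx example):

    // Trivial vector add in OpenCL C - the kind of simple datapath an
    // HLS-style flow can turn into pipelined FPGA logic.
    __kernel void vadd(__global const int *a,
                       __global const int *b,
                       __global int *out,
                       const int n)
    {
        int i = get_global_id(0);   // one work-item per element
        if (i < n)
            out[i] = a[i] + b[i];
    }

The nice part is that the same source can in principle target a GPU or an FPGA, with the toolchain deciding how to pipeline it.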
I'd suspect more of a merger than a full-on buyout, however. $30bn is a lot of money, given the market and AMD's history of tripping itself up just when it looks like it's on the up...