Everyone kept saying the US would never stop one of its own companies from building a monopoly, and that it would be the UK or China that ended up objecting. If I'm not mistaken, this is actually the first official block to be put up?
The US Federal Trade Commission, having previously expressed unease about Nvidia's plan to acquire UK chip design firm Arm, acted on its concern Thursday by suing to prevent the deal. "The FTC is suing to block the largest semiconductor chip merger in history to prevent a chip conglomerate from stifling the innovation pipeline …
Exactly, and Apple too. Reading "Nvidia's $40B deal for Arm could affect Apple" [Computerworld]. One reason is possible increases in license fees on existing products -
[Jack] Gold [an analyst] also pondered the financial implications of Nvidia's purchase. "Do they want a heavy-duty payback?" he asked during the interview, referring to possible higher licensing fees. That, too, could put licensees like Apple in an unexpected spot.
Unfortunately, I do not believe that the long-term future of Arm is necessarily safe even if the Nvidia merger does not go through. Once the internal decision was made to sell out for a profit (to Softbank in 2016 for $32 billion), the pressure is on for whichever investor owns it to turn a near-term profit.
If Softbank needs cash (which they do), what will they do next? Sell it in parts? I could easily see a hedge fund purchasing some part, figuring that maximizing short-term income means jacking up fees on existing licenses while cutting staff and investment until new sales drop, and then selling off their part to NVidia, or even Arm China.
It's easy to say "don't do this merger" (and I agree with that); it is harder to find an owner with the patience and resources to ensure the necessary continuing investment in R&D and a long-term approach to sales. There WERE owners who could and did do that, but they already sold out for a big one-time paycheck. Softbank has been a good enough steward since then, but now they are desperate for cash after a lot of their other, not-so-good investments went bad.
What would the advantage be for NVidia in doing this deal?
SoC bundling of NVidia tech comes to mind. Broadcom does this for the CPUs used by the RPi (with VideoCore), by the way, along with other 'integrated things'. I wonder if an NVidia ARM CPU like that would be more competitive, and more efficiently made, with ARM designers integrated into the process?
With a lawsuit, perhaps a compromise would prevent future abuse? Or, more likely, it would just enrich the L[aw]YERS as a TOLLBOOTH on the road to success.
You hardly need to buy all of Arm to develop SoCs based on the technology. That's actually a rather big business already.
No, the only "advantage" I can see is if NVidia were to hold back innovations needed to integrate their GPUs with ARM CPUs, which leads straight to the competition issues being raised.
ARM is not the only player in this space. They have reached some dominance in the low-power embedded world, but the bulk of the general-purpose CPU market is still INTEL with AMD biting the heels. Apple is making yet another shift (65K to Power to Intel and now custom ARM) but they could easily move back to Power or Intel/AMD, or adopt MIPS or the up-and-coming RISC-V. All are viable architectures.
Nobody is designing a fresh CPU architecture in their garage. AMD pretty much has a much larger footprint in the CPU+GPU space. I would expect NVidia to put out designs for Apple wedding their GPUs to ARM cores, like the current crop of wildly popular AMD APUs, but with more emphasis on the GPU side. The latest MALI GPU designs are better, but still not in the same league as INTEL's latest, much less NVidia or AMD. If they can build the synergy, I think it would be good for competition, but maybe bad for Intel. These days it's more about the fabs and yields. The designs get somewhat better, spurred on by physically smaller, electrically denser dies. Should be interesting.
In joules per gigainstruction, which is the performance indicator that matters most where MALI GPUs are used, they have a 10:1 advantage over Intel and still better than 5:1 over (today's) Nvidia.
While they are good enough for what they typically are used for, this is one of the areas where the merger could actually generate better products. Not a good enough argument for me, though.
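A quick sketch of how that metric works out: energy per gigainstruction is just power divided by instruction throughput, so a low-power part can "lose" on raw speed and still win by a wide margin on efficiency. The power and throughput figures below are made-up placeholders, not measurements of any real MALI, Intel, or Nvidia part.

```python
def joules_per_ginstr(power_watts: float, throughput_ginstr_per_s: float) -> float:
    """Energy cost per 10^9 instructions: power (J/s) divided by throughput (Ginstr/s)."""
    return power_watts / throughput_ginstr_per_s

# Hypothetical mobile-class GPU: 2 W while sustaining 4 Ginstr/s.
mobile = joules_per_ginstr(2.0, 4.0)      # 0.5 J per gigainstruction

# Hypothetical desktop-class GPU: 150 W while sustaining 30 Ginstr/s.
desktop = joules_per_ginstr(150.0, 30.0)  # 5.0 J per gigainstruction

# The desktop part is 7.5x faster, yet the mobile part is 10x more
# energy-efficient -- the kind of ratio the comment above is describing.
print(mobile, desktop, desktop / mobile)
```

With numbers in that (illustrative) ballpark, a 10:1 efficiency gap coexists happily with a large raw-throughput gap in the other direction, which is why the comparison depends entirely on which metric the target market cares about.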
"They have reached some dominance in the low-power embedded world, but the bulk of the general purpose CPU market is still INTEL with AMD biting the heels. "
Some dominance? There's an understatement if ever I saw one...
Whilst it's true that ARM are still small fry in the desktop processor sector, it's worth bearing in mind that whilst the estimated combined sales figures for AMD and Intel last year are in the few hundred millions, for ARM-core devices it's in the order of 25 billion, thanks to their near-total dominance of every sector below desktop computing.
Because once you drop down into the realms of smartphones, tablets, set-top boxes, smart TVs, smart speakers and white goods, let alone the things most people would consider to be properly embedded devices such as HVAC controllers, ECUs, network extenders and suchlike, there's now an exceedingly good chance that in the types of products which used to be the traditional domain of the AVR, PIC etc., you'll find an ARM-core device instead.
As the range and capabilities of ARM-core devices have continued to expand into areas formerly the sole domain of other architectures, there are now very few occasions where the choice of something other than ARM is driven by genuine technical requirements and not simply out of fear of the unknown or an unwillingness to give up on whatever other architecture was traditionally used by that engineer/R&D team. ARM has steadily been encroaching onto the really low power end of the embedded space for a while now, and with Apple's latest move they're also starting to push upwards into the desktop space.
And let's not forget that the desktop space was where it all began for ARM...
Some dominance? There's an understatement if ever I saw one...
Even though I suspect KSM-AZ is a 'merkin, understatement is not the preserve of the British. Nor should it be. Understatement helps us all keep our little grey cells working.
It isn't silly, it makes sense.
Firstly, it is bad for the GPU market: NVIDIA may squeeze Intel and AMD in that area, since they might favour their own CPU.
Secondly, it may restrict long-term innovation in areas such as AI. NVIDIA GPUs are clearly substandard for this purpose, and so they shouldn't be allowed to take dominance via blatant commercial piracy.
I think he was confused by the fact that the original Apple ][ was a 6502, but no one ever called this a 65K. Even the subsequent upgrades to the 6502 never got called 65K even though they had five digits, because someone decided they should all have a pesky C after the 65 to show how newfangled they were by being CMOS instead of NMOS.
Oops, been a while there. 68050, or was it 030? I think. Motorola. Full 32-bit, big-endian.
To the MALI point, ARM owns much of the embedded market because of performance per watt, price, and familiarity. But the ARM markets are fragmented, and everyone is basically dropping an ARM core onto custom silicon, custom boot, custom support logic. There are plenty of other fabless core designs that can be welded onto custom silicon. If ARM gets uppity, it just wouldn't take that long to roll over to something else. The compilers and such are there. There is plenty of non-ARM embedded stuff out there.
Apple, Samsung, et al. all have ironclad agreements for the core ARM tech. So if things stagnate there, they can keep going and migrate to something else when the next new thing in CPUs shows up. MIPS did merge with Silicon Graphics, after all, and MIPS-3D exists. And VIA is still around; they have some reasonable GPU tech that could come up to speed. As INTEL is discovering, one small misstep or hiccup in this business and you find yourself playing catch-up.
It will be fun to watch!
> Nobody is designing a fresh CPU architecture in their garage.
Thank you so much for the CPU-and-garage insight. Blew my mind. Here I was thinking everyone was designing CPUs in their garage, in their copious free time.
So you are saying the ARM investors are a bunch of tree-hugging nice people who just want what's best for the world, profit be damned? They are obviously not looking for market dominance, because they love competition. Really? They've got Apple on board, AWS is fielding Graviton2 cores in its DCs; I'm not sure who is buying whom, really. YMMV.
OK - you can start an investment firm to buy it, maybe? And it will need R&D funds to invest in future product development, or you'll end up doing harm to ARM [bad poetry] in the long term.
(large companies like NVidia have deeper pockets, especially when it comes to R&D budgets for new product development)
Because it OPENLY APPEARS to stifle competition.
Competition is stifled every day in the U.S. In fact, in most of the capitalist world, but mostly in the U.S., there are only about six companies in each major industry running almost ALL of each category. It just APPEARS you have a choice, because each company may hold dozens to a hundred "subsidiary" brands.
The biggest companies are constantly in a state of flux. Again, if they drop the ball, someone else steps up. I still struggle with some of these social media giants; I personally don't use them much. Just a Luddite, I guess. NVidia is pretty narrow. At present they are somewhat better than the competition, but AMD for one has the upper hand (if they can keep it) with the APU synergy. Intel has made drastic (and I do mean drastic) improvements in their GPUs; still not there, but they have smart people working for them that they can leverage to keep those improvements coming.
Currently CPU tech is so commoditized it's becoming increasingly irrelevant at the consumer level. The sorriest laptop you can buy is still pretty good for most folks. GPU tech is starting to get there as well. The majority of workloads do not require running a game at 200fps. So we are increasingly talking about edge cases for tech improvement: cloud server farms, bitcoin mining, etc.
I'm currently running an AMD 3400G on a couple of desktops. They are ridiculously fast for everything I want to do. The 5000-series stuff is even better. Why not get ARM/NVIDIA to put together something similar with less power usage and equivalent performance? I'm still trying to figure out which competition is getting stifled by them merging. ARM architecture is already very dominant. What can they do combined with NVidia that they can't do anyway?
Biting the hand that feeds IT © 1998–2022