# AMD promises to spend $1.6bn on 12nm, 14nm chips from GlobalFoundries

Amid fears the global semiconductor crisis may last until 2023, AMD has opted to extend its purchase agreement with GlobalFoundries, giving it access to a greater proportion of the fabricator's output. AMD disclosed the existence of the deal in an 8-K regulatory filing submitted to the SEC earlier this week. The company has …

1. #### AMD sure ain't Milli Vanilli now!

AMD is way out in front!

1. #### Re: AMD sure ain't Milli Vanilli now!

>AMD is way out in front!

And clearly wanting to stay that way...

Given that Intel's 11th-generation chips, what with AVX-512, are not bad despite still being on 14nm, it's high time that AMD stopped having to compete with one hand tied behind its back! Which, of course, makes it all the more amazing that it forged ahead of Intel despite GlobalFoundries' failure to be a worthwhile supplier to AMD by neglecting to upgrade its processes.

3. #### Explanation

Could someone explain in simple terms the difference between these sizes? I take it we are talking about the distance between components (not thickness), so is 14nm better than 12nm, or is it the other way around? Assuming 12 is better than 14, is the advantage just higher component density?

1. #### Re: Explanation

12nm is better than 14nm.

Electricity can travel 12nm in a shorter time than it takes to travel 14nm.

The speed of light is about 30cm/ns, and 1ns is the clock-cycle length of a 1GHz CPU. Electricity is a bit slower than light.

The resistance of a 12nm length of conductor is less than the resistance of a 14nm length of the same conductor, so the chip uses less power

You can fit more components on the same die size, or you can make the die size smaller for the same number of components.
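For what it's worth, the back-of-envelope arithmetic above can be sketched like this (a minimal illustration only; the half-light-speed figure used for an on-chip signal is an assumption, not a measured value):

```python
# Back-of-envelope: how far a signal can travel in one clock cycle,
# using the ~30 cm/ns speed-of-light figure from the comment above.
# The 0.5 "fraction of c" for an on-chip signal is an illustrative guess.

C_CM_PER_NS = 30.0  # speed of light, roughly 30 cm per nanosecond


def cm_per_cycle(freq_ghz: float, fraction_of_c: float = 1.0) -> float:
    """Distance covered in one clock cycle, in centimetres."""
    cycle_ns = 1.0 / freq_ghz  # a 1 GHz clock has a 1 ns cycle
    return C_CM_PER_NS * cycle_ns * fraction_of_c


print(cm_per_cycle(1.0))       # 1 GHz at light speed: 30.0 cm per cycle
print(cm_per_cycle(4.0, 0.5))  # 4 GHz, signal at half light speed: 3.75 cm
```

Either way, the distances involved are centimetres per cycle, so shrinking wire lengths by nanometres matters mostly in aggregate across billions of transistors, not for any single trace.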

1. #### Re: Explanation

katrinab>The resistance of a 12nm length of conductor is less than the resistance of a 14nm length of the same conductor, so the chip uses less power

No. That is not why 12nm will be 'better' than 14nm.

12 will be an evolution of 14, with a tighter or more optimised process, etc. Wire geometries will not change that much, if at all. Nor, in fact, will the devices in such a case.

2. #### Re: Explanation

Smaller equals less power consumption and more components in the same space.

Less power consumption also means easier to cool.

The cutting edge these days is 7nm. 14 is positively ancient.

Apparently, there are Intel engineers who say that 2nm is possible. We'll see when they get there.

1. #### Re: Explanation

Sizing was covered a few weeks ago in another chip story. It seems that for the last couple of years the sizes have only been notional. E.g. 14nm could be the equivalent of two layers of transistors fabbed at 30nm, but with the number of transistors it would have if they had squeezed them down. 7nm might be four layers of 30nm fab, or two layers of 16nm. Note these are all layers of transistors, not doping layers, which may be five or more per transistor layer. I can't remember how many transistors they had per mm², but the point was they had achieved 15 times more than a legacy 30nm chip, so they called it 2nm even though it was still using 12nm lithography, which I think is about the limit even for hard UV.

1. #### Re: Explanation

Thanks all.

4. Whoopee.

While Apple surges ahead in power efficiency and performance at 5nm, we can look forward to a glut of 7 year old 14nm tech flooding the market.

I foresee a marked swing to Apple in the demographics taking their first steps into the computing world.

1. Your name is Gartner, and I claim my £5.

2. There are plenty of applications where 10-20nm is perfectly reasonable and cost efficient.

5nm is great of course for cutting edge. But most do not need it.

1. There are lots of power applications that need 40-130nm

You used to get by using a two-generations-old CPU fab to make your switching parts; nobody is going to invest in building a new 40nm fab from scratch.

I wonder if the old fabs will continue to be used, or if we will get to a point where 'old' 14nm is no good for a MOSFET and people need to build 'vintage' fabs?

3. They clearly say for trailing edge products, not leading edge.

It’s like saying the only car anyone should use is a supercar.

4. Now try to build a controller for a car's windscreen wiper in 5nm technology. It takes about one square micrometre for the electronics and 5x5mm to be able to connect any wires to it, so you end up with a hugely expensive chip that isn't any better for the task than a chip built with 40nm technology.

There's no shortage of 5nm chips. There's a huge shortage of old chips. What AMD is buying here is something in between, stuff that works just fine with 14nm technology and has no need for anything more advanced.

5. >we can look forward to a glut of 7 year old 14nm tech flooding the market.

Well, that may not be such a bad thing.

Cryptominers have stripped the shelves bare of graphics cards, so an obvious thing to do would be to pick the last 14nm graphics card and knock them out in quantity. The production costs should be lower on older hardware, as presumably fewer people want to build stuff at 14nm.

I'm an old-school gamer and so don't keep up with graphics card releases unless I'm buying, but a brief look suggests that the last 14nm cards from AMD were four years ago and kick out something like 50-60% of the performance of the latest flagship cards released this year.

While obviously those four-year-old cards only deliver a bit better than half the performance of the latest flagship, not everybody wants to spend a thousand quid on a graphics card; I'd be perfectly happy with a card offering ~60% of the performance of a new card for ~20% of the price, and I suspect there would be plenty of other takers.

And that assumes they don't do a new 14nm design, or just overclock the crap out of the existing design and apply a much better cooling solution, in which case you might be able to wring things a few percent closer to the modern state of the art. Or sell them as pairs or trios for use in whatever they call CrossFire these days, which would happily put the combination somewhere near modern flagship levels of performance.

Whatever; there's still plenty of ways AMD can turn a profit using some older fabs that have less demand on them.

1. Crossfire and SLI are dead. Games developers no longer support either technology.

Might be useful in theory for compute apps, but then you could just split the workload across the 2 cards

1. >Crossfire and SLI are dead.

Yes. Technically that was the manufacturers trying to make multiple cards work in DX11. In DX12 it's simply called mGPU and is supported by DirectX rather than by the graphics card drivers.

Whatever; the same concept of using multiple GPUs is still possible.

5. Not sure why all the downvotes. It's a poor show by Intel and AMD when even their 65W-125W desktop parts get smoked by a passively cooled Apple M1 chip in a thin-and-light laptop.

In a laptop that's only needed for light browsing, email and other light Office tasks (and maybe even a bit of light gaming), a power-sipping yet incredibly nippy chip is exactly what you need/want.
