Intel keeps heat on AMD with Tri-Gate transistors

Can a 32 nanometer Bulldozer jump a 22 nanometer Tri-Gate transistor to cross an Ivy Bridge? In 2012, we're gonna find out. Advanced Micro Devices and its foundry partner, GlobalFoundries, are just ramping up production on 32 nanometer wafer baking processes, in theory catching up to Intel, which has had 32 nanometer processes …

COMMENTS

  1. Anonymous Coward
    Happy

    A little birdy told me

    ...that "Intel isn't expecting its competitors to move to a similar technology until 14nm."

    So it looks like this isn't a permanent/exclusive advantage for Intel - but for the next few years it definitely appears to be an edge.

    When I was reading this I was thinking "wow, imagine what an ARM proc could do with that type of technology" - more battery life without compromising performance... where do I put my name down on the waiting list for a netbook/smartbook that can run for 24 hours straight without plugging in?

    1. Arctic fox
      Unhappy

      Re: A little birdy told me

      Given the accelerating pace of development in this area, it is entirely possible that the move to 14nm may come rather quicker than one would otherwise have expected. As for ARM-based technology, I think we can be reasonably certain that someone somewhere is working on it! What about a 7in form factor tab, 2.5GHz quad core, running Win 8 or OSX (according to taste), with the type of battery life you mentioned? Now *that* would indisputably be a proper tablet *pc*!

    2. Anonymous Coward
      Anonymous Coward

      Not the only ones

      A little naive to think that Intel was the only one working on a 3D transistor. IBM, TI, and the other big players have been at it for around a decade, just as Intel has. Just because Intel is first doesn't mean that they will have the best transistor.

      1. Anonymous Coward
        Flame

        Re: Not the only ones

        The article made no mention of *anyone* else working on anything similar, and since I'm not an EE/CE or otherwise particularly savvy when it comes to processor fabrication, I had to go digging to get that perspective on the issue. According to the comments at AnandTech, TSMC calls this FinFET. In particular, this reference states that everything above 14nm at TSMC will be planar - http://www.eetimes.com/electronics-news/4213622/TSMC-to-make-FinFETs-in-450-mm-fab

        I never said Intel was the only one working on this, or that they will have the best transistor - just that this appears to represent an edge for the next few years until the other fabricators catch up. I think it's a little more naive to assume that everyone who reads this article knows that this concept is not patented or otherwise exclusive to Intel, as we both have pointed out in our comments.

        IMHO, the article should have included this information to begin with, and the only reason I posted the previous comment was a hunch that others might have the same question and, like me, had never heard of 3D transistors or FinFETs before.

      2. ChrisInAStrangeLand

        re: Not the only ones

        What TI and IBM "will have" is irrelevant. Intel has the best transistor on the market. Period.

  2. Arctic fox
    Headmaster

    Just goes to show.

    We should be very grateful that Intel did not succeed in driving AMD out of business a decade or so ago by methods which (if I ruled the world) would definitely mean jail-time. Anyone think that Intel would be putting this amount of sustained effort (read: money) into innovation if the continued existence of a significant rival didn't force them to?

  3. Arctic fox
    Happy

    My second post was intended to have a happy face attached!

    Sorry about that!

  4. Matt Bryant Silver badge
    Boffin

    Pricing vs Technology

    AMD isn't "dead in the water" just because their next gen may be a step behind Intel's; the market works on more than just raw CPU performance. All these Intel advantages may be a bit diluted by the time they get into systems, as overall system performance and power usage is spread across all the components. Remember the first-generation Atoms - low power but stuck with a sucky mobo chipset? And then there's the fact that we don't always buy the top performance option; we usually compromise on price vs performance. Especially in the consumer market, where AMD first made its big impression, "good-enough-but-cheaper" often wins. Most consumers won't know or care if the Intel-chipped PC, laptop or notebook is 10% faster or 10% more power efficient if the AMD-chipped option is 10% cheaper. Most consumers aren't informed enough to know what Intel or AMD even are; they just look at the pricetag.

    In servers, it will be a more complex battle. AMD looks like it will be more core-dense, which will appeal more for certain applications, and if they can survive pricing their chips lower than Intel's then they may still mop up a large chunk of the market. Then there are the die-hard "I-won't-buy-Intel" buyers, who will still buy AMD even if the Intel option is better, just because they don't like Intel. As long as AMD can keep the main vendors shipping systems with their chips they should survive quite well, and easily long enough to get their own 3D chips to market.

    1. Anonymous Coward
      Anonymous Coward

      But the "price" part is a problem for AMD

      AMD's and Intel's prices are set by the market, and are not driven so much by the manufacturing cost.

      The big problem that this brings AMD is that Intel now has much lower manufacturing costs for a similar performance part, and Intel was well ahead of AMD on cost before this. Being able to make parts with twice the wafer density for only a 2-3% cost hike is a pretty big deal. This ends up putting a huge amount of margin pressure on AMD.
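
      A rough back-of-the-envelope shows the squeeze (a minimal Python sketch, not anything from the article: the 2x density and ~3% cost-hike figures are from this comment, while the wafer cost and die count are assumed placeholders):

          # Illustrative cost-per-die comparison under the figures quoted above.
          # WAFER_COST_PLANAR and DIES_PER_WAFER_PLANAR are assumed placeholders.
          WAFER_COST_PLANAR = 5000.0          # assumed cost per processed wafer ($)
          DIES_PER_WAFER_PLANAR = 500         # assumed yield-adjusted die count

          dies_per_wafer_trigate = 2 * DIES_PER_WAFER_PLANAR  # twice the density
          wafer_cost_trigate = WAFER_COST_PLANAR * 1.03       # ~3% cost hike

          print(WAFER_COST_PLANAR / DIES_PER_WAFER_PLANAR)    # 10.0 per die
          print(wafer_cost_trigate / dies_per_wafer_trigate)  # 5.15 per die, ~48% less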

      1. Anonymous Coward
        Anonymous Coward

        I think AMD will have a field day in the consumer space soon.

        If we assume that AMD's marketing team are not completely brain-dead, all they need to do is provide some demos of 3D games for their next-gen laptops and desktop PCs (arriving within the next month or so).

        When Joe Bloggs goes down to PC World and sees one set of laptops running flashy 3D game demos, and is told by the sales guy that the others cannot do it, the slight CPU speed difference will not be relevant.

        Even if they never play any games, the flashy demo that cannot be run on the Intel kit without a massive price hike for added Nvidia chips will likely get the sale. People are always more willing to pay for what they can see over numbers on a printed sticker.

  5. Mr Young
    Happy

    Amazing!

    What with all the bad news and stuff, us humans seem to like putting ourselves down. Or maybe fighting? Or whatever? Anyway - a 22nm processor - feck me! Just how many transistors per square mm is that? I'd at least hope it would impress some aliens if they decided to drop in for a visit.
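
    For what it's worth, a rough answer to that question (a minimal Python sketch; the die figures are approximate published numbers for Intel's 22nm Ivy Bridge parts, used here as assumptions):

        # Back-of-the-envelope transistor density for a 22nm part.
        # Assumed figures: ~1.4 billion transistors on a ~160 mm^2 die.
        transistors = 1.4e9
        die_area_mm2 = 160.0
        print(transistors / die_area_mm2 / 1e6)  # ~8.8 million transistors per mm^2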

    1. MEM
      Happy

      MEM

      Intel seems to think they can do a three-dimensional technology and yield it with a 2 to 3% increase in cost and get a 37% increase in performance. I doubt those yield and manufacturing numbers very much, but kudos to them if they can.

      I know of a new technology from a company called Semi-Solutions which seems to reduce the leakage of planar transistors to similar values and provide for low-voltage operation similar to the Intel claims for their 3D transistor. Maybe AMD and others will use this to catch up. Staying planar will certainly keep the process know-how and yields up where they should be.

  6. RobDinsmore

    What does the 2-3% cost increase refer to?

    Does this statement mean 2-3% more per wafer than a 22nm planar architecture, or 2-3% more than Intel's 32nm process? I am betting the former, as I am pretty sure they had to add another metal layer or three.

    I still do not understand Intel's mobile 'strategy.' I really think they need to try harder to make an impact there. Their Atoms will likely be the last chips to transition from 32nm to 22nm, and will have to compete with TSMC's and Samsung's 28nm- and 20nm-fabbed ARM chips, which already own the mobile market and which by then will be running Windows 8 - until now Intel's only advantage.

  7. Mage Silver badge

    Intel Mobile Strategy?

    They lost that one by

    1) Not developing the i960 into a Mobile part

    2) Flogging off their ARM stuff

    3) Sticking to a recipe of ONLY aiming for more transistors, higher clocks and better benchmarks on the CPU

    4) Belatedly looking at SoCs. ARM's success is partly that you get a whole system (many different ones) that simply happens to have an ARM core as only one part.

    x86 is a dead-end architecture that is holding back computing. It has done for 30 years.

  8. CADmonkey
    Unhappy

    speed junkie's choice?

    As a (non-rendering but 3D) CAD user:

    I don't want something I can carry

    I don't want to save energy

    I don't want more cores

    I just want my application to run faster

    This seems to put me 180 degrees away from current trends and marketing blurb, making for a difficult choice.

    Am I the only one?

    1. Doug 3

      you can't have your cake and eat it too

      More cores is how they make it go faster, because they've topped out on clock freqs and the only other way is with die shrinks. And besides, less energy usage does not mean they are "all green" and shit. Less energy usage also means they can put in more cores, because another issue you're missing is getting rid of heat. You don't list "I don't care about heat or heat sinks". Less energy usage means more cores, and that means more power if your OS and apps can understand multi-core and use it efficiently. CAD? Oh, you must be using Windows, sorry for you.

      1. CADmonkey

        Thanks for the reply (and the cake, whatever that represents),

        ...but it would have meant a bunch more if you'd read my post first, or indeed the article above it, which talks about "the only other way", i.e. shrinking dies. CAD programs are (so far) mostly single-threaded, which is why I don't need any more cores (see the Amdahl's law sketch after this comment). I don't want more efficiency, I want more power. 10 years ago that was everyone's priority, but not anymore. Simply put, the less time I spend waiting for the command prompt to come back, the more money I make.

        You're right, I didn't mention heat. But only because I don't really care about it. Who bases a CPU buying choice on the cooling ability? If it gets too hot, stick a bigger fan on it.

        Your commiserations are also misplaced. Plenty of non-Windows CAD apps are available, and anyway I fail to see the relevance of my OS, or the need for you to put me (and millions like me) down for it. You think I give a toss about your 'real man's' operating system that statistically, nobody uses? :P
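
        The single-threaded point above is Amdahl's law in action: when only a small fraction of the work runs in parallel, extra cores barely help. A minimal Python sketch (the 10% parallel fraction is an assumed figure for illustration, not a measured one):

            # Amdahl's law: speedup from n cores when a fraction p of the
            # work is parallelisable. The p below is an assumed example value.
            def amdahl_speedup(p: float, n: int) -> float:
                return 1.0 / ((1.0 - p) + p / n)

            for cores in (1, 2, 4, 8):
                print(cores, round(amdahl_speedup(0.10, cores), 2))
            # Prints 1.0, 1.05, 1.08, 1.1: for a 90%-serial workload, eight
            # cores buy barely 10% more speed, so single-thread speed rules.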

  9. DarkStranger
    Black Helicopters

    Interesting article, but?

    Interesting article, but I am a bit confused as to why you chose the phrases "peddle its processors" or "will start bragging" to describe some of AMD's actions. Doesn't Intel also "peddle its processors" and engage in bragging?

  10. Anonymous Coward
    Happy

    Intel has a cross-licensing agreement with AMD

    Hi, under this agreement AMD will share the 3D transistor, but it will come out later - or AMD will come up with something better. If you follow the design improvements, Intel has gotten a lot of good technology from AMD, like AMD64/EM64T, the memory controller on the CPU, etc. Intel needs AMD for antitrust issues, like Microsoft needs Apple. AMD is still much cheaper in cost. AMD rules!

This topic is closed for new posts.
