Intel updates ATX PSU specs, eyes PCIe 5.0 horizon

Intel has detailed new ATX power supply unit (PSU) specifications that it says are designed to support the demands of upcoming PCIe Gen-5 graphics cards, while also delivering greater efficiency. According to the chipmaker, the new ATX 3.0 specifications are intended to "unlock the full power and potential of next-generation …

  1. Anonymous Coward

    Blast, I just purchased a new modular 700W PSU; would have been nice to future-proof it. Wonder how much of a change it is, or is it just "new style connectors"?

    1. Sandtitz Silver badge

      The article mentions a 'telemetry pin', which probably means a new connector.

      Is it optional? I haven't read the specs, please do and report back.

      The 6/8/12-pin GPU connectors are specified to deliver 75W/150W/500W, and the new specs call for a 600-watt connector, so yeah, it will most likely require a new connector as well.

      1. JimmyKleenex

        It's a new connector with a smaller 4-pin sidecar connector for the logic parts. I'm sure adapters will be included with every new video card for the next 15 years. Tom's has a writeup.

  2. Elledan

    Still not seeing the point of ATX12VO

    The problems of 12V-only PSUs have been covered in great detail since last year, mostly related to the clunkiness of moving part of the PSU onto the mainboard and doing point-of-load (POL) power conversion. Add to this the upgrade issue of ATX and ATX12VO PSUs and mainboards not being compatible (without an adapter that nukes any claimed efficiency benefits), and it becomes less of a 'let's use this different PSU' question and more of a 'who wants to manage two distinct inventories?' one.

    I'm more excited to see the new efficiency standards making it into ATX 3.0, which should get that standard closer to ATX12VO's standby efficiency without requiring new mainboards (and associated e-waste).

    Having some feedback channels back to the PSU beyond its sense wires is also good to see. All too often, the moment you realise the PSU cannot provide enough GPU juice is when something like FurMark or an intensive game causes the PSU to bail out to protect itself.

    1. Steve Todd

      Re: Still not seeing the point of ATX12VO

      What do you think uses 5V and 3.3V power these days? All the heavy lifting in current systems is done by VRMs stepping 12V down to whatever the required voltage is (and at pretty high efficiency). Putting extra VRMs on the motherboard to generate 5V and 3.3V for the (comparatively) low demands on those rails allows for a simpler and more efficient PSU, at the expense of a slightly more complex and slightly more expensive motherboard. It's a reasonable trade.

      1. Anonymous Coward

        Re: Still not seeing the point of ATX12VO

        ".. the expense of a slightly more complex and slightly more expensive motherboard. It’s a reasonable trade."

        Is it for me? Or is it for you, the gamer with a $2,000 GPU?

        The only reason I see 12V on breakout boards for things like the Pi Compute Module is solely for a possible PCIe slot.

        To constantly buck/boost ALL other devices for that single 12V PCIe device (that I won't have) makes me wonder why that single device isn't powered externally or by another rail... oh wait, that's how it is now, not how it's about to be... sorry :-/

        Upshot is, finding cheap and reliable 12V 1000W server PSUs ain't no thing (for now). Of course, that's only until that signal is a REQUIREMENT and nothing works without it (thus birthing a separate hardware dongle to emulate it).

        1. Solviva

          Re: Still not seeing the point of ATX12VO

          You do realise CPUs can demand a fair bit of current at low voltages? Getting that power supplied via 3.3V or 5V (where it's down-converted anyway to whatever the CPU wants) would require significantly thicker and/or more wires compared to pulling the same power from a 12V line.
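
          A rough back-of-the-envelope sketch of that point, in Python. The 200W CPU load and the 5 milliohm round-trip cable resistance are assumed figures chosen purely for illustration, not taken from any spec:

              # Current and cable loss when delivering an assumed 200 W CPU load
              # over rails of different voltages. R_CABLE is an arbitrary
              # round-trip harness resistance, chosen purely for illustration.
              P_LOAD = 200.0   # watts drawn by the CPU's VRMs (assumed)
              R_CABLE = 0.005  # ohms of round-trip cable resistance (assumed)

              for v in (12.0, 5.0, 3.3):
                  i = P_LOAD / v           # current the harness must carry, I = P / V
                  loss = i ** 2 * R_CABLE  # resistive loss in the cable, I^2 * R
                  print(f"{v:4.1f} V rail: {i:5.1f} A, ~{loss:4.1f} W lost in the cable")

          The same 200W needs roughly 17A at 12V but over 60A at 3.3V, and the resistive loss grows with the square of the current, hence the thicker and/or more wires.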

      2. Elledan

        Re: Still not seeing the point of ATX12VO

        The selling point of ATX12VO is not the 12VO part, but the high efficiency during idle and standby. After researching this topic for an article, it's quite clear to me that this is achieved mostly through the stricter requirements for low-power (idle) efficiency, so with the new ATX 3.0 spec these new PSUs should achieve similar efficiency levels.

        As for the 3.3/5V rails, these are very useful for devices like HDDs, SATA SSDs and other peripheral devices. Having the mainboard provide these rails instead of the PSU rather defeats the purpose and increases the cost and complexity of mainboards, which are already skyrocketing in price.

        With ATX 2.0, PSUs have already switched to providing mostly 12V, with the lower voltage rails generally generated from a 12V source within the PSU (group-regulated or via DC-DC VRMs). That makes modern ATX 2.0 PSUs rather similar to what's proposed for ATX12VO, with the added convenience that mainboards don't have to take on part of the PSU's functionality.

        1. Boothy

          Re: Still not seeing the point of ATX12VO

          Worth mentioning that regular SATA HDDs and SSDs don't use the 3.3V line.

          Most early systems used Molex-to-SATA cables, which only provide 5V and 12V, so all early devices needed to work without 3.3V.

          Most PSUs today still include some Molex cables (for things like case fans), and usually include Molex-to-SATA converters in the box in case you need them for more drives, so current HDD and SATA SSD manufacturers still don't use the 3.3V supply, presumably for compatibility reasons.

          I have heard some mSATA devices do use 3.3V, but this is only an issue if you use some sort of mSATA-to-SATA converter, such as repurposing an old laptop drive into an external caddy etc.; you'd need 3.3V then.

          The few ATX12VO boards I've looked at have SATA power outputs on the motherboard, via a square 4-pin connector: 2 x ground, 1 x 12V and 1 x 5V. For these boards, I assume a 5V regulator is mounted on the board to drop from the incoming 12V feed.

    2. This post has been deleted by its author

  3. Anonymous Coward

    How about they keep the spec lower than 600W and make GPU makers create something more efficient? It is easy (see Apple) to knock off hundreds of watts and get nearly the same performance, yet they can't be arsed.

    1. Anonymous Coward

      Apple doesn't make cutting-edge GPU chips. Why don't you ask nVidia and AMD how close to the bleeding edge they've been keeping things to keep computing demands fed without starting spontaneous fires in the process?

  4. Spoobistle

    Hot new features

    I guess now optical drives are history, we can replace the cup-holder with a heated drawer for ready meals. Curry'n'Crysis anyone?

    1. David 132 Silver badge

      Re: Hot new features

      >Curry'n'Crysis anyone?

      To go with the Netflix'n'Naan?

  5. heyrick Silver badge

    600W for a GPU?

    FFS.

    Isn't that something like fifty amps?

    1. Steve Todd

      Re: 600W for a GPU?

      It's worse than that. It's 50 amps at the PSU, but more like 500 amps at the GPU core. Electrically speaking, it's pretty much a dead short.

      1. KSM-AZ

        Re: 600W for a GPU?

        I offset the downvote; why would you do that to provable fact? Makes no sense. I've often wondered, since amps determine wire gauge, how you step 12V @ 50A down to 1 or 2V at 200+ amps and don't need double-ought wire...

        P = E * I: 600W = 12V * I, so I = 600 / 12 = 50A; 600W = 1.2V * I, so I = 600 / 1.2 = 500A.

        "Current" (ugh) chips generally run around 1.2v so that is a lot of electrons generating heat, in a space the size of a pinky nail.

    2. joed

      Re: 600W for a GPU?

      It's a dead end, with efficiencies barely acceptable for a space heater in polar regions. It's definitely the computing segment most in need of a revolution, and very similar to the car market, where almost all efficiency gains have been squandered by the never-ending increase in the mass of vehicles. Some governments (the EU) are forcing impossible standards on some industries (with diminishing returns and failure-prone solutions) while losing sight of growing power hogs (data centers etc.). And no amount of greenwashing will help (all the talk of data centers using "renewable sources" just means other demand has to shift to dirty ones).

    3. Boothy

      Re: 600W for a GPU?

      I'm assuming the 600W is for a bit of future-proofing.

      The current stock Nvidia 3090 consumes around 350W, with the new 3090 Ti variant being around 450W.

      AMD uses the more efficient TSMC N7 node (Nvidia uses a less efficient Samsung node), so their top-end stock 6900 XT is around 300W.

      Note: many vendor cards are factory overclocked, so go even higher still!

      The existing standard 6-pin GPU connectors only deliver 75W, and the 8-pin 150W, so many even mid-range cards need two power connectors, and top-end cards even three!

      Even the stock 3060 mid-range card has both a 6-pin and an 8-pin connector, as it consumes 170W, which is too much for a single 8-pin.

      So a single 600W connector would simplify things a bit, as all current cards could use just a single connector, and 600W still leaves some headroom.
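
      As a toy illustration, here's a minimal Python sketch of that connector arithmetic; the per-connector budgets and board power figures are the ones quoted in this thread, and it deliberately ignores the 75W available from the PCIe slot, as the comparison above does:

          import math

          # Per-connector budgets quoted in this thread (watts).
          EIGHT_PIN = 150      # existing 8-pin PCIe power connector
          NEW_CONNECTOR = 600  # proposed single connector in the new spec

          def eight_pin_count(board_power_w):
              """8-pin plugs needed if the slot's 75 W is ignored (simplified)."""
              return math.ceil(board_power_w / EIGHT_PIN)

          for name, watts in (("3060", 170), ("6900 XT", 300),
                              ("3090", 350), ("3090 Ti", 450)):
              print(f"{name:8s} {watts:3d} W -> {eight_pin_count(watts)} x 8-pin, "
                    f"or 1 x {NEW_CONNECTOR} W connector")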

      Would still be nice if they could drop the power requirements though, as you still need to deal with all the heat generated by these cards!

      1. batfink

        Re: 600W for a GPU?

        I'm just going to leave my case open and switch off my central heating.

        1. Boothy

          Re: 600W for a GPU?

          Chuckle.

          In Winter, I rarely need the heating on if I'm gaming. (It's actually on, just the thermostat rarely triggers).

          I've even been known to open a window to let some air through, as the room hits 25°C+ whilst it's zero outside.

          In summer I don't game much in the daytime, and even some nights I can't get the room below 30°C with all the windows open!

          And this is in the UK, so no aircon. Seriously considering getting a portable aircon unit this year, especially as I work from home full time now!

      2. Charles 9

        Re: 600W for a GPU?

        But given how small the chip lithography is getting and how much we're still demanding of them, you're going to hit some physical limitations, at which point we're just gonna have to bear the costs, go without, or await some radical leap in computer technology (still waiting on photonics).
