Cloudy with a chance of GPU bills: AI's energy appetite has CIOs sweating

Organizations are being forced to rethink where they host workloads in response to ballooning AI demands combined with rising energy bills, and shoving them into the public cloud may not be the answer. CIOs are facing a quandary over rising power consumption from the huge compute demands of training and deploying advanced AI …

  1. Doctor Syntax Silver badge

    Make sure costs are charged to departments. If marketing, for example, wants to use AI which runs up a big power bill, ensure it comes out of marketing's budget. If that means they have less to splurge on ads or events, tough - they should have thought about that first. If they then have to explain the cost-benefit of the expenditure, so much the better.

  2. Will Godfrey Silver badge
    Coat

    De-cloud for starters

    The people I used to work for went cloud. Apparently they've now moved back on-prem. It's faster, more reliable, and (surprise) it's cheaper.

  3. vtcodger Silver badge

    Just say no

    Letting others go off and squander time and money on a quite dubious "technology" while you stodgily continue to do whatever it is you do the way you've always done it, is a real alternative you know. True, in the best of all possible worlds, you'd invest a bunch of bucks, catch the AI wave, and ride it into a glorious future. But isn't it more likely that you'll misjudge this rather nebulous thing, get pounded into the bottom, and come up spitting salt water and wishing you'd never heard of AI? After all, if AI actually works out, you can probably buy AI as a service from somebody while you catch up, avoiding a whole lot of pitfalls that others have found out about the hard way. More likely than not, you'll end up money ahead and save yourself a whole lot of grief.

  4. cyberdemon Silver badge
    Flame

    WTF is CIO?

    I had to google it.

    Apparently it means "IT Director"

    1. Mentat74
      Trollface

      Re: WTF is CIO?

      Chief IdiOt...

  5. Rich 2 Silver badge

    AI - yes, THAT again

    “… in response to ballooning AI demands…”

    Why does every story have to suggest that everyone is scrambling to use “AI”?

    Can we just stop this bollox please. It’s bloody infuriating

  6. Doctor Syntax Silver badge

    It doesn't say "everyone". It implies there are enough people to cause problems, and problems get reported. If it infuriates you, don't read it.

  7. vtcodger Silver badge

    Another Alternative

    Let's say you run IT for a modest business endeavor. 500 desks plus another 50 PCs doing various weird and wonderful things that someone once thought would be a good idea, like running a camera in the main lobby during off hours and saving the images ... and some of them actually are a good idea. Maybe even mission critical. Let's say that a routine desktop PC costs you $2000 while an AI capable PC will run $6000.

    Now then, you'll likely not upgrade the 50 special purpose boxes because you know that on too many of them new hardware will mean they won't work, and debugging the problems machine by machine is costly even though many of the problems will be trivial.

    That leaves the 500. Hardware costs alone for upgrading the 500 to AI capable will be 500 * (6000 - 2000) = $2,000,000. That's a tidy sum, probably not in your budget, which likely assumes $2000 per desktop every three years -- $333,000 annually. Then there's the electricity cost. And likely a number of other costs. Yes, possibly you need to be concerned.

    But why not treat the AI things like those other expensive office devices/nuisances -- copiers, high volume printers, and such? Buy/lease one or two. Make them available on the office network. Monitor usage. See who's using them and for high volume users find out why. Maybe even give those folks an AI capable desktop if that's what they actually need. Given any luck at all, you may find that short term you only need a handful of AI capable devices. If that. Longer term? With any luck at all, you'll be retired and playing golf in some sunny spot long before everyone needs AI on their desktop.
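    The back-of-envelope numbers above work out as follows; a quick sketch using the commenter's assumed prices (not real quotes):

    ```python
    # Upgrade-cost arithmetic from the comment above.
    # All figures are the commenter's assumptions, not vendor pricing.
    desks = 500
    standard_pc = 2_000   # routine desktop, USD
    ai_pc = 6_000         # "AI capable" desktop, USD

    # Extra hardware spend to make every desk "AI capable"
    upgrade_premium = desks * (ai_pc - standard_pc)
    print(upgrade_premium)        # 2000000

    # The budget the comment assumes: $2000/desk on a three-year refresh cycle
    annual_refresh = desks * standard_pc / 3
    print(round(annual_refresh))  # 333333, i.e. roughly the $333,000 cited
    ```

    Six times the normal annual refresh budget, in one go, before electricity.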

    1. DoctorNine

      Re: Another Alternative

      Well that accounts nicely for how to put a limit on the cap-ex for machine upgrades, but still doesn't address the op-ex of the 'lectrical tax on them little buggers. It's not insignificant.

      1. Richard 12 Silver badge

        Re: Another Alternative

        You treat that the same as paper, toner, and ink on the special printers.

        Most staff and departments have no need of high quality photo printing, but maybe some do in your business.

        In that case, whichever department is using it gets charged for it out of their budget using a simple proxy - maybe charge a reasonable estimate of annual running costs, or something like "active hours" if usage is harder to predict.

        The same as the cost of printing a couple of reams of photo paper on the dye sublimation printer is charged by the page. Cost per page was never all that precise either.
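        A minimal sketch of that kind of "active hours" chargeback proxy; the annual cost and per-department hours below are made-up illustrative figures:

        ```python
        # Split an estimated annual running cost for a shared AI box across
        # departments by active hours, like per-page printer chargeback.
        # (All figures are illustrative assumptions, not real data.)
        annual_running_cost = 12_000  # assumed estimate, USD/year

        active_hours = {"marketing": 300, "engineering": 150, "finance": 50}
        total_hours = sum(active_hours.values())

        charges = {dept: annual_running_cost * hours / total_hours
                   for dept, hours in active_hours.items()}
        print(charges)  # marketing pays the lion's share
        ```

        Like cost-per-page, it was never going to be precise; the point is that heavy users feel the bill.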

      2. Jellied Eel Silver badge

        Re: Another Alternative

        ...but still doesn't address the op-ex of the 'lectrical tax on them little buggers.

        It's based on a bit of a straw man.

        That leaves the 500. Hardware costs alone for upgrading the 500 to AI capable will be 500*(6000-2000) = $2,000,000.

        And assumes that all 500 general purpose PCs would need to be upgraded to being 'AI capable'. In most businesses, you'd probably neither need nor want that. Most business PCs probably just run browser based apps and thin-ish clients to servers. They probably don't need an 'AI capable' PC, especially as most sensible AI stuff would be server based.

        Then again, there may be an argument to upgrade PCs. This one has an Intel i9-12900KF and a pair of RTX4090s. It is currently running Firefox, and an industrial automation simulation*. CPU is running at around 15%, GPU <30%. According to Intel, CPU has base power consumption of 150W, max 241W, and PC is currently drawing around 250W, which includes 3x 4K monitors. So it's one of the reasons I figured I'd upgrade to something more than I needed.

        If the PC is running flat out, then yes, it uses a lot of electricity. But I've rarely seen the CPU go over 20% unless I'm running benchmarks. Conversely, I've been forced to use businesses' PCs that have the CPU maxed out and can't cope with multiple apps open at the same time. So there's a trade-off between CPU & GPUs running closer to 100%, and a plain reduction in productivity with staff having to keep opening & closing apps to get work done. And then there's future-proofing by having PCs with enough grunt to cope with the next MS update, or even 'AI', if businesses ever find a use case for that.

        *Ok, Factorio. I'm almost in space!
