AI me to the Moon... Carbon footprint for 'training GPT-3' same as driving to our natural satellite and back

Training OpenAI’s giant GPT-3 text-generating model is akin to driving a car to the Moon and back, computer scientists reckon. More specifically, they estimated teaching the neural super-network in a Microsoft data center using Nvidia GPUs required roughly 190,000 kWh, which using the average carbon intensity of America would …
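
As a rough sanity check on the headline claim, here is a sketch of the arithmetic; the grid carbon intensity and per-km car emissions below are assumed typical values, not the study's actual inputs:

```python
# Back-of-the-envelope check of the Moon-trip equivalence. The grid
# intensity and car emissions are assumed figures, not the study's inputs.
energy_kwh = 190_000            # estimated training energy from the article
grid_kg_co2_per_kwh = 0.45      # assumed average US carbon intensity
car_g_co2_per_km = 120          # assumed average petrol car
moon_round_trip_km = 768_800    # ~2 x mean Earth-Moon distance

co2_kg = energy_kwh * grid_kg_co2_per_kwh       # ~85,500 kg CO2
drive_km = co2_kg * 1_000 / car_g_co2_per_km    # ~712,500 km
print(f"{co2_kg:,.0f} kg CO2 ~ {drive_km:,.0f} km of driving, "
      f"{drive_km / moon_round_trip_km:.0%} of a Moon round trip")
```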

  1. Neil Barnes Silver badge

    So...

    Did they use an AI to calculate the effect?

  2. Jimmy2Cows Silver badge

    190,000 kWh...

    Sounds big, until it's shown as equivalent to the annual consumption of 126 homes in Denmark. At which point it becomes clear the energy used is a piffling rounding error compared to overall world, or even city, consumption. Well, well... something with a lot of computers running full tilt consumes a significant amount of energy. Who knew?
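
    For scale, the per-home figure that comparison implies (a rough sketch assuming a plain division; the researchers' actual household baseline isn't stated here):

    ```python
    # Implied per-home figure behind the "126 Danish homes" comparison
    # (assumes a plain division; the household baseline is not given here).
    training_kwh = 190_000
    homes = 126
    print(f"{training_kwh / homes:,.0f} kWh per home per year")  # ~1,508
    ```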

    My question therefore is "In the grand scheme of things, so what?".

    The eggheads who produced this guesstimate, based at the University of Copenhagen in Denmark, are using seemingly big numbers in an effort to grab headlines and stoke outrage among the hard-of-thinking.

    1. Harry Kiri

      Re: 190,000 kWh...

      The tragedy of the commons...

      190,000 kWh is a staggering amount of energy that doesn't produce anything tangible. If people using GPT-3 or any other AI are 'clever', then they should be responsible enough to account for the impact of what they're doing.

      I've read an estimate that data centres account for 3% of global emissions - the same as air travel. Yep, there are error bars there, but these things have a cost that needs some recognition.

      1. Jimmy2Cows Silver badge

        Re: Tragedy of the commons

        Hardly.

        190,000 kWh is a staggering amount of energy...

        It really isn't. Over a year (i.e. the period over which emissions and energy consumption tend to be measured), that's 520 Wh per day, i.e. really not much at all. While I agree that GPT-3 may be of limited benefit, that doesn't detract from my original point.

        ...that doesn't produce anything tangible.

        If all scientific endeavour had to produce something tangible to be worth doing, we'd still be living in caves. GPT-3 and other such projects are attempting to advance the state of the art in AI. Personally, I still think there's a long, long way to go before AI fulfils the "I" part, but that's irrelevant.

        I've read an estimate that data centres account for 3% of global emissions - the same as air travel

        And you accuse me of tragedy of the commons. You're conflating two entirely different energy uses, taking what's currently a non-essential ecological demon (air travel) and comparing it with an essential underpinning of our daily lives. Apples to oranges.

        Yes, data centres may well account for 3% of global emissions (citation, please); there are a lot of them, and most of our modern life is supported by those things. You probably wouldn't be able to interact with El Reg were it not for data centres.

        Whether / how much our lives are improved by data centres, and our data-centric existence in general, is a different philosophical debate.

        1. porkfat

          Re: Tragedy of the commons

          I think you might have missed a few zeros, three in fact!
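
          Worked through, for anyone following along (a quick sketch of the sum being corrected):

          ```python
          # The sum being corrected: 190,000 kWh spread over a year is
          # ~520 kWh per day, i.e. 520,000 Wh -- three zeros more than 520 Wh.
          training_kwh = 190_000
          per_day_kwh = training_kwh / 365
          print(f"{per_day_kwh:.0f} kWh/day = {per_day_kwh * 1_000:,.0f} Wh/day")
          ```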

      2. Anonymous Coward
        Devil

        Re: 190,000 kWh...

        A bit in the bucket compared to Faecesbook, which also produces nothing tangible.

  3. Anonymous Coward
    Boffin

    The surprising thing

    is that it is not that large. If it really is 190,000 kWh, then, since a human takes about 66 W to run (which I think is about right), this is about 330 years of a human. So in particular it's not millions of times more energy intensive than a human: it's 33 times worse, if a human can learn a natural language pretty well by age 10.
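
    A quick sketch of that sum (the 66 W figure is the assumption above; 190,000 kWh is the article's estimate):

    ```python
    # Reproducing the comparison above; 66 W for a human is an assumption,
    # 190,000 kWh is the training-energy figure from the article.
    human_watts = 66
    human_kwh_per_year = human_watts * 24 * 365 / 1_000   # ~578 kWh/year
    training_kwh = 190_000
    human_years = training_kwh / human_kwh_per_year       # ~330 years
    print(f"~{human_years:.0f} human-years of energy, "
          f"~{human_years / 10:.0f}x a ten-year language learner")
    ```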

    Of course GPT-3 tells us fuck all about actual human language processing: training it to produce language which might be plausible enough for some purposes, but which is clearly not generated by anything with any level of general intelligence, took something like 10^12 words of training data. If a human learnt a language by being exposed to 5 words a second, 8 hours a day, for ten years (which is much, much more than they actually require), they would be exposed to about 500 million words: about 2,000 times less (and in real life, humans are competent much earlier and see much less language).
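
    And the word-count side, worked through (both figures as given above):

    ```python
    # The word-count side of the comparison (both figures as given above).
    human_words = 5 * 3600 * 8 * 365 * 10   # 5 words/s, 8 h/day, 10 years
    gpt3_words = 1e12                       # rough training-set size
    print(f"human: ~{human_words / 1e6:.0f}M words, "
          f"ratio: ~{gpt3_words / human_words:,.0f}x")  # ~526M, ~1,903x
    ```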

    This is just the latest AI hype cycle: the next AI winter will come soon.

  4. Filippo Silver badge

    I bet that there are any number of industrial factories and processing plants that consume way, way more energy than that, every day, in order to produce items that humanity could easily do without.

    Even if we just limit ourselves to data processing, I'm certain that data centers consume vastly more energy on shifting cat pictures and porn than on scientific purposes.

    And yet, who are we piling on here? The folks who make (and buy) tons of plastic toys that are basically the same as last year's tons of plastic toys in landfills? The folks who absolutely have to send a 4k video of their toddler playing with said plastic toys to everyone they know? No, we pile on academic researchers.

    Stick around: next up, we pile on myself, for whataboutism.
