Writing a 100-word email using ChatGPT (GPT-4, latest model) consumes one 500 ml bottle of water. It uses 140 Wh of energy, enough for seven full charges of an iPhone Pro Max.

  • teh7077@lemmy.today · 21 hours ago

    That’s what I always thought when reading this and other articles about the estimated power consumption of GPT-4. Run a decent 7B LLM on consumer hardware like the Steam Deck and you get your email in a minute with the fans barely spinning up.

    Then I read that GPT-4 is supposedly a 1760B-parameter model. (https://en.m.wikipedia.org/wiki/GPT-4#Background) I don’t know exactly how energy usage scales with model size, but I’d consider it plausible that we are talking orders of magnitude above the typical local LLM.
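    To put rough numbers on “orders of magnitude”, here’s a back-of-envelope sketch. The power draw and generation time for the local model are assumptions for illustration, not measurements; the 140 Wh figure is the article’s claim and the 1760B figure is the rumored (unconfirmed) GPT-4 size:

    ```python
    # Back-of-envelope comparison; all local-model figures are assumptions.
    LOCAL_POWER_W = 30   # assumed draw of a handheld/laptop while generating
    LOCAL_TIME_S = 60    # assumed time to write a short email locally

    local_wh = LOCAL_POWER_W * LOCAL_TIME_S / 3600  # watt-seconds -> Wh
    gpt4_wh = 140                                   # figure claimed in the article

    print(f"local: ~{local_wh:.1f} Wh, GPT-4 claim: {gpt4_wh} Wh, "
          f"ratio ~{gpt4_wh / local_wh:.0f}x")
    print(f"parameter ratio: 1760B / 7B = {1760 / 7:.0f}x")
    ```

    Under those assumptions the energy ratio (~280x) lands in the same ballpark as the parameter ratio (~250x), which at least makes the “orders of magnitude” claim plausible.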

    Considering that the email from the local LLM will be good enough 99% of the time, GPT-4 may just be horribly inefficient, existing mainly to score higher on some synthetic benchmarks?

    • douglasg14b@lemmy.world · 24 hours ago

      Computational demands scale aggressively with model size.

      And if you want a response back in a reasonable amount of time you’re burning a ton of power to do so. These models are not fast at all.

      • teh7077@lemmy.today · 23 hours ago

        Thanks for confirming my suspicion.

        So, the whole debate about the “environmental impact of AI” is not really about generative AI as such. It comes down to people using disproportionately large models for simple tasks that could be done just as well by smaller ones run locally. Or, worse yet, asking a behemoth model like GPT-4 about something that could and should have been a simple search engine query, which I (subjectively) feel has become a trend in everyday tech usage…