I hear people saying things like “ChatGPT is basically just fancy predictive text”. I’m certainly not in the “it’s sentient!” camp, but it seems pretty obvious that a lot more is going on than just predicting the most likely next word.

Even if it’s predicting word by word within a bunch of constraints and structures inferred from the question / prompt, that’s pretty interesting. Tbh, I’m more impressed by ChatGPT’s ability to appear to “understand” my prompts than I am by the quality of the output. Even though its writing is generally a mix of bland, obvious, and inaccurate, it mostly does provide a plausible response to whatever I’ve asked / said.
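For what “predicting the most likely next word” means at its simplest, here’s a toy sketch using bigram counts over a tiny made-up corpus. Real LLMs like ChatGPT use a neural network over subword tokens and condition on the whole context, not just the previous word, but the generation loop has the same shape: score candidate continuations, sample one, repeat.

```python
import random
from collections import Counter, defaultdict

# Tiny toy corpus (made up for illustration).
corpus = "the cat sat on the mat and the cat ran".split()

# Count how often each word follows each other word (bigram counts).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word(word):
    """Sample a next word in proportion to how often it followed `word`."""
    counts = following[word]
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# Generate word by word, starting from "the".
w = "the"
out = [w]
for _ in range(5):
    if not following[w]:  # dead end: word never seen with a successor
        break
    w = next_word(w)
    out.append(w)
print(" ".join(out))
```

The interesting part of the debate is how much of what ChatGPT does is captured by this loop scaled up, versus genuinely new behavior emerging from the scale.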

Anyone feel like providing an ELI5 explanation of how it works? Or any good links to articles / videos?

  • LesserAbe@lemmy.world · 10 months ago

    I think this explanation would be more satisfying if we had a better understanding of how the human brain produces intelligence.

    • SorteKanin@feddit.dk · 10 months ago

      I agree. We don’t actually know that the brain isn’t just doing the same thing as ChatGPT. It probably isn’t, but we don’t really know.

      • Dran@lemmy.world · 10 months ago

        Considering that we can train digital statistical models to read thoughts via brain scans, I think it’s more likely than not that we are more similar.