Over just a few months, ChatGPT went from correctly answering a simple math problem 98% of the time to just 2%, study finds. Researchers found wild fluctuations—called drift—in the technology’s abi…

  • TimewornTraveler@lemm.ee · 1 year ago

    so it confidently spews a bunch of incorrect shit, acts humble and apologetic while correcting none of its behavior, and constantly offers unsolicited advice.

    I think it trained on Reddit data

    • cxx@lemmy.world · 1 year ago

      > acts humble and apologetic

      We must be using different Reddits, my friend