• FreedomAdvocate@lemmy.net.au
    6 hours ago

    Your very first statement, calling the basis of my argument incorrect, is itself incorrect lol.

    LLMs “learn” things from the content they consume. They don’t just take the content in wholesale and keep it there to regurgitate on command.

    On your last part: unless someone uses AI to recreate the tone etc. of a best-selling author *and then* markets their book/writing as being from said best-selling author, or uses trademarked characters etc., there’s no issue. You can’t copyright a style of writing.

    • elrik@lemmy.world
      2 hours ago

      I’ll repeat what you said with emphasis:

      AI can “learn” from and “read” a book in the same way a person can and does

      The emphasized part is incorrect. It’s not the same, yet your argument seems to be that because (your claim) it is the same, then it’s no different from a human reading all of these books.

      Regarding your last point, copyright law doesn’t just kick in because you try to pass something off as an original (for example, by marketing a book as being from a best-selling author). It applies based on similarity, whether you mention the original author or not.

    • WraithGear@lemmy.world
      5 hours ago

      If what you are saying is true, why were these “AIs” incapable of rendering a full wine glass? The models “know” the concept of a full glass of water, but because of humanity’s social pressures (a full wine glass being the epitome of gluttony), artwork did not depict a full wine glass. No matter how AI prompters demanded it, the model was unable to link the concepts until the imagery was literally created for it to regurgitate. It seems “AI” doesn’t really learn, but regurgitates art in collages of taken assets, smoothed over at the seams.

        • WraithGear@lemmy.world
          3 minutes ago

          “it was unable to link the concepts until it was literally created for it to regurgitate it out”

          -WraithGear

          The problem was solved before their patch. But the article just said that the model is changed by running it through a post check, just like what DeepSeek does. It does not talk about the fundamental flaw in how it creates; they assert it does, like they always did.

        • WraithGear@lemmy.world
          5 hours ago

          1. It’s not full, but closer than it was.

          2. I specifically said that the AI was unable to do it until someone specifically made a reference so that it could start passing the test, so it’s a little bit late to prove much.
          • alsimoneau@lemmy.ca
            33 minutes ago

            The concept of a glass being full and of a liquid being wine can probably be separated fairly well. I assume that as models got more complex they started being able to do this more.

          • alsimoneau@lemmy.ca
            35 minutes ago

            If someone asks for a glass of water, you don’t fill it all the way to the edge. This is way overfull compared to what you’re supposed to serve.