• Even_Adder@lemmy.dbzer0.com · 1 year ago

      These models are just a collection of observations in relation to each other. We need to be careful not to weaken fair use and hand corporations a monopoly on a public technology by making it prohibitively expensive for regular people to keep developing our own models. Mega corporations already have their own datasets, and the money to buy more. They can also make users sign predatory ToS granting them exclusive access to user data, effectively selling our own data back to us. Regular people, who could have had access to a corporate-independent tool for creativity, education, entertainment, and social mobility, would instead be left worse off, with fewer rights than where they started.

          • oomphaloompha@beehaw.org · 1 year ago

            It does not, and I specifically deleted my comment because I knew someone like you was going to repeat the same arguments you people keep repeating ad nauseam. I’ve read all of it, and you jumping to conclusions about me only says something about you. There are problematic uses, and there are issues that should be discussed and taken into consideration, but conflating one specific use of a tool with its only use case in order to argue for banning the entire tool is pretty shitty.

    • FaceDeer@kbin.social · 1 year ago

      The only thing this ruling really says is “AIs are not legal persons, and copyright can only be held by legal persons,” which is neither particularly unexpected nor especially useful for deciding the other issues raised by copyright in AI outputs.