ChatGPT is full of sensitive private information and spits out verbatim text from CNN, Goodreads, WordPress blogs, fandom wikis, Terms of Service agreements, Stack Overflow source code, Wikipedia pages, news blogs, random internet comments, and much more.

Using this tactic, the researchers showed that there are large amounts of personally identifiable information (PII) in OpenAI’s large language models. They also showed that, on a public version of ChatGPT, the chatbot spat out large passages of text scraped verbatim from other places on the internet.

“In total, 16.9 percent of generations we tested contained memorized PII,” they wrote, which included “identifying phone and fax numbers, email and physical addresses … social media handles, URLs, and names and birthdays.”
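The measurement quoted above, flagging generations that contain memorized PII, can be approximated with a simple pattern scan over model outputs. A minimal sketch in Python; the regexes and category names here are illustrative assumptions, not the researchers’ actual detection pipeline:

```python
import re

# Illustrative patterns for a few of the PII categories quoted above
# (simplified assumptions; real PII detection is far more involved).
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"(?:\+?\d{1,3}[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}"),
    "url": re.compile(r"https?://\S+"),
}

def flag_pii(generation: str) -> list[str]:
    """Return the PII categories found in a single model generation."""
    return [name for name, pattern in PII_PATTERNS.items()
            if pattern.search(generation)]

def pii_rate(generations: list[str]) -> float:
    """Fraction of generations containing at least one PII match."""
    hits = sum(1 for g in generations if flag_pii(g))
    return hits / len(generations)
```

Names, physical addresses, and social media handles, also listed in the quote, would need more than regular expressions to catch reliably.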

Edit: The full paper that’s referenced in the article can be found here

  • Chozo@kbin.social · 1 year ago

    I’d have to imagine that this PII was made publicly available in order for GPT to have scraped it.

      • Chozo@kbin.social · 1 year ago

        It doesn’t inherently mean it isn’t free to use, either. The article doesn’t say whether the PII in question was intended to be private or public.

        • Davel23@kbin.social · 1 year ago

          I could leave my car with the keys in the ignition in the bad part of town. It’s still not legal to steal it.

          • Chozo@kbin.social · 1 year ago

            Again, the article doesn’t say whether or not the data was intended to be public. People post their contact info online on purpose sometimes, you know. Businesses and shit. Which seems most likely to be what’s happened, given that the example has a fax number.

          • Dran@lemmy.world · 1 year ago

            If someone had some theoretical device that could x-ray, 3D-image, and 3D-print an exact replica of your car, though, that would be legal. That’s a closer analogy.

            It’s not illegal to reverse-engineer and reproduce something for personal use. Selling the reproduction, though, is legally questionable. However, if the car were open-source or otherwise not copyrighted/patented, it probably would be legal to sell the reproduction.

        • RenardDesMers@lemmy.ml · 1 year ago

          Under EU law, PII must be accessible, correctable, and deletable by the people it describes. I don’t think ChatGPT would allow me to delete information about me found in their training data.

          • Touching_Grass@lemmy.world · edited · 1 year ago

            Ban all European IPs from using these applications.

            But again, is this your information, as in random individuals’, or is it really some company roster listing CEOs, grabbed off some third-party website that none of us are actually on, being passed off as if it were regular folks’ information?

            • Catoblepas@lemmy.blahaj.zone · 1 year ago

              “Just ban everyone from places with legal protections” is a hilarious solution to a PII-spitting machine, thanks for the laugh.

              • Touching_Grass@lemmy.world · edited · 1 year ago

                You’re pretentiously laughing at region locking, which has been around for a while. You can’t untrain these AIs. This PII, which has always been publicly available and only seems to be an issue now, isn’t something they can pull out and retrain around. So if it’s that big an issue, region lock them. Fuck em. But again, this doesn’t sound like Joe Blow’s information is exposed. It seems more like websites scraping company details, which these AIs then scrape in turn.
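For what it’s worth, the region locking suggested above is straightforward in principle: reject requests whose source IP geolocates to a blocked jurisdiction. A minimal sketch, with the GeoIP lookup stubbed out as a hypothetical table (production services query a real geolocation database):

```python
# EU member-state country codes (an EEA-wide block would also add IS, LI, NO).
EU_COUNTRIES = {
    "AT", "BE", "BG", "HR", "CY", "CZ", "DK", "EE", "FI", "FR",
    "DE", "GR", "HU", "IE", "IT", "LV", "LT", "LU", "MT", "NL",
    "PL", "PT", "RO", "SK", "SI", "ES", "SE",
}

# Hypothetical stub standing in for a real GeoIP database lookup.
GEOIP_STUB = {
    "203.0.113.7": "FR",    # documentation-range IP, mapped to France for the example
    "198.51.100.23": "US",  # documentation-range IP, mapped to the US
}

def is_blocked(ip: str) -> bool:
    """Reject a request if its source IP geolocates to an EU member state."""
    country = GEOIP_STUB.get(ip)    # real code would query a GeoIP database
    return country in EU_COUNTRIES  # unknown IPs fall through as allowed
```

Geolocation is approximate and trivially defeated by a VPN, so this is a blunt instrument at best.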

    • Skull giver@popplesburger.hilciferous.nl · edited · 1 year ago

      The source of the PII rarely matters when you don’t have explicit permission to gather it. If someone exercises their legal right to demand correction (e.g. a name change or a misattribution, but possibly also a takedown demand), we may see some pretty weird lawsuits and fines.

      The Belgian ING Bank was ordered by the court to alter their old COBOL systems after a correction demand was issued by a customer whose accented name didn’t appear correctly. I imagine ChatGPT may find itself in a similar “I don’t care how you comply with the law, you should’ve figured that out years ago” situation.

    • Touching_Grass@lemmy.world · 1 year ago

      large amounts of personally identifiable information (PII)

      Yeah, the wording is kind of ambiguous. Are they saying it’s a private individual’s phone number or the number of Ted and Sons Plumbing and Heating?