Mastodon, a decentralized alternative to Twitter, has a serious problem with child sexual abuse material (CSAM), according to researchers from Stanford University. In just two days, the researchers found 112 instances of known CSAM across roughly 325,000 posts on Mastodon. They also found hundreds of posts containing CSAM-related hashtags, along with links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken offline for a time because of CSAM posted to it. The researchers suggest that decentralized networks like Mastodon need more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • Jordan Lund@lemmy.one · 1 year ago

    “massive child abuse material problem”

    “112 instances of known CSAM across 325,000 posts”

    While any instance is unacceptable, does 112/325,000 constitute a “massive problem”?

    0.034462% of posts are unacceptable! Massive problem!
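
    For anyone who wants to check that arithmetic, here's a quick Python sketch. The 112 and 325,000 figures come from the article; everything else is just the division:

    ```python
    # Sanity check of the rate quoted above.
    # Figures from the article: 112 known-CSAM posts in ~325,000 sampled posts.
    known_csam = 112
    total_posts = 325_000

    rate = known_csam / total_posts
    print(f"fraction: {rate:.6f}")         # 0.000345
    print(f"percent:  {rate * 100:.4f}%")  # 0.0345%
    ```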

    • ParsnipWitch@feddit.de · 1 year ago

      That’s just the material they knew was CSAM from previous investigations.

      There were also 713 uses of the top 20 CSAM-related hashtags across the Fediverse on posts that contained media, as well as 1,217 text-only posts that pointed to “off-site CSAM trading or grooming of minors.” The study notes that the open posting of CSAM is “disturbingly prevalent.”