Not a good look for Mastodon - what can be done to automate the removal of CSAM?

    • sugar_in_your_tea@sh.itjust.works · 1 year ago

      What’s the point of reporting it to authorities? It’s not illegal, nor should it be, since there’s no victim; all reporting it does is take up valuable time that could be spent tracking down actual abuse.

      • balls_expert@lemmy.blahaj.zone · 1 year ago

        It’s illegal in a lot of places, including where I live.

        In the US you have the PROTECT Act of 2003:

        (a) In General.—Any person who, in a circumstance described in subsection (d), knowingly produces, distributes, receives, or possesses with intent to distribute, a visual depiction of any kind, including a drawing, cartoon, sculpture, or painting, that—
        (1) (A) depicts a minor engaging in sexually explicit conduct; and (B) is obscene; or
        (2) (A) depicts an image that is, or appears to be, of a minor engaging in graphic bestiality, sadistic or masochistic abuse, or sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal, whether between persons of the same or opposite sex; and (B) lacks serious literary, artistic, political, or scientific value;
        or attempts or conspires to do so, shall be subject to the penalties provided in section 2252A(b)(1), including the penalties provided for cases involving a prior conviction.

        It’s linked to the obscenity doctrine:

        https://www.law.cornell.edu/uscode/text/18/1466A

        • sugar_in_your_tea@sh.itjust.works · 1 year ago

          Wow, that’s absolutely ridiculous, thanks for sharing! That would be a very unpopular bill to get overturned…

          I guess it fits with the rest of the stupidly named bills. It doesn’t protect anything; it just prosecutes undesirable behaviors.

    • priapus@sh.itjust.works · 1 year ago

      Definitions of CSAM definitely do not include illustrated and simulated forms. They do not have a victim and therefore cannot be abuse. I agree that it should not be allowed on public platforms, which is why all instances hosting it should be defederated. Despite this, it is not illegal, so reporting it to authorities is a waste of time for you and for the authorities who are trying to remove and prevent actual CSAM.

      • balls_expert@lemmy.blahaj.zone · 1 year ago

        You can click the Wikipedia link and check its sources to see that CSAM definitions do indeed include illustrated and simulated forms.

        They don’t actually need a victim to be defined as such.

        • priapus@sh.itjust.works · 1 year ago

          That Wikipedia article is about CP, a broader topic. Practically zero authorities will include illustrated and simulated forms of CP in their definitions of CSAM.

            • balls_expert@lemmy.blahaj.zone · 1 year ago

            I assumed it was the same thing, but if you’re placing the bar of acceptable content below child porn, I don’t know what to tell you.

              • priapus@sh.itjust.works · 1 year ago

              That’s not what I was debating. I was debating whether or not it should be reported to authorities. I made it clear in my previous comment that it is disturbing and should always be defederated.