• Rythm@lemmy.ml
    17 hours ago

    Indeed, but in my country psychological support is even mandatory. Furthermore, I know there have been pilot programs using ML to go through the videos. When the system detects explicit material, an officer has to confirm it, but it spares them from watching everything all day, every day, for each video. I think Microsoft has also been working on a database of hashes, provided by law enforcement, to automatically detect material that has already been identified. All in all, a gruesome job, but fortunately technology is alleviating the harshest tasks bit by bit.
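    The hash-database idea sketched above can be illustrated like this. Note this is a hypothetical, simplified sketch: real systems such as Microsoft's PhotoDNA use perceptual hashes that survive re-encoding and resizing, whereas a plain SHA-256 only matches byte-identical files, and the hash set and function names here are made up for illustration.

    ```python
    import hashlib

    # Hypothetical set of hashes of already-identified material, as might
    # be provided by law enforcement. (This example value is simply the
    # SHA-256 of b"foo", used as a stand-in.)
    KNOWN_HASHES = {
        "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
    }

    def flag_for_review(data: bytes) -> str:
        """Return an automatic verdict; a human officer still confirms it."""
        digest = hashlib.sha256(data).hexdigest()
        if digest in KNOWN_HASHES:
            return "known-material"    # auto-detected, queued for confirmation
        return "needs-classifier"      # unknown file, hand off to the ML model
    ```

    The point of the design is that only material the automation cannot resolve (new, never-seen files) ever needs human review, which is exactly the workload reduction described above.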

    • ilega_dh@feddit.nl
      17 hours ago

      And that's for law-enforcement-level personnel. Meta and friends just outsource content moderation to low-wage countries and let the poors deal with the PTSD themselves.

      Let’s hope that’s what AI can help with, instead of techbrocracy