An international operation against child sexual exploitation, supported by Europol and led by the State Criminal Police of Bavaria (Bayerisches Landeskriminalamt) and the Bavarian Central Office for the Prosecution of Cybercrime (ZCB), has resulted in the shutdown of Kidflix, one of the largest paedophile platforms in the world.
During the investigation, Europol’s analysts from the European Cybercrime Centre (EC3) provided intensive operational support to national authorities by analysing thousands of videos.
I don’t know how you can do this job and not get sick because looking away is not an option
This kind of shit is why I noped out of the digital forensics field. I would have killed myself if I had to see that shit every day.
I’m sure many of them numb themselves to it, and pretend it isn’t real in order to do the job. Then unfortunately, I’m sure some of them get addicted themselves.
Similar to undercover cops who do drugs while undercover, then get addicted to the drugs.
You do get sick, and I would be most surprised if they didn't allow people to look away and take breaks/get support as needed.
Most emergency line operators and similar kinds of inspectors get that kind of support, so it would be odd if they did not.
Yes, my wife used to work in the ER, and she still tells the same stories over and over again 15 years later, because the memories of the horrible shit she saw don't go away.
Indeed, and in my country the psychological support is even mandatory. Furthermore, I know there have been pilot programmes using ML to go through the videos: when the system detects explicit material, an officer has to confirm it, which saves them from watching every video in full, all day, every day. I think Microsoft has also been working on a database of hashes, provided by law enforcement, to automatically detect material that has already been identified. All in all, a gruesome job, but fortunately technology is alleviating the harshest parts of it bit by bit.
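The hash-matching idea described above (Microsoft's PhotoDNA works roughly this way, though it uses proprietary perceptual hashes that survive re-encoding, not exact hashes) can be sketched like this. This is only an illustrative triage sketch: the hash list and file paths are made up, and plain SHA-256 only matches byte-identical files.

```python
import hashlib

# Stand-in for a law-enforcement-provided list of known-material hashes.
# This example entry is just the SHA-256 of the bytes b"test".
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_sha256(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a file in chunks so large videos never need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def triage(paths):
    """Split files into already-identified material (no human viewing
    needed) and files that still require officer review."""
    known, needs_review = [], []
    for p in paths:
        (known if file_sha256(p) in KNOWN_HASHES else needs_review).append(p)
    return known, needs_review
```

Only the `needs_review` bucket would ever reach a human, which is exactly the workload reduction the comment describes.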
And this is for law enforcement level of personnel. Meta and friends just outsource content moderation to low-wage countries and let the poors deal with the PTSD themselves.
Let’s hope that’s what AI can help with, instead of techbrocracy