AI trained on real child sex abuse images to detect new CSAM

For years, hashing technology has made it possible for platforms to automatically detect known child sexual abuse material (CSAM) and stop kids from being retraumatized online. Rapidly detecting new or previously unknown CSAM, however, has remained a harder problem for platforms, and new victims continue to be harmed while that material goes unflagged. Now, AI may be ready to change that.
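Conceptually, hash-based detection works like the minimal sketch below. This is a hypothetical illustration, not any platform's actual implementation: real deployments typically use perceptual hashes, such as Microsoft's PhotoDNA, rather than a plain cryptographic hash, so that resized or re-encoded copies still match, and the hash list itself comes from vetted clearinghouses.

```python
import hashlib

# Hypothetical hash list; in production this would be a vetted database of
# hashes of previously verified CSAM supplied by a clearinghouse, and
# perceptual hashing would be used so altered copies still match.
KNOWN_CSAM_HASHES = {
    "placeholder-hash-of-a-previously-verified-image",
}

def is_known_csam(file_bytes: bytes) -> bool:
    """Return True if an upload's hash matches the known-CSAM hash list."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_CSAM_HASHES
```

The key limitation, and the gap the new tool targets, is that this approach only ever matches material that has already been found, verified, and hashed; brand-new imagery passes through undetected.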

Today, a prominent child safety organization, Thorn, in partnership with a leading cloud-based AI solutions provider, Hive, announced the release of an AI model designed to flag unknown CSAM at upload. It is the first AI technology aimed at exposing unreported CSAM at scale.

An expansion of Thorn's CSAM detection tool, Safer, the new "Predict" feature uses "advanced machine learning (ML) classification models" to "detect new or previously unreported CSAM and child sexual exploitation behavior (CSE), generating a risk score to make human decisions easier and faster."

The model was trained in part using data from the National Center for Missing and Exploited Children's (NCMEC) CyberTipline, relying on real CSAM data to detect patterns in harmful images and videos. Once suspected CSAM is flagged, a human reviewer remains in the loop to ensure oversight. It could potentially be used to probe suspected CSAM rings proliferating online.
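Safer's actual interface isn't described in detail here, but the flow the announcement sketches, a classifier producing a risk score with a human making the final call, might look roughly like the following. All names and the threshold value are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Upload:
    upload_id: str
    content: bytes

@dataclass
class ReviewQueue:
    pending: list[Upload] = field(default_factory=list)

# Illustrative threshold; a real system would tune this against its
# tolerance for false positives versus false negatives.
REVIEW_THRESHOLD = 0.8

def risk_score(upload: Upload) -> float:
    """Stand-in for the ML classifier; returns a score in [0.0, 1.0]."""
    ...  # model inference on upload.content would happen here
    return 0.0

def route_upload(upload: Upload, queue: ReviewQueue) -> None:
    """Flag high-risk uploads for human review rather than acting automatically."""
    if risk_score(upload) >= REVIEW_THRESHOLD:
        # Per the announcement, a human reviewer stays in the loop for oversight.
        queue.pending.append(upload)
```

Note that the risk score decides nothing on its own; it only prioritizes what human moderators look at first, which is what makes their decisions "easier and faster."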

It could also, of course, make mistakes, but Kevin Guo, Hive's CEO, told Ars that extensive testing was conducted to substantially reduce both false positives and false negatives. While he wouldn't share stats, he said that platforms would not be interested in a tool where "99 out of a hundred things the tool is flagging aren't correct."

Rebecca Portnoff, Thorn's vice president of data science, told Ars that it was a "no-brainer" to partner with Hive on Safer. Hive provides content moderation models used by hundreds of popular online communities, and Guo told Ars that platforms have consistently asked for tools to detect unknown CSAM, much of which currently festers in blind spots online because hash databases will never expose it.
