Elon Musk’s X is now scanning videos and GIFs uploaded to the platform for child sexual abuse materials (CSAM) as part of wider efforts to crack down on child sexual exploitation online.
“At X, we have zero tolerance for child sexual exploitation, and we are determined to make X inhospitable for actors who seek to exploit minors in any way,” the company said.
The platform’s safety team also noted that it has “strengthened enforcement and refined detection mechanisms” throughout the year as part of its efforts to combat CSAM on the site, and that it has increased the number of referrals sent to the National Center for Missing and Exploited Children (NCMEC).