A joint investigation by the Wall Street Journal and academics at Stanford University and the University of Massachusetts Amherst found that Instagram's recommendation algorithms promote a network of accounts that commission and sell CSAM (child sexual abuse material).
Instagram’s recommendation system “connects pedophiles and guides them to content sellers,” per the report. Researchers found accounts blatantly advertising these images for purchase or commission, using explicit hashtags that cater to pedophiles. These accounts offer “menus” of content to users, including videos and imagery of self-harm and bestiality.
The investigation found that complaints and reports about these Instagram accounts were routinely ignored, and that the platform's moderation practices were severely lacking; the WSJ and the academics even had some of their own reports of CSAM rejected. Meta (formerly Facebook) said it is setting up an internal task force to address the issues brought to light by the investigation.
“Child exploitation is a horrific crime,” the company said. “We’re continuously investigating ways to actively defend against this behavior.”