- An investigation published Wednesday described how Instagram's algorithms connect and promote accounts that facilitate and sell child sexual abuse material.
- Alex Stamos, one of the paper's authors, said researchers focused on Instagram because its "position as the most popular platform for teenagers globally makes it a critical part of this ecosystem."
- The investigation was conducted by The Wall Street Journal and researchers at Stanford University's Internet Observatory Cyber Policy Center and the University of Massachusetts Amherst.
Instagram's recommendation algorithms have been connecting and promoting accounts that facilitate and sell child sexual abuse content, according to an investigation published Wednesday.
Meta's photo-sharing service stands out from other social media platforms and "appears to have a particularly severe problem" with accounts showing self-generated child sexual abuse material, or SG-CSAM, Stanford University researchers wrote in an accompanying study. Such accounts purport to be operated by minors.
"Due to the widespread use of hashtags, relatively long life of seller accounts and, especially, the effective recommendation algorithm, Instagram serves as the key discovery mechanism for this specific community of buyers and sellers," according to the study, which was cited in the investigation by The Wall Street Journal, Stanford University's Internet Observatory Cyber Policy Center and the University of Massachusetts Amherst.
While the accounts could be found by any user searching for explicit hashtags, the researchers discovered Instagram's recommendation algorithms also promoted them "to users viewing an account in the network, allowing for account discovery without keyword searches."
A Meta spokesperson said in a statement that the company has been taking several steps to fix the issues and that it "set up an internal task force" to investigate and address these claims.
"Child exploitation is a horrific crime," the spokesperson said. "We work aggressively to fight it on and off our platforms, and to support law enforcement in its efforts to arrest and prosecute the criminals behind it."
Alex Stamos, Facebook's former chief security officer and one of the paper's authors, said in a tweet Wednesday that the researchers focused on Instagram because its "position as the most popular platform for teenagers globally makes it a critical part of this ecosystem." However, he added that "Twitter continues to have serious issues with child exploitation."
Stamos, who is now director of the Stanford Internet Observatory, said the problem has persisted after Elon Musk acquired Twitter late last year.
"What we found is that Twitter's basic scanning for known CSAM broke after Mr. Musk's takeover and was not fixed until we notified them," Stamos wrote.
"They then cut off our API access," he added, referring to the software interface that lets researchers access Twitter data to conduct their studies.
Earlier this year, NBC News reported that multiple Twitter accounts offering or selling CSAM had remained available for months, even after Musk pledged to address problems with child exploitation on the social messaging service.
Twitter didn't provide a comment for this story.