A shocking report has emerged from a joint investigation by the Wall Street Journal (WSJ) and academics at Stanford University and the University of Massachusetts Amherst, which shows that Instagram's recommendation algorithm promoted and boosted networks of pedophiles who sell child sexual abuse content on the platform.

Instagram allowed users to search hashtags related to child sexual abuse, including graphic terms such as #pedowhore, #preteensex, #pedobait and #mnsfw — the last an acronym for "minors not safe for work," researchers at Stanford University and the University of Massachusetts Amherst told the Wall Street Journal.

According to the Wall Street Journal report, the photo-sharing platform connects a network of accounts "openly devoted to the commission and purchase of underage-sex content".

Elon Musk also tweeted about the report, calling it "extremely concerning".

In response to the report, Meta said it is setting up an internal task force to investigate and address the issues raised. "Child exploitation is a horrific crime. We're continuously investigating ways to actively defend against this behaviour," the company added.

The report also alleges that even when users reported posts and accounts sharing child abuse content or suspect content, the platform's review team either cleared the content or responded with the message: "Because of the high volume of reports we receive, our team hasn't been able to review this post."

The report also examined other platforms but found them less conducive to the growth of such networks.

According to the WSJ, the Stanford investigators found "128 accounts offering to sell child-sex-abuse material on Twitter, less than a third the number they found on Instagram" despite Twitter having far fewer users, and observed that such content "does not appear to proliferate" on TikTok. The report noted that Snapchat did not actively promote such networks, since it is used mainly for direct messaging.