Algorithms on Instagram, the social media platform owned by Facebook parent company Meta Platforms, reportedly connect pedophiles with purveyors of child sexual abuse material (CSAM), who essentially use the platform to offer a “menu” of children.
A new report from the Wall Street Journal claims that Instagram “helps connect and promote a vast network of accounts openly devoted to the commission and purchase of underage-sex content,” citing an investigation mounted by the publication alongside researchers at Stanford University and the University of Massachusetts Amherst.
Instagram is accused of hosting accounts that use hashtags including “#pedowhore” and “#preteensex” that allow interested users to tap into the “niche” community of pedophiles and CSAM sellers:
Though out of sight for most on the platform, the sexualized accounts on Instagram are brazen about their interest. The researchers found that Instagram enabled people to search explicit hashtags such as #pedowhore and #preteensex and connected them to accounts that used the terms to advertise child-sex material for sale. Such accounts often claim to be run by the children themselves and use overtly sexual handles incorporating words such as “little slut for you.”
The report explains that Instagram users selling CSAM “generally don’t publish it openly, instead posting ‘menus’ of content.” Some accounts claiming to be run by children invite Instagram users “to commission specific acts.” Worse, “Some menus include prices for children harming themselves and ‘imagery of the minor performing sexual acts with animals.’” Such material violates both Meta’s platform rules and federal law.
Though the hashtags and braggadocious marketing may suggest otherwise, Meta claims it is “continuously investigating ways to actively defend against this behavior” and says it has taken down 27 separate pedophile networks in the past two years.
In response to the report, the Journal noted, Meta “said it has blocked thousands of hashtags that sexualize children, some with millions of posts, and restricted its systems from recommending users search for terms known to be associated with sex abuse,” and said it is working to prevent its algorithms from recommending such content in the future.
This is not the first time Meta has been connected with CSAM.
In 2021, Meta faced harsh headlines when data from the National Center for Missing and Exploited Children revealed that Facebook, the largest social media website in the West, hosted the “vast majority” of online child exploitation reports.
