YouTube on Thursday cracked down on pedophilic content on its site by purging comments, a powerful but often overlooked social layer of the platform, on tens of millions of videos.
A video blogger this week published a report on YouTube documenting how the platform's comments and recommendations direct users to potentially sexual videos of children, allowing viewers to participate in what the report called a “soft-core pedophile ring.” YouTube also terminated more than 400 channels on Thursday that had posted such comments on videos featuring children.
Several major brands, such as Disney and Nestlé, this week halted their advertising on YouTube because their ads were played alongside videos with abusive or sexually explicit comments, echoing a brand boycott a couple of years earlier, when advertisers protested the placement of their spots alongside inappropriate videos.
YouTube’s latest controversy focuses on the abusive aspect of its comments section.
“Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube,” YouTube spokeswoman Andrea Faville said in a statement Thursday. “We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors. There’s more to be done, and we continue to work to improve and catch abuse more quickly.”
In a video that has been viewed nearly 2 million times since its release Sunday, video blogger Matt Watson detailed how users who visit YouTube for bikini shopping videos can eventually be nudged toward videos featuring young girls. After a user clicks on several bikini videos, YouTube’s recommendation engine suggests videos with minors, Watson said. The videos themselves are not sexual in nature; they show children talking to the camera, performing gymnastics or playing with toys. But users interpret them in inappropriate ways. The comments on the videos include hyperlinked time stamps, Watson said, allowing users to jump to moments when the girls are in compromised positions; in other instances, users posted sexually explicit comments about the children.
“Once you are in this loophole, there is nothing but more videos of little girls,” he said in the video. “How has YouTube not seen this?”
YouTube also said it removed dozens of videos that were posted without malicious intent but were nonetheless putting children at risk. The company added it continues to invest in technology that allows it and its industry partners to detect and remove sexually abusive imagery.
In a company blog post from 2017, YouTube outlined the ways it was “toughening” its approach to protecting families on its platform. One element was blocking inappropriate comments on videos featuring minors. The company said it had historically used a combination of automated systems and human flaggers to surface inappropriate and predatory comments for review and removal. YouTube said at the time that it would take a more “aggressive stance” on curbing abusive posts by turning off the commenting feature when it detected such posts. It is technically easier for software to scan text, such as comments, than video for anything that would violate YouTube’s policies.
Since the latest controversy, YouTube said, it has been hiring more experts dedicated to child safety on the platform and to identifying users who seek to harm children.
YouTube has previously grappled with exploitative videos of children on its platform. In 2017, the company cracked down on accounts that posted disturbing videos aimed at young viewers, including content that showed children in predatory or compromising situations and drew massive audiences.
“YouTube, in addition to other social media platforms, should offer regular, independent, external audits of online hate and harassment,” said George Selim, the senior vice president of the Anti-Defamation League.
Watson said that some of the YouTube videos feature ads for big-name companies, including Disney.
Nestlé said, “An extremely low volume of some of our advertisements were shown on videos on YouTube where inappropriate comments were being made,” adding that it is investigating the matter with YouTube and its partners and has decided to pause its advertising on the platform globally.
Disney has also suspended its advertising on YouTube, according to Bloomberg. Disney did not immediately respond to a request for comment.
“Fortnite” maker Epic Games said it has “paused” its advertising on YouTube that runs before videos, but it’s unclear whether Epic’s ads appeared in the controversial content. “Through our advertising agency, we have reached out to Google/YouTube to determine actions they’ll take to eliminate this type of content from their service,” Epic said in a statement.