Two big advertisers, AT&T and Hasbro, are pulling ads from YouTube following reports that pedophiles are accessing videos of young children and lurking in the comments sections. Earlier, Nestle and “Fortnite” maker Epic Games had pulled some advertising, and Disney has also reportedly paused its ads on Google. The advertisers say they will not place ads on YouTube until Google can protect their brands from offensive content and inform them about the steps it is taking to do so. In response, YouTube has reportedly sent a memo to advertisers outlining the changes it is making to help protect brands.
The moves follow reports that pedophiles have been accessing videos of young children, often girls, on Google’s YouTube, marking timestamps that show child nudity and objectifying the children in the comments sections, with ads appearing alongside.
AT&T had previously pulled its entire ad spend from YouTube in 2017, after reports that its ads were appearing alongside offensive content, including terrorist content. After assurances from Google, it resumed advertising in January.
Other advertisers, such as Grammarly and Peloton, whose ads were placed alongside the videos, are reportedly in conversations with YouTube to resolve the issue, according to CNBC.
YouTube declined to comment on any specific brands, but issued a statement on Wednesday: “Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments.”
YouTube also confirmed that it is suspending comments on millions of videos that “could be subject to predatory comments.” It claimed to be making it harder for “innocent content to attract bad actors” through changes to its discovery algorithms. It said it would make sure ads do not appear on videos that could attract this sort of behavior, and that it is removing accounts “that belonged to bad actors.” YouTube also confirmed that it is alerting authorities where needed.
Following this statement, YouTube terminated more than 400 channels and removed tens of millions of comments identified as coming from ‘bad actors’. The entire YouTube team has been shaken up in the wake of allegations of abusive content appearing alongside ads on its channels, according to details emerging from YouTube’s creator outreach team in response to a video from commentator Philip DeFranco published yesterday evening. The team’s statement said that “all of us at YouTube” are working on the problem, and that “we are continuing to grow our team in order to keep people safe.” YouTube has also been reporting comments and accounts to law enforcement, in compliance with federal law.
Advertisers and creators have responded over the past couple of days, after a video highlighting the issue gained widespread attention.
While YouTube is facing widespread criticism, some creators have come out in defence of the team. DeFranco said in his video that “this is something that YouTube has been consistently fighting” over the years. The company instituted community guidelines in 2017 specifically addressing predatory behavior on videos featuring children, and it has given advertisers more control over where their ads are placed. This isn’t necessarily a YouTube problem alone, DeFranco said; it’s more of an issue with the current online landscape.