YouTube respond to creator concerns over “adpocalypse,” sort of

Update, November 29: YouTube have made a more specific statement regarding the effects their content moderation has had on creators.

Another wave of advertisers left YouTube earlier this week over concerns that their ads were being placed on videos that exploit children. In response to this and similar previous controversies, YouTube have expanded their content monitoring systems, but the expansion has left many legitimate creators suffering false flags and demonetization without clear explanation.

For their part, YouTube place the blame for the current state of affairs squarely at the feet of “bad actors” clogging up the service with abusive videos.

“Our community of creators are currently being hurt by bad actors who are spamming our systems with videos masquerading as family content,” a YouTube representative tells Polygon. “In order to protect creators and advertisers alike, we’re taking aggressive action using a combination of machine learning and people to take action on this content through age-gating, demonetization and even the removal of channels where necessary.”

YouTube say they’re working to protect revenue for legitimate creators, but they’re not really clear on how they plan to do that beyond the existing appeal process. “Our goal is ultimately to protect the revenue of creators across the platform by taking these necessary actions.”

Original Story, November 28: YouTubers have faced increasingly unclear guidelines about what content is considered “appropriate” for the platform in the wake of growing concern from advertisers. Gaming content in particular has suffered in multiple ways, with advertisers wary of both the often violent content of games and inflammatory comments made by big stars like PewDiePie.

The vast amount of content on YouTube means it’s nearly impossible to police offending videos by hand, leaving gaming content creators – many of whom work full time on their channels – with their livelihoods tied to the whims of algorithms that determine what’s advertiser-friendly. Those algorithms are rarely transparent.

But YouTube and Google find themselves incentivized to aggressively keep advertisers away from unfriendly content, especially when that content appears to be aimed at children. The Guardian reports that brands like Mars, Cadbury, Lidl, Deutsche Bank, and Adidas have all suspended advertising on YouTube amid increasing concern over predatory video content.

That predatory content largely comes in two forms. The first takes advantage of typical SEO exploits to push kids toward bizarre and disturbing content, presumably for some combination of advertising revenue and sick amusement at exposing them to violent videos of Disney characters and off-kilter nursery rhymes. The other type of video, as the BBC report, is typically produced by children themselves, showcasing outfits and such in emulation of fashion YouTubers. The comment sections on these often become the realm of pedophiles, and the issue isn’t isolated to a handful of videos.

The result of all this is that YouTube is now pushed to aggressively deploy their algorithms in marking content as advertiser-unfriendly, as understandably no brand wants to be associated with content abusive to children. Last week, YouTube offered up an affirmation of their stance against that content, and of what they plan to do to get rid of it. But while getting exploitative videos and pedophiles off the service is a noble goal, the reality is that YouTube is too massive to be policed by hand, and an aggressive policy towards bad content means legitimate creators are going to keep getting caught in the crossfire.