YouTube respond to creator concerns over “adpocalypse,” sort of


Update, November 29: YouTube have made a more specific statement regarding the effects their content moderation has had on creators.

Another wave of advertisers left YouTube earlier this week over concerns that their sponsorships were being placed on videos exploiting children. In response to this and similar previous controversies, YouTube have expanded their content monitoring systems, but the result is that many legitimate creators are suffering false flags and demonetization without being told why.

For their part, YouTube place the blame for the current state of affairs squarely at the feet of “bad actors” clogging up the service with abusive videos.

“Our community of creators are currently being hurt by bad actors who are spamming our systems with videos masquerading as family content,” a YouTube representative tells Polygon. “In order to protect creators and advertisers alike, we're taking aggressive action using a combination of machine learning and people to take action on this content through age-gating, demonetization and even the removal of channels where necessary.”

YouTube say they’re working to protect revenue for legitimate creators, but they haven’t made clear how they plan to do that beyond the existing appeal process. “Our goal is ultimately to protect the revenue of creators across the platform by taking these necessary actions.”

Original Story, November 28: YouTubers have been faced with increasingly confusing guidelines about what content is considered “appropriate” for the platform in the wake of growing concern from advertisers. Gaming content in particular has suffered in multiple ways, with advertisers wary both of the often violent content of games and of incendiary comments made by big stars like PewDiePie.

The vast amount of content on YouTube means it’s nearly impossible to police offending videos by hand, leaving gaming content creators - many of whom work full time on their channels - with their livelihoods tied to the whims of the algorithms that determine what’s advertiser-friendly. Those algorithms are rarely transparent.

But YouTube and Google find themselves incentivized to aggressively keep advertisers away from unfriendly content, especially when that content appears to be aimed at children. The Guardian reports that brands including Mars, Cadbury, Lidl, Deutsche Bank, and Adidas have all suspended advertising on YouTube amid increasing concern over predatory video content.

That predatory content largely comes in two forms. The first takes advantage of typical SEO exploits to funnel kids toward bizarre and disturbing content, presumably for some combination of advertising revenue and sick fun in exposing viewers to violent videos of Disney characters and off-kilter nursery rhymes. The other type of video, as the BBC reports, is typically produced by children themselves, showcasing outfits and such in emulation of fashion YouTubers. The comment sections on these often become the realm of pedophiles, and the issue is not isolated to a handful of videos.

The result of all this is that YouTube are now pushed to aggressively deploy their algorithms in marking content as advertiser-unfriendly, as understandably no brand wants to be associated with content abusive to children. Last week, YouTube offered an affirmation of their stance against that content and an outline of what they plan to do to get rid of it. But while getting exploitative videos and pedophiles off the service is a noble goal, the reality is that YouTube is too massive to be policed by hand, and an aggressive policy towards bad content means legitimate creators are going to keep getting caught in the crossfire.

The Chimpy Man, 1 month ago:

Well, that's just what happens when you go all-in on piece-of-crap algorithms instead of using a little of that fortune to hire a couple of remotely savvy sets of working human eyes. Not like this was warned about several years ago or anything.

Aever, 1 month ago:

Well, if this goes on then no one will produce content, so no one will watch the highly "safe" ads anymore, so no one will pay for them.

Honestly, just require channels to go through a "certification" process and once certified stop removing stuff arbitrarily while your AI "learns". I mean if a channel has been online since forever, has thousands of subscribers and has always posted "safe" content ... why the fuck are you messing with it when the crap you're trying to remove is usually posted by "channels" that were created yesterday?

A lot of the YouTubers I used to watch have moved to Twitch now, so keep it up, Google, until you manage to bury the platform.

CursedNaruto, 1 month ago:

Ha haha hahahaha, I knew my thoughts on YouTube were right. The only parts of YouTube that are nice are AMVs, music videos, people uploading old shows that the rights holders don't care about so we can watch them again, and gameplay without commentary so I can judge a game on its own merits rather than others' opinions.
