
Epic pulls Fortnite ads while waiting for YouTube to deal with child exploitation

Advertisers are scattering as YouTube struggles to deal with a major problem


February 20, 2019

In the wake of controversy over how YouTube is handling child exploitation, Epic has pulled Fortnite ads from the site.

YouTube faces increasing pressure from advertisers following widespread claims that its algorithmic video recommendations are helping to bring together child predators and spread content that sexualises minors. It’s a new wave of the same sort of concern that surfaced during the ‘adpocalypse,’ and Fortnite publisher Epic has now pulled its advertisements from YouTube while it waits for the site to take action on this content.

“We have paused all pre-roll advertising,” Epic tells The Verge. “Through our advertising agency, we have reached out to Google/YouTube to determine actions they’ll take to eliminate this type of content from their service.”

YouTube has been taking action, but so far it seems to have hit ordinary content creators harder than predators. Pokémon Go and Club Penguin videos in particular were recently targeted over what YouTube called “sexual content involving minors.” The issue seems to be an acronym: CP. None of the videos targeted appear to feature any sexual content – never mind involving minors – but they all used that CP acronym.

That can be short for Club Penguin, or it can refer to Pokémon Go’s combat power, and it’s been used in other innocuous videos taken down by these ban waves. As Newsweek reports, YouTube’s algorithms seem to have taken the acronym to mean child porn.

In some cases, that’s meant total Google account takedowns, leaving those affected unable to access their Gmail accounts to figure out what was happening. Those incorrectly banned were all reinstated within 24 hours, but many remain concerned.

Affected YouTuber Vailskibum94, for example, tweets that “the fact that an entire channel can be deleted over a single Club Penguin video is absolutely insane, and this platform desperately needs changes to avoid this from happening again.”

This is all happening as YouTuber MattsWhatItIs has uploaded a video titled ‘Youtube is Facilitating the Sexual Exploitation of Children, and it’s Being Monetized.’ In that video, which now has over 175,000 upvotes on Reddit, Matt claims that YouTube’s algorithms are facilitating a “soft-core pedophilia ring” on the platform.

In short, the allegation is that YouTube’s recommendation engine will point users through a wormhole filled with videos of minors in compromising positions. While many of those videos are innocuous in and of themselves, their connection with other videos of the same type is allowing pedophiles to contact one another and share these videos, as well as more explicit content.

“Any content – including comments – that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube,” a representative tells Newsweek. “We enforce these policies aggressively, reporting it to the relevant authorities, removing it from our platform and terminating accounts. We continue to invest heavily in technology, teams and partnerships with charities to tackle this issue.”

Still, many are unhappy with the methods YouTube has employed – both for sweeping up false positives in its immediate crackdown and for failing to adequately address the offending content.