Last month, YouTube star Shane Dawson uploaded his new project: a 104-minute documentary called Conspiracy Theories With Shane Dawson. In the video, set to a spooky instrumental soundtrack, Dawson unspooled a series of far-fetched hypotheses. Among them: that iPhones secretly record their owners’ every utterance; that popular children’s TV shows contain subliminal messages urging children to commit suicide; that the recent string of deadly wildfires in California was set on purpose, either by homeowners looking to collect insurance money or by the military, using a high-powered laser called a “directed energy weapon.”
None of this was fact-based, of course, and some of the theories seemed more like jokey urban legends than serious accusations. Still, his fans ate it up. The video has racked up more than 30 million views, a hit even by Dawson’s standards. A follow-up has drawn more than 20 million views and set off a public feud with Chuck E Cheese’s, the restaurant chain, which was forced to deny claims that it recycles customers’ uneaten pizza slices into new pizzas.
Dawson’s conspiracy series arrived at a particularly awkward moment for YouTube, which has been reckoning with the vast troves of misinformation and extreme content on its platform.
In late January, the company announced that it was changing its recommendations algorithm to reduce the spread of “borderline content and content that could misinform users in harmful ways.” It cited, as examples, “videos promoting a phony miracle cure for a serious illness, claiming the earth is flat or making blatantly false claims about historic events like 9/11.”
Dawson, whose real name is Shane Lee Yaw, has more than 20 million subscribers and a devoted teenage fan base. He has built his lucrative career by, among other talents, understanding what kinds of content play well on YouTube. For years, that meant conspiracy theories — lots and lots of them, all delivered with the same wide-eyed credulity. In a 2016 video, he wondered aloud if the first Apollo moon landing was staged by Nasa. (“It’s a theory,” he said, “but, I mean, all the evidence is not looking good.”) In 2017, he discussed the false theory that the attacks of September 11, 2001, were a hoax. (“I know it’s crazy,” he said, “but just look at some of these videos.”) And last year, he devoted a segment of a video to flat-earth theory, which he concluded “kind of makes sense.”
In fairness, Dawson is a far cry from partisan cranks like Alex Jones, the Infowars founder, who was barred by YouTube and other social networks last year for hate speech. Most of Dawson’s videos have nothing to do with conspiracies, and many are harmless entertainment. But the popularity of Dawson’s conspiracy theories illuminates the challenge YouTube faces in cleaning up misinformation. On Facebook, Twitter and other social platforms, the biggest influencers largely got famous somewhere else (politics, TV, sports) and have other vectors of accountability. But YouTube’s stars are primarily homegrown, and many feel — not entirely unreasonably — that after years of encouraging them to build their audiences with viral stunts and baseless rumor-mongering, the platform is now changing the rules on them.
Innocent or not, Dawson’s videos contain precisely the type of viral misinformation that YouTube now says it wants to limit. And its effort raises an uncomfortable question: What if stemming the tide of misinformation on YouTube means punishing some of the platform’s biggest stars?