YouTube Will Fight Spread of Conspiracy Videos With Links to Wikipedia
It has promoted videos declaring a U.S. school shooting survivor a “crisis actor” and hosted others labeling a massacre in Las Vegas a “false flag” operation. But now Google-owned YouTube has a plan, and it involves Wikipedia.
Speaking Tuesday at the South by Southwest (SXSW) festival in Austin, Texas, YouTube CEO Susan Wojcicki revealed a new system designed to curb the spread of conspiracy theories and hoaxes by linking to “information cues” on the online encyclopedia. The update, she said, would roll out in the coming months.
“When there are videos that are focused around something that’s a conspiracy—and we’re using a list of well-known internet conspiracies from Wikipedia—then we will show a companion unit of information from Wikipedia showing that here is information about the event,” Wojcicki explained, as reported by The Verge.
The snippets of Wikipedia articles—which anyone can edit—will appear alongside videos on topics that spark debate, with Wojcicki citing the 1969 moon landing as an example. Other, unnamed websites will reportedly also be used as sources.
YouTube has faced intense criticism for promoting known conspiracy theorists on its platform, including InfoWars’ Alex Jones.
In late February, following the school shooting in Florida, it promoted to the No. 1 trending spot a video claiming survivor David Hogg, 17, was a paid “crisis actor.” This week, as reported by BuzzFeed, a search for “Austin explosions” in incognito mode surfaced a video claiming “Antifa” was responsible.
Earlier this year, Wojcicki pledged that YouTube would hire 10,000 new employees to help review content that violates its policies.
While YouTube has turned to Wikipedia for answers, the encyclopedia itself notes that it is not always the most reliable source, especially when dealing with breaking news.
“Wikipedia is an encyclopedia, not a newspaper,” it clearly states online. “Our processes and principles are designed to work well with the usually contemplative process of building an encyclopedia, not sorting out the oft-conflicting and mistaken reporting common during disasters and other breaking news events.”
In a series of Twitter posts Wednesday, Katherine Maher, executive director of the Wikimedia Foundation, the nonprofit that operates Wikipedia, voiced similar concerns about reliability. “We don’t want you to blindly trust us,” Maher wrote. “Sure, we’re mostly accurate—but not always! We want you to read Wikipedia with a critical eye. Check citations! Edit and correct inaccurate information! You can’t do that in a simple search result.”
Academic research into how YouTube aids the spread of conspiracies, hoaxes and disinformation shows the problem is real and difficult to combat.
Jonathan Albright, director of research at Columbia University’s Tow Center for Digital Journalism, detailed extensively in a February blog post how the website’s algorithms, which surface related content, are only one part of a complex problem. He said that when it comes to fake news, “all roads seem to eventually lead to YouTube.”
"Every time there’s a mass shooting or terror event, due to the subsequent backlash, this YouTube conspiracy genre grows in size and economic value,” he wrote via blogging platform Medium. "The search and recommendation algorithms will naturally ensure these videos are connected and thus have more reach.
“In other words, due to the increasing depth of the content offerings and ongoing optimization of YouTube’s algorithms, it’s getting harder to counter these types of campaigns with real, factual information.
"The mass shootings in particular are especially troubling. The experiences of the least fortunate among us—including tragedy survivors, children, and their families— are being used to algorithmically profit from the most impressionable."