Videos will be paired with contextual information in Spanish and English to curb the spread of misinformation
YouTube said that it will add contextual information underneath election-related videos and in search results for the US midterms in the coming weeks, an initiative aimed at curbing the spread of falsehoods on the Google-owned video platform.
YouTube will roll out information panels in two languages, English and Spanish, when users search for information about the election, Leslie Miller, YouTube’s vice-president of government affairs and public policy, said Thursday in a blog post. The video site will also promote authoritative content from national and local news sources, and work to quickly remove videos that violate its community guidelines, including ones that mislead voters on how to vote, encourage interference in the democratic process, incite violence, or promote other types of election misinformation. YouTube said that it had already pulled a handful of videos that contained false claims of widespread fraud, errors and glitches in the US presidential election, though it did not specify how many.
“Over the years we’ve built policies, systems and teams that raise authoritative content and limit the spread of harmful misinformation,” Miller said in the blog post. “Whether it’s learning about when and where to vote, or finding information about political candidates, we take seriously our commitment to connecting viewers with trusted resources.” Starting on Election Day, the company said, people will see information about election results underneath videos and in search results related to the midterms, and will be able to track results live. But YouTube did not say whether its information panels would address false election-related rumors in real time.
Misinformation on major social media platforms has proliferated in recent years, surfacing around especially newsworthy periods like elections as candidates fight for votes, and political pundits and social media influencers fight for attention and ad dollars. YouTube has historically lagged behind other platforms in cracking down on different types of misinformation, often announcing stricter policies weeks or months after platforms like Facebook and Twitter. YouTube’s announcement on Thursday detailing an expansion of its election misinformation policies follows similar pronouncements from Facebook, Twitter and TikTok last month. Its parent company, Alphabet Inc.’s Google, published a blog post in March reiterating the video platform’s longstanding policy prohibiting false claims of widespread fraud affecting the outcomes of past US presidential elections.
When YouTube does take a strong stance on misinformation, research has shown that it makes a difference in what ends up going viral across other social media. Last October, researchers at the Center for Social Media and Politics at New York University published a study showing that after Dec. 8, 2020, the day YouTube announced it would remove videos promoting unfounded theories about election fraud and errors, the prevalence of false and misleading videos on Facebook and Twitter dropped sharply.