YouTube’s moves against election misinformation were potent

Video-streaming giant’s stricter policies were followed by fewer false and misleading videos on other social networks

YouTube’s stricter policies against election misinformation were followed by sharp drops in the prevalence of false and misleading videos on Facebook and Twitter, according to new research released Thursday, underscoring the video service’s power across social media.

Researchers at the Center for Social Media and Politics at New York University found a significant rise in election fraud YouTube videos shared on Twitter immediately after the Nov. 3 election. In November, those videos consistently accounted for about one-third of all election-related video shares on Twitter. The top YouTube channels about election fraud that were shared on Twitter that month came from sources that had promoted election misinformation in the past, such as Project Veritas, Right Side Broadcasting Network and One America News Network.

But the proportion of election fraud claims shared on Twitter dropped sharply after Dec. 8. That was the day that San Bruno-based YouTube said it would remove videos that promoted the unfounded theory that widespread errors and fraud changed the outcome of the presidential election. By Dec. 21, the proportion of election fraud content from YouTube that was shared on Twitter had dropped below 20% for the first time since the election.

The proportion fell further after Jan. 7, when YouTube announced that any channels that violated its election misinformation policy would receive a “strike,” and that channels that received three strikes in a 90-day period would be permanently removed. By Inauguration Day, the proportion was around 5%.

The trend was replicated on Facebook. A postelection surge in sharing videos containing fraud theories peaked at about 18% of all videos on Facebook just before Dec. 8. After YouTube introduced its stricter policies, the proportion fell sharply for much of the month, before rising slightly before the Jan. 6 riot at the Capitol. The proportion dropped again, to 4% by Inauguration Day, after the new policies were put in place Jan. 7.

To reach their findings, researchers collected a random sample of 10% of all tweets each day. They then isolated tweets that linked to YouTube videos. They did the same for YouTube links on Facebook, using a Facebook-owned social media analytics tool, CrowdTangle.

From this large data set, the researchers filtered for YouTube videos about the election broadly, as well as about election fraud using a set of keywords like “Stop the Steal” and “Sharpiegate.” This allowed the researchers to get a sense of the volume of YouTube videos about election fraud over time, and how that volume shifted in late 2020 and early 2021.
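For illustration only, a simplified sketch of that kind of filtering might look like the Python below. The article names only two keywords, “Stop the Steal” and “Sharpiegate,” and does not describe the researchers’ full keyword list, code or tooling, so the helper names and sample records here are hypothetical.

```python
import re

# Two of the fraud-related keywords named in the article; the researchers'
# full keyword list is not given, so this list is illustrative only.
FRAUD_KEYWORDS = ["stop the steal", "sharpiegate"]

# Matches youtube.com/watch?v=... and youtu.be/... links in tweet text.
YOUTUBE_URL = re.compile(
    r"(?:https?://)?(?:www\.)?(?:youtube\.com/watch\?v=|youtu\.be/)([\w-]{11})"
)

def extract_youtube_ids(tweet_text: str) -> list[str]:
    """Return the YouTube video IDs linked in a tweet, if any."""
    return YOUTUBE_URL.findall(tweet_text)

def mentions_fraud_keywords(video_title: str) -> bool:
    """Flag a video whose title contains any tracked keyword."""
    title = video_title.lower()
    return any(keyword in title for keyword in FRAUD_KEYWORDS)

# Hypothetical records: (tweet text, title of the linked YouTube video).
sample = [
    ("Watch this https://youtu.be/dQw4w9WgXcQ", "Stop the Steal rally live"),
    ("Recap https://youtube.com/watch?v=abcdefghijk", "County results explained"),
]

fraud_shares = sum(
    1 for text, title in sample
    if extract_youtube_ids(text) and mentions_fraud_keywords(title)
)
print(f"{fraud_shares} of {len(sample)} sampled shares link to keyword-matched videos")
```

Tracking the share of such keyword-matched videos among all election-related video shares, day by day, is what let the researchers see the drops after Dec. 8 and Jan. 7.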

Misinformation on major social networks has proliferated in recent years. YouTube in particular has lagged behind other platforms in cracking down on different types of misinformation, often announcing stricter policies several weeks or months after Facebook and Twitter. In recent weeks, however, YouTube has toughened its policies, such as banning all anti-vaccine misinformation and suspending the accounts of prominent anti-vaccine activists, including Joseph Mercola and Robert F. Kennedy Jr.

Megan Brown, a research scientist at the NYU Center for Social Media and Politics, said it was possible that after YouTube banned the content, people could no longer share the videos that promoted election fraud. It is also possible that interest in the election fraud theories dropped considerably after states certified their election results.

But the bottom line, Brown said, is that “we know these platforms are deeply interconnected.” YouTube, she pointed out, has been identified as one of the most-shared domains across other platforms, including in both of Facebook’s recently released content reports and NYU’s own research.

“It’s a huge part of the information ecosystem,” Brown said, “so when YouTube’s platform becomes healthier, others do as well.”

This article originally appeared in The New York Times.
