YouTube pushed Trump supporters toward voter fraud videos, study finds

YouTube’s recommendation system may have boosted interest in false claims about voter fraud for several weeks between the 2020 presidential election and the Jan. 6 riot at the U.S. Capitol, according to a study published Thursday based on data from that crucial period.

Researchers analyzed YouTube usage data from 361 people who signed up before the election to participate in the study. They found that participants most skeptical about the legitimacy of the election were disproportionately likely to get fraud-related video recommendations from YouTube compared with participants who weren’t skeptical.

The study adds to a debate within tech companies and among academics about online echo chambers and “filter bubbles,” in which people take in only information they already agree with, whether because that’s what they seek out, because that’s what tech platforms serve them, or both.

The researchers found that the participants most skeptical of the election results got three times as many video recommendations about election fraud as the least skeptical participants, though in absolute terms the number of videos was small: 12 videos versus four among about 410 recommendations.
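To put that gap in context, the relative and absolute rates can be worked out directly from the figures reported above (12 versus four fraud-related videos out of roughly 410 recommendations). The short sketch below is purely illustrative arithmetic based on those numbers, not code or data from the study itself.

```python
# Illustrative arithmetic only, using the figures reported in this article:
# roughly 410 recommendations, with 12 fraud-related videos recommended to the
# most skeptical participants vs. 4 to the least skeptical participants.
total_recommendations = 410
fraud_recs_most_skeptical = 12
fraud_recs_least_skeptical = 4

# Relative difference between the two groups (about 3x).
ratio = fraud_recs_most_skeptical / fraud_recs_least_skeptical

# Absolute shares of all recommendations (about 2.9% vs. about 1.0%).
rate_most = fraud_recs_most_skeptical / total_recommendations
rate_least = fraud_recs_least_skeptical / total_recommendations

print(f"Relative difference: {ratio:.1f}x")
print(f"Most skeptical group: {rate_most:.1%} of recommendations")
print(f"Least skeptical group: {rate_least:.1%} of recommendations")
```

The point the arithmetic makes is the same one the authors make: the relative disparity is large even though fraud-related videos were a small fraction of what either group was recommended.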

James Bisbee, the study’s lead author and until recently a researcher at New York University, said YouTube’s recommendations helped to create a path for people already open to conspiracy theories spread by then-President Donald Trump.

“The first step in that pathway is open,” he said. “The people most likely to believe these narratives were being suggested more content about them.”

The study, co-written with other researchers from New York University, was published in the Journal of Online Trust and Safety. Since completing the work, Bisbee has joined Vanderbilt University as an assistant professor of political science and data science.

Scrutiny of YouTube’s recommendation system, which suggests videos users should watch next, gained momentum around 2018, most notably when a former employee came forward with allegations that the system’s goal of keeping people on the platform had pushed them toward misinformation, conspiracy theories and other questionable content. The company said in 2019 that it was overhauling its recommendation system.

YouTube received criticism in November 2020 for declining to take down videos claiming without evidence that Joe Biden stole the election from Trump. It said then that it wanted to give users room for “discussion of election results,” staking out a position that was less aggressive than that of Facebook and Twitter.

Now, though, YouTube’s written policies prohibit such videos.

“YouTube doesn’t allow or recommend videos that advance false claims that widespread fraud, errors, or glitches occurred in the 2020 U.S. presidential election,” the company said in a statement in response to the latest study.

It disputed the study’s findings, saying that some of the video recommendations may have come not from its algorithm but from users’ decisions to subscribe to certain channels. Other research has found that its users are rarely recommended videos from extremist channels they don’t subscribe to.

“While we welcome more research, this report doesn’t accurately represent how our systems work,” the company said. “We’ve found that the most viewed and recommended videos and channels related to elections are from authoritative sources, like news channels.”

YouTube is facing growing scrutiny over its role in the 2020 election and in politics more broadly. A study published last week and funded by its parent company, Google, found that “pre-bunking” videos may help people spot weak arguments online. And according to a new book by Bloomberg News journalist Mark Bergen, the platform has been slow to take seriously the threat of violent white nationalists.

Homa Hosseinmardi, a senior research scientist at the University of Pennsylvania’s Computational Social Science Lab, said she wasn’t convinced that the latest study proved there was systematic bias in YouTube’s recommendation engine. She said part of the difficulty was taking into account users’ browsing history, which was available for only 153 of the participants.

She said that regardless of the recommendation system’s secret recipe, there remain ways to counter misinformation, such as disqualifying some videos from being recommended at all or taking them down, as the company has done.

“Or the question is: ‘What should YouTube host?’” she said in an email. That, she said, is “more of a content moderation problem.”

The latest study includes several qualifications. It does not explore whether YouTube videos about voter fraud changed people’s beliefs or altered their behavior, and it doesn’t suggest that the platform was a leading cause of radicalization among those in the Jan. 6 mob or among their sympathizers. The echo chambers on cable TV are even more extreme than those on social media, the authors write, citing prior research by others.

But the study is notable for how it measures the effect of YouTube’s recommendation engine, a tricky factor to isolate from users’ own preferences.

The finding may be an underestimate, researchers wrote, because their sample was disproportionately young and liberal, with Trump supporters underrepresented. YouTube also removed about 8,000 channels for election misinformation while the study was ongoing.

YouTube generally does not provide much data to outside researchers, so independent data about its platform is still rare. And the study covered an especially important time for American democracy, from Oct. 29 to Dec. 8, 2020, around the time the company began a wave of misinformation takedowns.

“We can’t go back in time and we can’t do this again, and furthermore we can’t even look back at what was more generally on the platform during this period,” Bisbee said.

“It was the right place at the right time to be collecting this data,” he said.
