YouTube’s recommendations pushed election denial content to users already skeptical of the 2020 election

YouTube’s recommendation algorithm pushed more videos about election fraud to people who were already skeptical about the legitimacy of the 2020 election, according to a new study. Videos about election fraud made up only a small share of what was recommended overall, but the most skeptical YouTube users watched three times as many of them as the least skeptical users.

“The more exposed you are to these types of stories about the election … the more likely you are to be recommended content about that story,” said study author James Bisbee, now a political scientist at Vanderbilt University.

In the wake of his 2020 election loss, former President Donald Trump promoted the false claim that the election was stolen, calling for the election to be rerun as recently as this week. Although claims of voter fraud have been widely debunked, promoting them, whether in podcasts, movies, or online videos, continues to be a profitable tactic for media figures.

Bisbee and his research team were already running a study to measure how often harmful content is recommended to users, and their data collection window happened to overlap with the US presidential election and the misinformation about its outcome that spread afterward, he said. So they took advantage of the timing to look specifically at the way the algorithm recommended content related to election fraud.

The research team surveyed 300 people with questions about the 2020 election, asking them, for example, how worried they were about fraudulent ballots and foreign government interference. People were surveyed between October 29 and December 8, and people surveyed after Election Day were also asked whether the election result was legitimate. The research team also tracked participants’ experiences on YouTube. Each person was assigned a video to start from and a rule to follow as they moved through the site, for example, always clicking on the second recommended video.
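
To make that traversal rule concrete, here is a minimal Python sketch of what such a “recommendation trail” procedure could look like. The names walk_recommendation_trail, get_recommendations, and rank_to_click are illustrative assumptions rather than the study’s actual instrumentation; get_recommendations stands in for however a research client would fetch the videos recommended alongside a given video.

```python
from typing import Callable, List


def walk_recommendation_trail(
    seed_video_id: str,
    get_recommendations: Callable[[str], List[str]],
    rank_to_click: int = 2,  # the fixed rule: always "click" the Nth recommendation
    trail_length: int = 20,  # how many clicks to simulate from the seed video
) -> List[str]:
    """Follow a fixed rule through recommended videos and record each step.

    This mirrors the kind of rule described above (start from an assigned
    video, then always pick the same-ranked recommendation), not the study's
    exact code or data collection setup.
    """
    trail = [seed_video_id]
    current = seed_video_id
    for _ in range(trail_length):
        recommendations = get_recommendations(current)
        if len(recommendations) < rank_to_click:
            break  # not enough recommendations to keep following the rule
        current = recommendations[rank_to_click - 1]
        trail.append(current)
    return trail
```

The recorded trail is what would later be labeled: whether each video touched on election fraud at all, and if so, what stance it took.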

The team examined all the videos shown to the participants and identified those related to election fraud. They also categorized the stance those videos took: whether they were neutral on claims of election fraud or endorsed election misinformation. The videos most associated with promoting claims of election fraud were press briefings from the White House’s channel and videos from the Fox News affiliate NewsNow.

The analysis found that the people most skeptical of the election were recommended an average of eight more election fraud videos than the people least skeptical of it. Skeptics watched an average of 12 such videos, while non-skeptics watched an average of four. The types of videos also varied: the videos viewed by skeptics were more likely to endorse claims of election fraud.

People who participated in the study were more liberal, better educated, and more likely to identify as Democrats than the United States population as a whole. Their media diets and digital information environments may therefore already skew to the left, meaning skeptics in this group may have been shown fewer election fraud videos than skeptics in a more conservative group would be, Bisbee said.

But the overall number of election fraud videos in the study was small: people watched a total of 400 videos, so even 12 videos was a small share of their entire YouTube diet, about 3 percent. People aren’t inundated with misinformation, Bisbee said. The number of election fraud videos on YouTube decreased further in early December, after the platform announced it would remove videos alleging voter fraud in the 2020 election.

YouTube has launched several features to fight misinformation, including removing videos that violate its rules and promoting official sources on the homepage. Notably, YouTube spokesperson Elena Hernandez reiterated in an email to The Verge that the platform’s policies do not allow videos that falsely claim fraud in the 2020 election. However, YouTube has more permissive misinformation policies than other platforms and took longer to implement them, according to the Misinformation and the 2020 Elections report.

Broadly, YouTube disputes the idea that its algorithm is systematically spreading misinformation. “While we welcome further research, this report does not definitively indicate how our systems are performing,” Hernandez said in a statement. “We found that the most viewed and recommended videos and channels related to the election were from official sources such as news channels.”

Essentially, Bisbee doesn’t see YouTube’s algorithm as good or bad; it simply recommends content to people who are likely to respond to it. “If I’m a country music fan and I want to find new country music, an algorithm that suggests content that interests me is a good thing,” he said. But when the content is extremist disinformation instead of country music, the same system can create obvious problems.

In an email to The Verge, Hernandez pointed to other research that found YouTube doesn’t steer people toward extremist content, including a 2020 study that found recommendations don’t increase engagement with far-right content. But the new study’s results cut against some previous findings, Bisbee said, particularly the consensus among researchers that people self-select into disinformation bubbles rather than being driven to them by algorithms.

Specifically, Bisbee’s team saw a small but significant push from the algorithm toward misinformation for the people already most inclined to believe it. That push may be specific to election fraud; the study doesn’t say whether the same is true for other types of misinformation. Either way, it suggests there is much more to learn about the role that algorithms play.
