YouTube showing election-fraud videos to users skeptical about 2020 US polls

New York, Sep 2

YouTube shows more election-fraud videos to users who are already skeptical about the legitimacy of the 2020 US presidential election, a study has found, illustrating how its recommendation algorithm can perpetuate existing misperceptions. The study, published in the Journal of Online Trust and Safety, found that the participants most skeptical of the election's legitimacy were shown three times as many election-fraud-related videos as the least skeptical participants -- roughly eight additional recommendations out of approximately 400 videos suggested to each study participant.

The findings expose the consequences of a recommendation system that provides users with the content they want.

"For those most concerned about possible election fraud, showing them related content provided a mechanism by which misinformation, disinformation, and conspiracies can find their way to those most likely to believe them," said the authors of the study.

Importantly, these patterns reflect the independent influence of the algorithm on what real users are shown while using the platform.

"Our findings uncover the detrimental consequences of recommendation algorithms and cast doubt on the view that online information environments are solely determined by user choice," said James Bisbee, who led the study at New York University's Center for Social Media and Politics (CSMaP).

Nearly two years after the 2020 presidential election, large numbers of Americans, particularly Republicans, don't believe in the legitimacy of the outcome.

"Roughly 70 per cent of Republicans don't see Biden as the legitimate winner," despite "multiple recounts and audits that confirmed Joe Biden's win," the Poynter Institute said earlier this year.

While it's well-known that social media platforms, such as YouTube, direct content to users based on their search preferences, the consequences of this dynamic may not be fully realised.

"Many believe that automated recommendation algorithms have little influence on online aecho chambers' in which users only see content that reaffirms their preexisting views," said Bisbee, now an assistant professor at Vanderbilt University.

"This highlights the need for further investigation into how opaque recommendation algorithms operate on an issue-by-issue basis," said Bisbee.

--IANS
