Facebook is subverting democracy again — and it’s doing it on purpose
How democracy can be subverted online
According to The New York Times, earlier this year US intelligence officials warned that Russia was interfering in the 2020 presidential campaign, with the goal of seeing President Donald Trump re-elected.
This was corroborated by findings from the US Brennan Center for Justice. A research team led by journalism and communications professor Young Mie Kim identified a range of Facebook troll accounts deliberately sowing division “by targeting both the left and right, with posts to foment outrage, fear, and hostility.”
Most were linked to Russia’s Internet Research Agency (IRA), the company also behind a 2016 US election influence campaign. Kim wrote that the troll accounts seemed to discourage certain people from voting, with a focus on swing states.
This month, Facebook announced a ban (across both Facebook and Instagram, which Facebook owns) on groups and pages devoted to the far-right conspiracy group QAnon. It also removed a network of fake accounts linked to a conservative US political youth group, for violating rules against “coordinated inauthentic behavior.”
However, despite Facebook’s repeated promises to clamp down harder on such behavior, and occasional efforts to actually do so, the company has been widely criticized for doing far too little to curb the spread of disinformation, misinformation, and election meddling.
According to a University of Oxford study, 70 countries (including Australia) practiced either foreign or domestic election meddling in 2019. This was up from 48 in 2018 and 28 in 2017. The study said Facebook was “the platform of choice” for this.
The Conversation approached Facebook for comment regarding the platform’s use by political actors to influence elections, including past US elections. A Facebook spokesperson said:
When Facebook favored one side
Facebook has drawn widespread criticism for its failure to remove posts that clearly violate its policies on hate speech, including posts by Trump himself.
The company openly exempts politicians from its fact-checking program and knowingly hosts misleading content from politicians, under its “newsworthiness exception.”
When Facebook tried to clamp down on misinformation in the aftermath of the 2016 presidential election, ex-Republican staffer turned Facebook executive Joel Kaplan argued doing so would disproportionately target conservatives, The Washington Post reported.
The Conversation asked Facebook whether Kaplan’s past political affiliations indicated a potential for conservative bias in his current role. The question wasn’t answered.
Facebook’s board also now features a major Trump donor and vocal supporter, Peter Thiel. Facebook’s chief executive Mark Zuckerberg has himself been accused of getting “too close” to Trump.
Moreover, when the US Federal Trade Commission investigated Facebook’s role in the Cambridge Analytica scandal, it was Republican votes that saved the company from facing antitrust litigation.
Overall, Facebook’s model has shifted towards increasing polarization, as incendiary and misinformation-laden posts tend to generate clicks.
As Zuckerberg himself notes, “when left unchecked, people on the platform engage disproportionately” with such content.
Over the years, conservatives have accused Facebook of anti-conservative bias, for which the company has faced financial penalties from the Republican Party. This is despite research indicating no such bias exists on the platform.
Fanning the flames
Facebook’s addictive news feed rewards us for simply skimming headlines, conditioning us to react viscerally.
Its sharing features have been found to promote falsehoods. They can trick users into attributing news to their friends, causing them to assign trust to unreliable news sources. This provides a breeding ground for conspiracies.
Studies have also shown social media to be an ideal environment for campaigns aimed at creating mistrust, which explains the increasing erosion of trust in science and expertise.
Worst of all are Facebook’s “echo chambers,” which convince people that only their own opinions are mainstream. This encourages hostile “us versus them” dialogue, which leads to polarization. This pattern suppresses valuable democratic debate and has been described as an existential threat to democracy itself.
Meanwhile, Facebook’s staff hasn’t been shy about skewing liberal, even suggesting in 2016 that Facebook work to prevent Trump’s election. Around 2017, they proposed a feature called “Common Ground,” which would have encouraged users with different political beliefs to interact in less hostile ways.
Kaplan opposed the proposal, according to The Wall Street Journal, due to fears it could trigger claims of bias against conservatives. The project was eventually shelved in 2018.
Facebook’s track record isn’t good news for those who want to live in a healthy democratic state. Polarization certainly doesn’t lead to effective political discourse.
While several blog posts from the company outline measures being taken to supposedly protect the integrity of the 2020 US presidential elections, it remains to be seen what this means in reality.
This article is republished from The Conversation by Michael Brand, Adjunct A/Prof of Data Science and Artificial Intelligence, Monash University, under a Creative Commons license. Read the original article.
Story by The Conversation, an independent news and commentary website produced by academics and journalists.