Facebook Did Not Do Enough To Stop Election Misinformation, Report Says


Facebook missed billions of opportunities to tamp down on disinformation ahead of the 2020 presidential election, according to a new report from Avaaz, an advocacy group that researches disinformation online.

Avaaz researchers analyzed 100 of the most popular Facebook pages that repeatedly shared false claims. According to their analysis, the posts shared by these pages were viewed more than 10 billion times between March and October. The report also criticizes Facebook’s fact-checking policies, noting that the 100 biggest false or misleading stories related to the 2020 election were viewed 162 million times in three months, even after Facebook’s fact-checkers had debunked the claims.

“Although Facebook claims to slow the spread of fake news once it has been fact-checked and labeled, this finding clearly shows that its current policies are not enough to prevent false and misleading content from going viral and accumulating millions of views,” the report says.

The report comes just days before Mark Zuckerberg is set to face questions from members of Congress about Facebook’s role in election misinformation. Zuckerberg, along with Twitter CEO Jack Dorsey and Google CEO Sundar Pichai, is scheduled to testify at a hearing of the House Energy and Commerce Committee on March 25.

A Facebook spokesperson disputed Avaaz’s findings in a statement, saying that the researchers’ methodology was “flawed.” The spokesperson also highlighted Facebook’s efforts against militarized social movements, including QAnon.

But the report says Facebook waited too long to implement many of its most significant changes, including “emergency” measures that it rolled back immediately after the election. Likewise, the company’s crackdown on QAnon and other groups that glorified violence came too late, according to Avaaz. The most problematic groups had “already grown in popularity” by the time Facebook took action against them.

“In addition, Facebook again prioritized piecemeal and whack-a-mole approaches (moderating individual pieces of content, for example) over structural changes to its recommendation algorithm and organizational priorities, thus failing to deploy the most powerful tools available to truly protect its users and democracy,” Avaaz writes. Zuckerberg has since announced that he wants to reduce the amount of political content in the News Feed and that Facebook will permanently end algorithmic recommendations for political groups.

Members of Congress are likely to raise many of the same issues highlighted in the report. The hearing, titled “Disinformation Nation: Social Media’s Role in Promoting Extremism and Misinformation,” is expected to cover how social media companies have handled misinformation about the 2020 election and false claims about coronavirus vaccines.
