How to stop the spread of disinformation before it’s even shared

Crowdsourcing from a pool of 1,128 users, researchers found that groups as small as 10 people could judge whether an article was fake about as accurately as professional fact-checkers. Complemented by algorithms, a system like this could be trained to identify fake news at the speed and scale at which it spreads.

Additionally, open-sourcing these verification methods, so that they are transparent enough to be easily understood, could help blunt allegations of bias and censorship. A first attempt in this direction can be seen on Twitter: Birdwatch, which leverages the community to flag misleading tweets. The system is new and flawed, and there are clearly ways to game it (a problem with any verification system), but it is an important first attempt.

But who determines the truth?

Each of these three interventions requires someone, somewhere, to determine what is true or what is of high quality. This ground truth is an essential piece of the puzzle, but it is an idea that is increasingly difficult to pin down.

Controlling the narrative will always be controversial, and any system that attempts to fix disinformation will be attacked for partisan bias. Indeed, extreme partisanship is directly associated with the sharing of fake news. Social media appears to be particularly effective at drawing partisan battle lines around emerging issues, even when those issues are not inherently partisan.

But this is another manifestation of an age-old problem: How do we verify knowledge? And how can we do it quickly enough for it to be reliable? Who in society do we trust to establish the truth? Here we venture into delicate epistemological territory, but not without precedent.

Let’s take a look at other services we regularly use for fact-checking – flawed but powerful systems that we rely on. Google and Wikipedia have, by and large, earned a reputation for effectively helping people find accurate information. We generally trust them because they have verification and sourcing systems built into their design.

The frictionless design of today’s social networks has undermined a prerequisite for democratic functioning: shared truths.

Trust and faith in the basic journalistic process of verification is implicit in our three recommendations. Journalism is far from perfect. The New York Times sometimes gets it wrong, just as all media entities struggle with the selective interpretation of events and with editorial influence over the tone and content of stories. But the inherent value of verified information is critical infrastructure that social media has undermined. Social media posts are not news articles, although they have come to resemble them in our news feeds. Verifying new information is at the heart of any functioning democracy, and we must recreate the friction that the journalistic process once provided.

On the horizon are new technologies that will allow both the decentralization and the end-to-end encryption of social media, free from any moderation. As these new tools reach scale, viral rumors will become even more difficult to debunk, and the problem of the supply of misinformation and disinformation will only get worse. We need to consider how these tools could be designed to rebalance the flow of accurate information now, before we lose our ability to do so.

This responsibility falls at least partially on our shoulders as individuals. We must be vigilant in identifying inaccuracies and in seeking out established, reputable sources of knowledge, both academic and journalistic. Too much institutional skepticism is toxic to our common reality. We can redouble our efforts to seek the truth together, with care and compassion. But platforms can, and should, help orient the design of our shared spaces toward verifiable facts.

Data visualizations by Tobias Rose-Stockwell