Negligence, not politics, is at the root of sharing false information


You do not need a study to know that disinformation is rampant on social media; a quick search on "vaccines" or "climate change" will confirm it. A more compelling question is why. It is clear that, at a minimum, organized disinformation campaigns, frenzied and dubious political partisans, and algorithms all contribute. But beyond that, there are still plenty of people who choose to share material that even a cursory examination would reveal to be garbage. What motivates them?

This was the question that motivated a small international team of researchers, who decided to look into how a group of US residents decided which news to share. Their results suggest that some of the standard factors people point to when explaining the tsunami of disinformation — an inability to evaluate information and partisan bias — don't have as much influence as most of us think. Instead, much of the blame falls on people simply not paying attention.

The researchers conducted a number of fairly similar experiments to tease out the details of how disinformation gets shared. These involved panels of US-based participants recruited either through Mechanical Turk or through a survey population that provided a more representative sample of the United States. Each panel included several hundred to over 1,000 individuals, and the results were consistent across the different experiments, so the data showed some degree of reproducibility.

To run the experiments, the researchers put together a set of headlines and lead sentences from news reports that had been shared on social media. The set was evenly split between headlines that were clearly true and clearly false, and each of those categories was in turn split between headlines that favored Democrats and those that favored Republicans.

One thing that was clear is that people are generally able to judge the accuracy of headlines. There was a 56 percentage point gap between how often a true headline was rated as accurate and how often a fake headline was. People are not perfect — they still get it wrong fairly often — but they are clearly a bit better at this than they are usually given credit for.

The second finding is that ideology doesn't really seem to be a major factor in judgments about a headline's accuracy. People were more likely to rate headlines that agreed with their politics as accurate, but the difference here was only 10 percentage points. That is significant (both socially and statistically), but it is certainly not a big enough gap to explain the flood of disinformation.

But when the same people were asked whether they would share these same stories, politics played a big role and the truth receded. The gap in intention to share between real and fake headlines was only 6 percentage points. Meanwhile, whether or not a headline agreed with a person's politics produced a 20 percentage point gap. To make this concrete, the authors point to the fake headline "Over 500 'migrant caravanners' arrested with suicide vests." Only 16 percent of the conservatives in the survey population rated it as true. But more than half of them said they were likely to share it on social media.

Overall, participants were twice as likely to consider sharing a fake headline aligned with their politics as they were to rate it as accurate. Yet, remarkably, when the same population was asked whether it is important to share only accurate content on social media, the most common response was "extremely important."

So people can distinguish what is accurate, and they say accuracy matters when deciding what to share. But when it comes time to actually make that choice, accuracy doesn't seem to matter much. Or, as the researchers put it, something about the context of social media shifts people's attention away from concern for the truth and toward the desire to get likes and signal their ideological affiliation.

To find out whether this could be the case, the researchers tweaked the experiment slightly to remind people of the importance of accuracy. In the modified survey, they began by asking people to rate the accuracy of a single non-partisan news headline, which should make participants more aware of the need for, and the process of, making these kinds of judgments. Those who received this prompt were less likely to say they would share false headlines, especially when those headlines agreed with their politics. Something similar happened when people were simply asked about the importance of accuracy before taking the survey rather than after.
