The Social Costs of Not Sharing Fake News

The naturalist Charles Darwin once wrote, “With those animals [that benefited] by living in close association, the individuals [that] took the greatest pleasure in society would best escape various dangers, [while] those that cared least for their comrades, and lived solitary, would perish in greater numbers.”

Although humans today may not rely on others for their physical survival as much as the animals Darwin described did, group membership remains essential to one’s well-being. Individuals derive their sense of self-worth not only from their own characteristics but also from the groups – be they family, friends, colleagues or online communities – with which they identify. But being part of a group comes with various rules, regulations and expectations. Members are expected to adhere to them, or risk being cast out from the herd.

Given the leading role of online groups in the dissemination of fake news, my co-authors* and I wanted to investigate how group-level factors can motivate individuals to spread misinformation online. Does the desire to belong come at the cost of accuracy, with users sharing content even when they deem it to lack validity? Our study aimed to uncover the social motivations behind this phenomenon and provide insights on how to effectively combat fake news.

Peer pressure

Group membership affords many psychological and social benefits – from access to resources to a sense of collective agency – while satisfying an individual’s fundamental need to belong. The belief that one’s community can both advance its collective goals and provide strength and affirmation to individual members is central to why group membership is so vital to people.

This arrangement, however, comes with strings attached. The perks that group members enjoy are often linked to explicit or implicit rules they are expected to follow. Failure to conform may lead to social costs such as reduced interaction or exclusion, which can negatively affect the excluded member’s psychological well-being. Meanwhile, members who adhere to group norms are further integrated and may become key figures. This creates a strong motivation to conform, which can lead to the self-enforcement of group norms over time.

We hypothesised that similar dynamics underpin the spread of misinformation on social media, whereby group members interact more with others in the group who share the same fake news as they do. Conversely, focal group members who share fake news will reduce their social interactions with deviant members who do not share the same fake news, driving them to the group’s margins.

Analysing why people share fake news

We examined fake news shared online in two digital field studies and five experiments. In the first field study, we tested whether individuals who did not share fake news stories experienced reduced social interaction with their online connections. Our data set consisted of 51,537 dyads of socially interacting Twitter users. In the second field study, we tested whether the link between sharing content in common and social interaction was stronger in the fake news ecosystem than in a random sample of Twitter users.
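To make the dyad-level design concrete, here is a minimal sketch of how such an analysis could be set up, assuming a table of user pairs with an indicator for whether both users shared the same fake news story and a flag for whether they kept interacting later. The column names, synthetic data and logistic-regression setup are illustrative assumptions, not the study’s actual pipeline.

```python
# Illustrative sketch only: not the authors' actual analysis code.
# Models whether a dyad of Twitter users keeps interacting as a
# function of whether both users shared the same fake news story.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_dyads = 5_000  # hypothetical; the field study used 51,537 dyads

# Synthetic dyad-level data. 'shared_same_story' = 1 if both users in
# the pair shared the same fake news item; 'interacted_later' = 1 if
# they continued interacting in a later observation window.
dyads = pd.DataFrame({"shared_same_story": rng.integers(0, 2, n_dyads)})
p_interact = 0.30 + 0.20 * dyads["shared_same_story"]
dyads["interacted_later"] = (rng.random(n_dyads) < p_interact).astype(int)

# A positive coefficient on 'shared_same_story' would mean that pairs
# who shared the same falsehood were more likely to keep interacting.
model = smf.logit("interacted_later ~ shared_same_story", data=dyads).fit()
print(model.summary())
```

In the actual studies, the outcome would of course be measured from observed Twitter interactions over time rather than simulated.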

Subsequent experiments probed how group-level factors influence whether people spread misinformation online. For instance, we tested whether an individual’s failure to share fake news led to a reduced desire by social connections to interact with that individual. In this experiment, we asked participants to choose six false stories they would consider sharing, and then evaluate hypothetical friends who did or did not share their posts. This mimics social media users’ experiences, where they see a range of different posts, decide whether to share them and choose whether to engage socially with their connections.

The role of conformity

Collectively, our studies reveal that group members who do not conform to the behaviour of other members by sharing fake news are often subjected to social penalties. Specifically, our first study showed that the likelihood of social interaction between users who shared a fake news story and others who did not decreased over time.

Our second study built on this, showing that the relationship between sharing content in common and social interaction was significantly stronger in the fake news ecosystem than in a more representative sample of Twitter users. We also found that the social costs of not sharing fake news were higher than those of not sharing hyper-partisan news, which suggests there may be more pressure to share fake news, perhaps to maintain a group’s epistemological view. The causal direction of these observations was supported by further experiments, in which participants indicated a reduced desire to interact socially with connections who failed to share the same falsehoods as they did.

On Twitter, we found that the social costs for failing to spread misinformation were stronger among political conservatives relative to liberals – yet we did not consistently observe this in our experiments. In our fifth study, we further clarified the relationship between politics and social costs: partisan identification elevated perceptions of the social costs associated with not sharing fake news, which in turn increased the likelihood of people spreading falsehoods.

We also found that the social costs of not spreading misinformation can be as high as those for sharing ideologically opposed content. These results elucidate a worrying mechanism of group membership, which encourages the spread of misinformation and decreases the diversity of perspectives available online.

What can we do about it?

The spread of misinformation is escalating into a serious problem. Fake news contributes to rising political polarisation, breeds division, encourages malicious behaviour and fosters contempt for democratic institutions and political outgroups. In recent years, misinformation has cast doubt on the integrity of democratic elections, downplayed the seriousness of Covid-19 and increased vaccine hesitancy.

There is a pressing need to understand why people share fake news in the first place. Our work underscores the role of conformity as a key psychological motivator in sharing fake news and stresses the importance of considering social affiliation needs as a predictor of spreading misinformation.

Our core contribution is identifying social costs as an important psychological driver of the spread of fake news. We also established that social costs are both a unique predictor above and beyond partisan identity and a reason why partisan identity can influence sharing decisions. Further, we shed light on how social media groups can become echo chambers that reinforce increasingly polarised views within small online clusters due to conformity pressure and the avoidance of social costs.

How can the spread of misinformation be addressed? A wealth of evidence shows that “pre-bunking” – informing people of the ways in which they may be misinformed to reduce their susceptibility to future misinformation – could be an effective intervention. Prior research has shown that drawing people’s attention to the accuracy of news may reduce the spread of falsehoods. Our study highlights another means to reduce people’s susceptibility to misinformation: designing socially affirming interventions that lower the perceived risks of social costs.

Social affirmation can be integrated into the design of social media platforms to fight misinformation. For example, each Twitter user could be assigned a score for susceptibility to social conformity pressure, based on the user’s sharing patterns and social network. Those most susceptible to conformity pressure could then be targeted with affirming interventions.
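As a thought experiment, the sketch below shows one way such a susceptibility score might be computed from a user’s sharing history and network. The features, weights and threshold are purely hypothetical assumptions and are not drawn from the study or from any platform’s API.

```python
# Hypothetical sketch of a conformity-susceptibility score.
# The features, weights and threshold are illustrative assumptions,
# not part of the study or any real platform.

from dataclasses import dataclass


@dataclass
class UserActivity:
    shares_total: int           # total items the user shared
    shares_matching_group: int  # shares that mirror what close connections shared
    reciprocal_ties: int        # mutual-follow relationships
    total_ties: int             # all follow relationships


def conformity_susceptibility(u: UserActivity) -> float:
    """Return a 0-1 score; higher means the user's sharing closely
    tracks the group and the network is more tightly knit."""
    if u.shares_total == 0 or u.total_ties == 0:
        return 0.0
    mirroring = u.shares_matching_group / u.shares_total
    closure = u.reciprocal_ties / u.total_ties
    return 0.6 * mirroring + 0.4 * closure  # illustrative weights


# Users above some threshold could be prioritised for
# socially affirming interventions.
user = UserActivity(shares_total=40, shares_matching_group=31,
                    reciprocal_ties=180, total_ties=300)
if conformity_susceptibility(user) > 0.7:
    print("candidate for affirming intervention")
```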

The recognition of the influence of social costs is but a first step. Beyond individual behaviour, social costs can have wider implications, and further research could inform both the design of social media platforms as well as their regulation.

*Shikhar Anand, Indian Institute of Technology Delhi and Hemant Kakkar, Duke University’s Fuqua School of Business.
