People share misinformation because of social media incentives, but that can be changed.
"After some adjustments to the reward structure of social networking platforms, users begin to share accurate, fact-based information." (Although the tweaks did involve paying people to do so).
By IAN ANDERSON, GIZEM CEYLAN and WENDY WOOD
Are social networks designed to reward people for acting badly?
The answer is clearly yes, given that the reward structure on social media platforms is based on popularity, as indicated by the number of responses (likes and comments) a post receives from other users. Black-box algorithms further amplify the spread of posts that have already attracted attention.
Sharing widely read content is not, by itself, a problem. It becomes a problem when platform design prioritizes controversial and attention-grabbing content. Because social networking sites are built this way, users form habits of automatically sharing the most engaging information, regardless of its accuracy or potential for harm. Offensive statements, attacks on out-groups and fake news are amplified, and misinformation often spreads farther and faster than the truth.
We are two social psychologists and a marketing scholar. Our research, presented at the 2023 Nobel Prize Summit, shows that social networks can in fact be designed to build users' habits of sharing high-quality content. After some adjustments to the reward structure of social networking platforms, users begin to share accurate, fact-based information.
The problem of habit-driven misinformation sharing is significant. Facebook's own research shows that the ability to reshare already-shared content with a single click drives misinformation: 38 percent of views of text misinformation and 65 percent of views of photo misinformation come from content that has been reshared at least twice, meaning a share of a share of an original post, or an even longer chain. The biggest sources of misinformation, such as Steve Bannon's War Room, exploit platforms' optimization for popularity to push controversy and misinformation far beyond their immediate audiences.
Reward redirection
To investigate the effect of a new reward structure, we gave some users small financial rewards for sharing accurate content and for not sharing misinformation. These rewards stood in for the positive social feedback, such as likes, that users typically receive when they share content on platforms. In essence, we created a new reward structure based on accuracy rather than attention.