By Laura Hazard Owen, for NiemanLab
Plus: “Subtly inducing people to think about the concept of accuracy decreases their sharing of false and misleading news relative to accurate news,” and the scariest deepfakes of all.
Do people mainly share misinformation because they get distracted? A new working paper suggests that “most people do not want to spread misinformation, but are distracted from accuracy by other salient motives when choosing what to share.” And when the researchers — Gordon Pennycook, Ziv Epstein, Mohsen Mosleh, Antonio Arechar, Dean Eckles, and David Rand — DM’d Twitter users who’d shared news from unreliable websites, they found that “subtly inducing people to think about the concept of accuracy decreases their sharing of false and misleading news relative to accurate news.”
“Our accuracy message successfully induced Twitter users who regularly shared misinformation to increase the quality of the news they shared,” the authors write. They suggest the effect of the intervention may not last long, but that platforms could extend it by continuing to offer such nudges.
What are conspiracy theorists like? Researchers tracked Reddit users over eight years to figure out how they ended up as active members of the r/conspiracy subreddit:

We undertook an exploratory analysis using a case-control study design, examining the language use and posting patterns of Reddit users who would go on to post in r/conspiracy (the r/conspiracy group). We analyzed where and what they posted in the period preceding their first post in r/conspiracy to understand how personal traits and social environment combine as potential risk factors for engaging with conspiracy beliefs.
Our goal was to identify distinctive traits of the r/conspiracy group, and the social pathways through which they travel to get there. We compared the r/conspiracy group to matched controls who began by posting in the same subreddits at the same time, but who never posted in the r/conspiracy subreddit. We conducted three analyses.
First, we examined whether r/conspiracy users were different from other users in terms of what they said. Our hypothesis was that users eventually posting in r/conspiracy would exhibit differences in language use compared to those who do not post in r/conspiracy, suggesting differences in traits important for individual variation.
Second, we examined whether the same set of users differed from other users in terms of where they posted. We hypothesized that engagement with certain subreddits is associated with a higher risk of eventually posting in r/conspiracy, suggesting that social environments play a role in the risk of engagement with conspiracy beliefs.
Third, we examined language differences after accounting for the social norms of where they posted. We hypothesized that some differences in language use would remain after accounting for language use differences across groups of similar subreddits, suggesting that some differences are not only a reflection of the social environment but represent intrinsic differences in those users.
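To make the study design a little more concrete, here is a minimal, hypothetical sketch of the kind of matched case-control comparison the authors describe: comparing how often two groups of users employ words from different categories. It is not the authors' code; the user records, comments, and word categories below are invented for illustration, and the lexicon stands in for whatever dictionary the researchers actually used.

```python
# Hypothetical sketch of a case-control comparison of language use.
# Data and categories are invented; this is not the study's pipeline.
from collections import Counter

# Comments written by "cases" (future r/conspiracy posters) and matched
# controls, before the cases' first r/conspiracy post.
cases = {"user_a": ["the law was broken", "stealing is a crime"]}
controls = {"user_x": ["my friends are great", "lots of optimism today"]}

# Toy word categories standing in for a real lexicon (e.g. LIWC-style).
categories = {
    "crime": {"crime", "law", "stealing"},
    "affiliation": {"friends", "optimism", "affection"},
}

def category_rates(users):
    """Fraction of all words that fall into each category for a user group."""
    counts, total = Counter(), 0
    for comments in users.values():
        for comment in comments:
            for word in comment.lower().split():
                total += 1
                for name, vocab in categories.items():
                    if word in vocab:
                        counts[name] += 1
    return {name: counts[name] / max(total, 1) for name in categories}

case_rates = category_rates(cases)
control_rates = category_rates(controls)
for name in categories:
    print(name, "cases:", case_rates[name], "controls:", control_rates[name])
```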
There were “significant differences” between the 15,370 r/conspiracy users and a 15,370-user control group. Here are some of those differences:
- Users who ended up in the r/conspiracy group used more words related to “crime,” “stealing,” and “law,” compared to the control group. Control group users used more words related to “friends,” “optimism,” and “affection.”
- r/conspiracy users were also much more active in the group of Politics subreddits, “where there were 2.4 times as many r/conspiracy users as control users that posted in at least one subreddit in the group,” and they posted five times as many comments in the Politics community overall.
- They were overrepresented in what the researchers refer to as “Drugs and Bitcoin” Reddit, and in what they refer to as “Toxic Reddit,” where “the subreddits in which r/conspiracy posters are also most over-represented include several that have since been banned for questionable content, such as r/WhiteRights and r/fatpeoplehate.” (A sketch of how such an over-representation ratio can be computed follows this list.)
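The “2.4 times as many” figure is a ratio of shares: the fraction of r/conspiracy users who posted in a given group of subreddits, divided by the fraction of control users who did. Here is a small hypothetical sketch of that arithmetic; the user sets and group sizes are invented, not taken from the paper.

```python
# Hypothetical sketch of an over-representation ratio like the one quoted
# above. All numbers and user names are made up for illustration.
def over_representation(case_posters: set, control_posters: set,
                        n_cases: int, n_controls: int) -> float:
    """Share of case users posting in a subreddit group, divided by the
    share of control users posting there."""
    case_share = len(case_posters) / n_cases
    control_share = len(control_posters) / n_controls
    return case_share / control_share

# Example: 24% of cases vs. 10% of controls posted in the group -> about 2.4.
ratio = over_representation({f"c{i}" for i in range(24)},
                            {f"k{i}" for i in range(10)},
                            n_cases=100, n_controls=100)
print(round(ratio, 1))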
If you’d like to spend more time with conspiracy theorists, CNN’s Rob Picheta took a trip to the third annual Flat Earth International Conference. Here’s Mark Sargent, the “godfather of the modern flat-Earth movement” and the subject of the 2018 Netflix documentary “Behind the Curve”:
“I don’t say this often, but look — there is a downside. There’s a side effect to flat Earth … once you get into it, you automatically revisit any of your old skepticism … I don’t think [flat Earthers and populists] are just linked. They kind of feed each other … it’s a slippery slope when you think that the government has been hiding these things. All of a sudden, you become one of those people that’s like, ‘Can you trust anything on mainstream media?’”
What if the most effective deepfake video is actually a real video? And to end on a downer, just like last week, here’s Craig Silverman:
Everyone thinks there will be a rather effective deepfake video, but I wonder if, in the next year, we will see something that is actually authentic being effectively dismissed as a deepfake, which then causes a mass loss of trust.
If there is an environment in which you can undermine not what is fake, and make it convincing, but undermine what is real — that is even more of a concern for me.