On January 8, two days after former president Donald Trump incited a mob attack on the U.S. Capitol, a 57-year-old Texas man tweeted a list of the penalties Trump would incur if he were impeached a second time. The listed penalties included the loss of Trump’s presidential pension, the loss of his Secret Service detail, and the loss of his ability to run again in 2024.
The tweet was reposted on a left-leaning Facebook page, at which point it went viral. The original tweet, which has since been deleted, accrued hundreds of thousands of “likes” and tens of thousands of retweets. While its assertions had the ring of truth, nearly all of its claims were either false or misleading.
“I think that part of the reason that tweet was shared by so many people is that it was consistent with something a lot of people wanted to believe, and they didn’t think about whether the information was actually true,” says Laura Scherer, PhD, a social psychologist and assistant professor at the University of Colorado.
Some of Scherer’s recent work has examined the spread of health misinformation online, particularly on social media. She says a number of factors lead people to believe in, and share, lies. While some of these factors involve a person’s preexisting knowledge or health literacy, other influences can lead even astute and knowledgeable people to buy into blatant falsehoods.
At a time when “alternative facts,” deepfakes, and other forms of misinformation are increasingly commonplace, it’s more important than ever for people to recognize the circumstances that make them vulnerable to false information. There’s new evidence that strong emotion — and, in particular, anger — may allow falsehoods to flourish.
Enragement is engagement
For a 2020 study appearing in the journal Experimental Psychology, researchers split 79 people into two groups. Both groups watched a short movie clip and then answered questions about its content. Some of those questions included false or misleading statements.
The catch: Before the movie quiz, the experimenters purposely angered the members of one group by interrupting them, insulting them, and otherwise treating them rudely.
When answering questions about the movie, the angry people were much more likely to endorse false statements than the people who had not been primed to feel bent out of shape. The angry people were also more confident in their incorrect answers than those who were not emotionally riled up. “Anger increased susceptibility to misinformation,” the study concluded.
“There’s a lot of research examining the impacts of emotion on brain structures and processing,” says Michael Greenstein, PhD, first author of the study and an assistant professor of psychology at Framingham State University in Massachusetts. Anger seems to reduce people’s skepticism, increase their confidence, and “streamline” their thinking — all of which may make them vulnerable to misinformation, he and his study coauthor explained in their paper.
The University of Colorado’s Scherer says she’s not surprised by this finding. “We know that people tend to pay attention to and share information that makes them feel outraged,” she says. “So if something is a bit outrageous, and especially if it’s consistent with a person’s point of view, people want to share that and they often don’t stop to consider if the thing is true.”
Why would the angry brain respond this way? Looked at through the lens of evolutionary psychology, anger is an emotion that people experience during periods of conflict or danger. At such times, it makes sense that the brain would prioritize quick, self-confident decisions and rallying others to one’s side over deliberation and honest communication.
Keeping bad information out of your head
Along with emotions like anger, there are dozens of other variables — many of which seem endemic to today’s online media landscape — that increase a person’s susceptibility to misinformation.
“People tend to be more persuaded by sources they see as like themselves,” Scherer explains. If a piece of information is coming to you from a friend — or from a stranger with whom you feel a friend-like affinity — you may be less likely to weigh the truth of that information before passing it on to the people in your network.
Scherer also says that the mere repetition of a false statement can make it seem more believable. “The more you repeat a message, the more likely it is that people will remember it and remember it as true,” she explains. “You see public figures using this technique to drive home their points.”
With so many ways for bad information to set up shop in a person’s head, it can feel as though disconnecting is the only way to escape the misinformation whirlpool. But Scherer and Greenstein offer some alternatives to tuning out.
“We know that if you prompt people to think about whether a piece of information is true or false, it can improve their accuracy in identifying misinformation,” Scherer says. This kind of scrutiny can be difficult to summon when your brain is skimming over headlines, posts, and tweets. But slowing down and taking a moment to activate your brain’s bullshit detector may help you weed out the blatant falsehoods.
“Be particularly skeptical of information that causes you outrage,” she adds. “When you feel outrage coming on, check that information with other sources and maybe think more carefully about it before you share it.”
Greenstein says that expanding your network of trusted sources can also be helpful. “Read multiple, reliable sources on a topic,” he says. “Pay attention not only to where they agree but to where they disagree.” Even generally trustworthy people and outlets can get things wrong, he adds, so it’s important to diversify your sources of information.
Finally, Scherer recommends — whenever possible — trying to find data or objective evidence to support your strongly held beliefs. You’re not going to apply this standard to every bit of new info that floats your way. But when it comes to those viewpoints that will have a major impact on your life — for example, whether to get the coronavirus vaccine — it’s worth taking extra time to seek out the facts.
“Feelings and worries can mislead us,” she adds. But there are ways to resist the undercurrent of bad information.