Recently, I wrote about why many believe Hillary Clinton is in poor health. As a result, readers sent me tweets providing more "evidence" — this time, that her eyes point in different directions.
To be clear, there is no evidence that Clinton is in failing health. But this belief in entirely unfounded narratives isn't new; American politics has spawned a variety of out-there theories adopted by people on both sides of the aisle — that Sept. 11 was an inside job, for example, or that President Obama was born in Kenya.
There's no hard evidence for these stories, but sizable shares of the American public have believed them. It turns out that it's really hard to change people's minds once they believe a piece of misinformation — even when there's better, contradictory data. In fact — and troublingly for those of us who write about these things — even refuting these rumors in a news story (yes, as we have done) can backfire and make people believe them more.
In addition, political science research has found that some of the most immovable forces in American politics — polarization, low trust and opportunistic politicians — seem destined to keep these evidence-free ideas in Americans' heads.
How misinformation gets lodged in our brains
To understand why it's so hard to get our minds to unlatch from some of these ideas, it helps to know why people latch on in the first place. Here's a quick rundown of the possibilities that political science and psychological research have uncovered:
Low trust. Here's a factor that makes obvious sense. Various studies have shown that a lack of trust, whether in other people or governmental figures, is associated with a willingness to believe conspiracy theories, as Slate outlined in 2013. In a 2015 study, researchers from the University of Minnesota and Colorado State University likewise found that low trust in a variety of areas (in government, the media, law enforcement, and other people) is correlated with a person's willingness to believe conspiracy theories — and in particular, theories that make the other party look bad.
It's not hard to see how people who don't trust the government also buy into theories about cover-ups and shadowy plots: "[I]f they believe the world is a trustworthy place, they are less able to convince themselves that political rivals are engaging in nefarious, secretive plots," the authors of that 2015 study wrote.
(However, the causality may flow in both directions here; another recent study from researchers at Boston University showed that exposure to conspiracy theories made people more likely to distrust the government.)
Motivated reasoning. This is a bit of jargon for the idea that people interpret new information in a way that confirms their existing worldviews and identities — even when that information contradicts those views. For example, when a new study shows evidence of climate change, a climate change skeptic might shrug it off as a bad study, a fluke or a hoax.
So it's not terribly surprising that if you identify strongly as a Democrat, you'd be more likely to believe misinformation that paints Republicans in a bad light, and vice versa. This is also tied to low trust — if you firmly believe the government is untrustworthy, you might be more willing to believe that it faked the moon landing, for example, even if evidence says otherwise.
During the George W. Bush presidency, far more Democrats than Republicans believed that the U.S. government was covering up information about the Sept. 11 attacks. Likewise, surveys have shown that a large share of Republicans believe President Obama is Muslim — in one 2015 survey, it was 43 percent, compared with 15 percent of Democrats.
Even worse, the trend of increasing political polarization — the kind the U.S. has been experiencing for decades — may further amplify the motivated reasoning effect, according to a 2013 study.
As people dig into their positions and identify thoroughly with a particular side, changing misguided beliefs becomes all the harder; our beliefs become linked to who we think we are. As Dartmouth political scientist Brendan Nyhan — one of the foremost researchers on why people believe misinformation — put it in a 2016 paper, beliefs "seem to be closely linked to people's worldviews and may be accordingly difficult to dislodge without threatening their identity or sense of self."
Party identification. Low trust is associated with a readiness to believe conspiracy theories among Democrats and Republicans alike, but the same may not be true of political knowledge. The 2015 study that linked low trust with belief in misinformation also found that high-knowledge conservatives were more likely to believe unfounded theories, while the same wasn't true of high-knowledge liberals.
If that's true, then "it means that conservative politicians and pundits can more readily rely on conspiracies as an effective means to activate their base than liberals," the authors wrote.
Importantly, however, this may be a function of current politics — that is, at a time when a Democratic president has been in office for nearly eight years. Right now, in particular, Republicans far more than Democrats believe they are the "losers" in politics, as Kyle Saunders, one of the study's authors, pointed out in an email.
That may be one reason for the discrepancy in their willingness to believe in conspiracy theories about Democrats — after all, people who believe their team is losing are also more likely to feel angry and frustrated at the government, as the Pew Research Center has found. And, likewise, it might square with the fact that conspiracy theories against Republicans were common while Bush was in office.
Repetition. Simply talking about a rumor — even when you're correcting it — can make people believe it more. In a 2015 study, MIT political science professor Adam Berinsky presented subjects with news stories about Obamacare "death panels" and the idea that elderly patients would be forced to discuss euthanasia with their doctors.
The stories came in several variants: Some simply mentioned the rumor, while others paired it with a refutation from a nonpartisan source like the American Medical Association or from a Democratic or Republican politician.
He found that GOP refutations were the most effective — a Republican senator saying, "No, there is no euthanasia clause in the ACA," appeared to change people's minds more than a statement from the American Medical Association did. Meanwhile, hearing the rumors alongside a correction from a nonpartisan or Democratic source may have even raised the number of people who accepted the rumors.
Not only that, but he also found that asking people to recall entire articles weeks later — even articles containing refutations — made them more likely to believe the rumors than if they were asked to recall only a minor detail from the articles. That is evidence of what psychologists call "fluency" — the notion that people find a more familiar idea more believable.
(This makes covering a particularly loud rumor a dilemma for journalists: Covering it could spread misinformation, but ignoring it means failing to explain a story to your audience. The best answer, Nyhan wrote in the Columbia Journalism Review, is to label conspiracy theories for what they are and not give them more room than necessary.)
So how do you get misinformation out of people's heads?
The above list provides a few clues as to how people's false beliefs might be reversed ... and the picture isn't promising. For his part, Berinsky says that he feels he has been "largely unsuccessful" in figuring out how to get people to give up their unfounded beliefs.
One less-than-encouraging trend: Trust in government — as well as in a variety of other institutions, and even in other people — has plummeted in recent decades. Reversing that kind of seismic shift in the national psyche could be next to impossible.
Likewise, if polarization does make motivated reasoning worse, that's another big problem. Polarization among members of Congress is at its highest point since at least the late 19th century, and polarization among Americans has also grown drastically, the result of an array of factors that seem difficult, if not impossible, to reverse: self-sorting; segmented media; shifts in which political issues matter; and the party realignments of the 1960s and '70s.
However, there are a few lessons here. For example, getting the right person to do the fact-checking can make all the difference. Berinsky pointed to McDonald's decision years ago to get rid of Super Size fries amid widespread demand for more healthful fast-food options.
"If a business is selling french fries and they're telling you french fries aren't good for you, that's a really credible source," Berinsky said.
That means a Democrat denouncing Sept. 11 "truthers" or John McCain telling a town hall attendee that Obama is not a Muslim would probably be more effective corrections than if a nonpartisan source had made them.
So when former Speaker of the House (and Trump supporter) Newt Gingrich warned Fox & Friends hosts against trusting TV-doctor diagnoses, that might have served to shake a few #HillaryHealth believers.
Journalists can also draw a few useful lessons from these studies: Repeating a rumor too much can make it stick, for example, and, as Nyhan has found, using graphics in a fact check seems to make the correction stick more easily.
The broader picture
Political conspiracy theories are nothing new in the U.S. — just look to Richard Hofstadter's classic 1964 essay on the "paranoid style in American politics" for proof. He lists our country's long history of believing in shadowy forces — in the 1700s and 1800s, conspiracy theories about Masons and the Illuminati were common, for example.
Still, there's some evidence — if anecdotal — that the problem is getting worse. In a December article on conspiracy theories, Vox's David Roberts pointed to a quote from Republican Rep. Devin Nunes, as reported by Ryan Lizza:
"The overwhelming majority of [Rep. Nunes's] constituent mail is now about the far-out ideas, and only a small portion is 'based on something that is mostly true.' He added, 'It's dramatically changed politics and politicians, and what they're doing.' "
Belief in rumors and conspiracy theories appears shallow but widespread — it's not that a lot of Americans fear chemtrails, wear tinfoil hats, and believe that mysterious cigarette-smoking men are hiding the truth about UFOs. In that 2015 paper, Berinsky presented subjects with seven rumors and found that only 5 percent of subjects believed all seven. On average, however, each subject believed 1.8 of them.
"It's not that there are some people who believe a lot of crazy things," Berinsky has said. "There are a lot of people who believe some crazy things."
All this means politicians can usually rely on at least some people to believe and spread misinformation, and some exploit that power (and will continue to do so). Reversing all this misinformation? That ... well, that could take a while.
"All in all, this would appear to be a difficult cycle to break as identities have become even more hardened over time," Saunders said. "Things can change, and the feedback loop can be escaped, but it will be a long, slow process unless dramatic change occurs on the American political landscape."