The Brain’s Hidden Censor: What It Won’t Let You See


ANSWER THESE two questions quickly: In the biblical story, what swallowed Jonah?  And how many animals of each kind did Moses take on the Ark?

People often answer “whale” to the first question and “two” to the second—even when they know that it was Noah, not Moses, who built the Ark. What’s called the Moses Illusion demonstrates “knowledge neglect”: people have relevant knowledge but fail to use it, according to Vanderbilt psychology professor Lisa Fazio.

People are very bad at picking up on factual errors in the world around them even when they know the correct information, and they will go on to use that incorrect information in other situations—one reason “fake news” is so dangerous, Fazio points out.

Vision in particular can be erroneous when the brain’s selective screener makes some things much more visible than others—the brain’s “selective selectivity,” writes Keith Payne in Scientific American. “The unconscious screener shapes what the conscious ‘you’ gets to see, but the conscious ‘you’ doesn’t have veto power over that decision.”

Recent research documents how our personal screeners’ preference for seeing people similar to us can actually block our ability to see those of other races. The tragedy: Personal contact is a powerful way to reduce prejudice, but “we cannot get to know or learn from people if we look right through them,” writes Payne.

The Cultural Cognition Project based at Yale strives to understand how people acquire and then stick to false information, as well as what strategies could help them accept accurate—if unappealing—information.

Asking why the mounting proof that climate change is a real threat doesn’t persuade more skeptics, Yale law professor Dan Kahan and his colleagues describe the wrong-headedness of the leading theory, the Science Comprehension Thesis (a version of the More Information Hypothesis): the idea that the public needs only more information and better explanations to arrive at accurate conclusions, Ezra Klein writes on Vox.

There are “some kinds of debates where people don’t want to find the right answer as much as they want to win the argument,” explains Klein. Humans “may reason for purposes other than finding the truth.”  Among those purposes: increasing their status in the community and “ensuring that they don’t piss off leaders of their tribe.”

People aren’t reasoning to get the right answer, but to get the answers they want to be right, Klein writes. In one of Kahan’s studies of attitudes toward climate change, greater scientific literacy made already-skeptical people even more skeptical.

When statistics are involved, the people who most accurately evaluate statements of fact are those who are good at math—but only until the issue becomes politicized. At that point, mathematically skilled people not only come to conclusions based on ideology but are more likely to do so than those with weak math skills.

Personal biases can be confirmed, even bolstered, by very small numbers of doubting “experts.” The book and film Merchants of Doubt show how even a single scientist—such as one well funded by the cigarette companies—can rally doubters against an enormous preponderance of scientific evidence, in this case the evidence that cigarettes cause lung cancer.

In the case of climate change, although some 99% of scientists have found evidence substantiating its devastating effects as well as the human role, a handful of experts—including the same scientist who questioned the cigarette-cancer link—provide just enough seeds of doubt for deniers to feel comfortable holding onto their incorrect convictions.

Attacking weak links is one weapon used by doubt merchants: find a mistake, exaggerate it and condemn the entire proposition. Scientists throughout history have made some whoppers. In The Truth About Animals: Stoned Sloths, Lovelorn Hippos, and Other Tales From the Wild Side of Wildlife, Lucy Cooke describes early scientific explanations—which scientists stood by adamantly—of what happens to birds that disappear in wintertime: birds hibernate underwater, or birds fly away to the moon. (In fact, they migrate.)

Such erroneous views could prevail, however, only as long as there were too few scientists and resources to test hypotheses—centuries ago, before long-distance travel was possible. Today, by contrast, hundreds, often thousands, of research studies support scientific conclusions.

Another weapon is false statistics. The work of serious climate deniers is “filled with facts and figures, graphs and charts…much of the data is wrong or irrelevant,” writes Klein. “But it feels convincing.”

Maybe the biggest reason deniers are unmoved by science is that people innately resist change—more so when urged by voices they consider to be fighting for the other team, and most of all when changing their minds could mean distancing themselves from their “tribe.”

One example of someone who risked alienation from an enormous tribe is Mark Lynas, who wrote two books about the risks of GMOs; when he re-examined the data and found that he’d come to the wrong conclusions, he went public with his about-face. “For a lot of people, it was an ‘Oh fuck’ moment,” Lynas told The Guardian. “They realised they’d been lied to, at a very profound level, by the very people they’d trusted.”

“And what of his worst fear, that he wouldn’t have any friends left at all?  ‘Well,’ he smiles sadly. ‘That’s probably what happened.’”

Even today, when some 90% of scientists say GMOs are safe, only about one-third of consumers agree. As with the childhood-vaccine debate, the GMO scare originated with a scientific paper later retracted by the journal that published it because of flaws in its methods.

Contemporary media fan the flames.  “The rage-fueled tribalism of social media, especially Twitter, has infected the op-ed pages and, to some extent, the rest of journalism. Twitter is about offering markers of affiliation or markers of disaffiliation,” according to journalist Kevin Williamson, recently at The Atlantic.

Kahan is hopeful that, “if researchers can just develop a more evidence-based model of how people treat questions of science as questions of identity, we can use reason to identify the sources of threats to our reason and…devise methods to manage and control those processes,” Klein writes.

For each of us, better understanding our brains’ selective selectivity can spur us to seek what we might have missed, look beyond news reports and check facts.  “One thing that does seem to help us [see errors and falsehoods] is to act like a professional fact-checker,” writes Lisa Fazio.  Professional fact-checkers are, she believes, “one of our best hopes for zeroing in on errors and correcting them, before the rest of us read or hear the false information and incorporate it into what we know of the world.”

—Mary Carpenter

Every Tuesday Mary Carpenter reports on well-being, taking on topics like the pros and cons of homeopathy and the benefits of psychedelic therapy. 

