Sports fans have been thrilled to learn that Major League Baseball will be back in May.
Okay, that’s false. But if you’re like most people, that false statement will linger in your memory, making you think, in some part of your mind, that baseball might indeed be returning pretty soon. (Sorry!)
The broader phenomenon is something that psychologists call “truth bias”: People show a general tendency to think that statements are truthful, even if they have good reason to disbelieve those statements.
If, for example, people are provided with information that has clearly been discredited, they might nonetheless rely on that information in forming their judgments. Similarly, people are more likely to misremember, as true, a statement that they have been explicitly told is false than to misremember, as false, a statement that they have been explicitly told is true.
It follows that if you are told that some public official is a liar and a crook, you might continue to believe that even after you learn that she’s perfectly honest. And if you are told that if you’re under the age of 50, you really don’t need to worry about the coronavirus, you might hold onto that belief, at least in some part of your mind, even after you are informed that people under 50 can get really sick.
The underlying problem goes by an unlovely name: “meta-cognitive myopia.” The basic idea is that people are highly attuned to “primary information”: whether the weather report says that it is going to be cold today, whether a candidate for public office claims that he was a war hero, whether the local newspaper reports that a famous television star committed a drug offense.
By contrast, we are less attuned to “meta-information,” meaning information about whether primary information is accurate. If you are given a clear signal that the supposed weather report was a joke, or that a public official is distorting his record to attract votes, you won’t exactly ignore the signal. But if you’re like most people, you will give it less weight than you should.
A powerful recent demonstration of truth bias, published last week, comes from Oxford University’s Myrto Pantazi, along with Olivier Klein and Mikhail Kissine, both of the Free University of Brussels.
The researchers gave a large number of participants information about legal cases involving criminal defendants. Participants were told that some information bearing on sentencing decisions was false. They were asked to come up with an appropriate prison term and also to say how dangerous the defendant was. The main question was whether people would adequately discount information that they were told was false, so that it would not influence their judgments.
The answer is that they didn’t.
When people received negative information about the defendant, they were influenced by it, even if they were explicitly informed that it was false. As the authors put it, “Jurors may judge defendants based on evidence that they encounter, even if they clearly know this evidence to be false.” Consistent with other research, the authors also found that their participants tended to misremember false evidence as true — and did so more often than they misremembered true evidence as false.
The authors ran the same experiment with professional judges. Remarkably, they obtained the same basic results. Even if you are an experienced judge, false information about a criminal defendant might well affect your conclusions — and you might well remember it as true. Negative information in particular leaves a kind of stamp on the human mind, and it isn’t easy to remove.
There are large implications here. In the courtroom, judges and lawyers often think that it’s okay to allow untrue or inadmissible information to get before the jury, so long as the judge can say, at the appropriate time, “Jurors, you should ignore that information.” Often, jurors will be unable to ignore it, even when they try.
In politics generally, and in terms of public health issues, there are a lot of false statements out there (including many from President Donald Trump). Newspapers, magazines and social media platforms often emphasize that such statements can be and often are corrected. They hide behind that comforting fact.
It’s literally true. But in terms of how the human mind works, it’s misleading.
With respect to demonstrably harmful falsehoods, there’s an increasingly strong argument for this conclusion: It’s much better not to circulate them in the first place.

(Bloomberg)