People don’t like being told they’re wrong. Most of us want to believe what we like, and we fall prey to various forms of confirmation bias in the hope of sheltering our tender egos. One such phenomenon is belief perseverance: “the persistence of one’s initial concepts, as when the basis for one’s belief is discredited but an explanation of why the belief might be true survives” (Social Psychology, Myers).
Mass media lends itself to such incidents regularly. Bad journalism happens when news organizations rush to release incomplete or even inaccurate information in order to “break” a story. Myers uses the example of a study demonstrating how misinformation persisted in the memories of Americans. When we hear something in the news, if it’s something we feel we can explain and understand, and if we aren’t skeptical to begin with, we’re more likely to retain that misinformation and even adhere to it.
The researchers also classified people as skeptical if they disagreed with the official reason given for the war: ridding Iraq of weapons of mass destruction (WMDs).
The results showed there were far fewer skeptics in the US than in Germany and Australia, and that such skeptics were less likely to believe statements they knew had been retracted than people classified as non-skeptical.
Most people in Germany and Australia opposed the Iraq war in the first place, but non-skeptics in America were more receptive to news about the war. As Lewandowsky said, “People do not discount corrected information unless they are suspicious about it or unless they are given some other hypothesis with which to interpret the information.”
Alternative frameworks and points of view become important for critical thinking, not just for summary opinion but as part of the process by which we form those opinions. Yet this isn’t how we address the problem; for the most part, we depend on objectivity and a supposed lack of bias, which is impossible to achieve. Still, a slavish devotion to that illusion creates uniformity among most news organizations. (Uniformity in message control is also an extremely effective way of managing propaganda campaigns.)
Lord, Lepper and Preston (1984) found that “the cognitive strategy of considering opposite possibilities promoted impartiality.” Myers also points out that explaining why opposite theories might be true helps counter belief perseverance. Even simply imagining an alternative outcome helps people overcome it (Hirt & Markman, 1995; Anderson & Sechler, 1986).
This means that news organizations are remiss in not pursuing “alternative outcomes” or other hypotheses, submitting instead to existing frames. Providing those differing perspectives and shielding the public from misinformation is apparently not worth the added effort required for each story that runs this risk. Or, from an even more cynical perspective, it interferes with message control.