Thursday 28 August 2014

More information doesn't necessarily improve our decisions

You have a decision to make, so you decide to confer with a colleague who is well informed and a good logical thinker, an all-round bright guy. Sounds like just the person you need. Perhaps not.

A study by Nyhan and colleagues at Dartmouth College looked at how people come to hold false beliefs. The study provided parents with comprehensive information about vaccines, with the aim of countering the false beliefs some people hold about links between vaccines and autism. The assumption was that the better informed parents became, the more willing they would be to vaccinate.

Interestingly, this was not how it turned out. Making people better informed, giving them more objective facts, did not make them more likely to vaccinate. People simply used the additional information to support or rationalise their pre-existing viewpoint. Worse still, it gave them ‘evidence’ to hold their views a little more tightly. They became more polarised and extreme in their opinions.

In one way this is not surprising. It touches on a number of well-known biases, such as confirmation bias, where we take in the information that suits us and ignore everything else. It also highlights how much our thinking is guided by our belief systems: we tend to use facts to feed our beliefs.

In another study, Lewandowsky and his colleagues at the University of Western Australia had participants read a report about a robbery at a liquor store. Everyone read the same report, but in some cases racial information about the perpetrators was included and in others it wasn't. In one scenario the suspects were Caucasian, and in another they were Aboriginal. At the end of the report, participants were told that the racial information was incorrect and should be ignored.

Participants were then asked to recall details of the robbery (such as the type of car used) and also to speculate on aspects of the crime (who may have carried it out, why violence was used). Separately, participants took part in an assessment of racial prejudice against Aboriginals.

All the participants recalled the details of the crime. However, the participants who scored highest on racial prejudice continued to rely on the racial misinformation identifying the robbers as Aboriginal, even though they knew it had been corrected. They answered the factual questions accurately (type of car, times, what was taken) but still relied on race when speculating about who carried out the crime.

This is similar to Nyhan's study. Providing facts and correcting the record does not change beliefs that easily. It is not that people are ill-informed; it is that they use facts to bolster what they already believe.

The point here is that being well informed, or getting access to more information, may not improve our decisions or choices. We simply follow our beliefs. If we are good logical thinkers, we will be able to neatly organise the facts into a decent argument for why our beliefs are right.

The smarter we are, the better we can weave our supporting argument. This is why we see relatively smart people (and, in fairness, some not-so-smart ones too) on both sides of the global warming debate, and among both liberals and conservatives. They hold beliefs that are not shifted by information, and they can create a logical rationale to support their stance.

Getting back to where we started: rather than conferring with someone who is very well informed or really smart, find someone with an open mind (they can still be smart) who does not hold strong beliefs on the matter in hand. Give them the facts and you may get a more objective appraisal, as long as your own beliefs do not get in the way.
