Saturday, 18 December 2010

Expectation Bias

"Expectation Bias" is a rather fancy way of saying that I have made up my mind on something and will henceforth only see the evidence that supports my view, ignoring anything which may suggest there is another view. This is something that investigators are made very aware of during training - indeed, some of us who train investigators deliberately plant "evidence" to invite students to go the wrong way as part of their learning experience.

It is a problem in many legal cases, and much of the legislation that now dictates methodologies, acceptable evidence-gathering techniques and evidence handling and processing arises from failures by investigators to eliminate their biases from the process. If, as a fire investigator, I go into a fire scene with a predetermined conclusion, I will find only the evidence I expect to find - and misinterpret or discount everything which may point to a different conclusion. It is something researchers studying human cognitive behaviours are very aware of, as I was reminded reading an article in the Scientific American recently -

The past few decades of research in cognitive, social and clinical psychology suggest that confirmation bias may be far more common than most of us realize. Even the best and the brightest scientists can be swayed by it, especially when they are deeply invested in their own hypotheses and the data are ambiguous. A baseball manager doesn’t argue with the umpire when the call is clear-cut—only when it is close.

The problem is, the more senior, the more experienced and the more 'acknowledged' you become as an 'expert', the less likely you are to admit to bias or to having made a mistake! This is certainly the case in the scientific field. It also led to one of the most tragic series of miscarriages of justice in the last decade - all because a very respected and eminent doctor developed a theory, unsupported by any research, that allowed him to declare an enormous number of children were being 'abused.' This destroyed their parents - many mothers went to jail merely on his evidence of 'abuse' - and sent the children through a truly abusive process of examinations, fostering, being declared delusional when they refused to 'confess' or 'admit' they were being abused, and finally removal from their families and friends. The doctor in question got away with it despite mounting evidence that he was totally wrong, and it was not until enough of his colleagues got up the courage to stand up in court and show why and how he was wrong that it came to a stop. But the damage had been done, and many 'lay' people still believe the garbage this man put forward - because they do not have the expertise or the knowledge to unravel what the press fed them and has singularly failed to correct. Again, this 'bias' among senior 'experts' is well known -

Scholars in the behavioral sciences, including psychology and animal behavior, may be especially prone to bias. They often make close calls about data that are open to many interpretations. Last year, for instance, Belgian neurologist Steven Laureys insisted that a comatose man could communicate through a keyboard, even after controlled tests failed to find evidence. Climate researchers trying to surmise past temperature patterns by using proxy data are also engaged in a “particularly challenging exercise because the data are incredibly messy,” says David J. Hand, a statistician at Imperial College London.

Two factors make combating confirmation bias an uphill battle. For one, data show that eminent scientists tend to be more arrogant and confident than other scientists. As a consequence, they may be especially vulnerable to confirmation bias and to wrong-headed conclusions, unless they are perpetually vigilant. Second, the mounting pressure on scholars to conduct single-hypothesis-driven research programs supported by huge federal grants is a recipe for trouble. Many scientists are highly motivated to disregard or selectively reinterpret negative results that could doom their careers. Yet when members of the scientific community see themselves as invulnerable to error, they impede progress and damage the reputation of science in the public eye. The very edifice of science hinges on the willingness of investigators to entertain the possibility that they might be wrong.

The more I have learned about the edifice that has been built around Global Warming, the more I become alarmed at the fact that the scientists behind it seem to be very selective in their "evidence" and in their "models." I fully understand the desire to find the solutions to the very complex problem that climate change presents, but I cannot escape the feeling that there is now a huge amount of "Expectation Bias" at work in the IPCC, Greenpeace, East Anglia University and other "Climate Research" bodies. Data is manipulated in ways that are simply not acceptable in my field at all - and would be thrown out of any court if presented as 'evidence.' But that is the problem, the only 'court' looking at any of this is one stuffed with 'believers' who would not dream of questioning the 'expert' or challenging his view.

Sadly, unless this changes, science will ultimately be the loser - as the Scientific American article says.
