Monday, May 21, 2012

motivated reasoning

We're all familiar with the kind of biased perception that occurs in players and their fans during sporting events. Everybody sees the same play--say a close play on a runner in baseball trying to beat out a bunt--but one side sees the runner as safe, the other side sees the runner as out. Whatever the umpire sees, his judgment--as well as his ancestral lineage--will be called into question by one side or the other. This kind of biased perception might be called motivated perception because of its similarity to what social psychologists call motivated reasoning.

Motivated reasoning takes confirmation bias to the next level. Ordinary confirmation bias makes it cognitively easy to recognize data that support what we already believe. And ordinary confirmation bias makes it difficult for us to perceive data that disconfirm what we believe. Motivated reasoning takes disconfirming data and turns it into confirming data. For example, in a study of "30 committed partisans during the U.S. Presidential election of 2004," Drew Westen et al. "presented subjects with reasoning tasks involving judgments about information threatening to their own candidate, the opposing candidate, or neutral control targets." Even though the evidence was made up and presented equally to the partisan subjects, they took evidence against their own candidate and made it favorable and they took evidence favorable to the opposing candidate and made it unfavorable. Other studies have found something similar: when we have a strong emotional commitment to a belief, we don't just dismiss disconfirming evidence, we rationalize it and twist it so that it becomes confirming evidence. This all happens, of course, at the unconscious level. Consciously, we think we are being objective and unbiased in our evaluations. Psychologists call this the illusion of objectivity.
Consider the following research on the death penalty. People who either supported or opposed capital punishment on the theory that it deterred crime (or didn't) were shown two phony studies. Each study employed a different statistical method to prove its point. Let's call them method A and method B. For half the subjects, the study that used method A concluded that capital punishment works as a deterrent, and the study that used method B concluded that it doesn't. The other subjects saw studies in which the conclusions were reversed. If people were objective, those on both sides would agree that either method A or method B was the best approach regardless of whether it supported or undermined their prior belief (or they'd agree that it was a tie). But that's not what happened. Subjects readily offered criticisms such as "There were too many variables," "I don't think they have a complete enough collection of data," and "The evidence given is relatively meaningless." But both sides lauded whatever method supported their belief and trashed whatever method did not. (Mlodinow, Leonard. 2012. Subliminal: How Your Unconscious Mind Rules Your Behavior. Random House. Kindle edition, locations 3800-3809.)
This is disconcerting. We think we're unbiased and objective, but unconsciously we are driven to evaluate data in ways that confirm what we already believe and disconfirm what we already reject, regardless of the actual quality of the evidence.

In my view, there are two main ways that motivated reasoning manifests itself: by selective use of evidence and by giving improper weight to various kinds of evidence. I discuss this problem in chapter eight ("The Fallacy-Driven Life") of Unnatural Acts: Critical Thinking, Skepticism, and Science Exposed! These two tendencies often work together. For example, many anti-vaccination folks not only give more weight to anecdotal evidence than to scientific studies, but also ignore all anecdotes that run contrary to their beliefs. The only evidence some people need to convince them that vaccinations cause autism is that their child, or some child they've read about, was diagnosed with autism some time after getting a vaccination. They not only ignore the scientific studies that have found no link between vaccinations and autism, they also ignore both the individual cases of children who were vaccinated and never developed autism and the individual cases of children who were not vaccinated but developed autism anyway.

People who make a living claiming to get messages from the "spirit" world depend on believers ignoring both individual errors from so-called psychics and scientific studies that fail to confirm psychic abilities. There are also those who will appeal to scientific studies to support their belief in psychic powers, regardless of the quality of those studies; these believers will also ignore all the studies that don't support their beliefs.

Young Earth creationists (YECs) provide an excellent example of motivated reasoning. To maintain their position, YECs must reject nearly all of modern science, confabulate new laws of nature and new rules of logic and evidence, and subject themselves to ridicule for their willful ignorance and irrational adherence to the myths of an ancient, pre-scientific people. Anti-evolutionists who accept that the universe is billions of years old are another example of motivated reasoning, though their rationalizations need not be nearly as convoluted as those of the YECs.

Anthropogenic global warming deniers demonstrate motivated reasoning when they put more weight on the views of 31,000 petition-signing scientists--few of whom were climate scientists--than on the views of the vast majority of climate scientists. It would not take much investigation to find out that what motivates the deniers is not the evidence but their political and economic beliefs. (Here we are not talking about disagreements over policy, but over whether human behaviors and practices are largely responsible for global warming.)

Nobody is immune to motivated reasoning. Worse, it is often accompanied by an attitude of mistrust regarding the motives of those who disagree with us. Combine motivated reasoning with our own sense of being unbiased and objective, while being sure that our opponent is biased and not objective, and you have a recipe for predictable obstinacy. It's amazing anybody ever changes his mind about anything he feels strongly about! Yet, it happens.

Psychiatrist Dr. Robert Spitzer (b. 1932), a proponent of "reparative therapy," the treatment of homosexuals aimed at changing their sexual orientation, changed his mind about this highly emotional issue. Spitzer not only changed his mind, he issued a public apology to the gay community:
I believe I owe the gay community an apology for my study making unproven claims of the efficacy of reparative therapy. I also apologize to any gay person who wasted time and energy undergoing some form of reparative therapy because they believed that I had proven that reparative therapy works with some “highly motivated” individuals.
In 2001, Spitzer delivered a paper before the American Psychiatric Association (APA) and claimed that his study of 200 homosexuals found that 66% of the men and 44% of the women had achieved "good heterosexual functioning" through therapy.* The APA officially disavowed the paper, and when it was published in the peer-reviewed journal Archives of Sexual Behavior in 2003, it was heavily criticized in the psychiatric community for its sampling method and for the criteria used to measure success. Spitzer now (April 2012) admits that his critics were right. His apology was published in the same journal as his original paper. In part, he writes:
The Fatal Flaw in the Study – There was no way to judge the credibility of subject reports of change in sexual orientation. I offered several (unconvincing) reasons why it was reasonable to assume that the subject’s reports of change were credible and not self-deception or outright lying. But the simple fact is that there was no way to determine if the subject’s accounts of change were valid.
So, it is possible to change one's mind about issues one has a deep emotional investment in. Why some people change their minds when they re-examine the evidence while others are determined to die with their biases on is puzzling. It is doubtful the answer will come by doing fMRIs on the Robert Spitzers of the world and their new opponents. In any case, we know that some people do change their minds about highly emotional issues and that many people are on the fence about many such issues. This should give hope to those of us who engage in public argumentation on these kinds of topics that our efforts are not in vain.
