Saturday, July 17, 2010

Researchers hoist with own petard?

In a recent post I took note of the publication of new research purporting to support the notion that partisans reject information that contradicts strongly held beliefs more readily than non-partisans do.  I have since identified and read the study by Brendan Nyhan and Jason Reifler.

The verdict?  Read on.

One of the instantly striking aspects of the study, titled "When Corrections Fail: The Persistence of Political Misperceptions," was the redefinition of "misperception" employed by the authors:
(W)e define misperceptions as cases in which people’s beliefs about factual matters are not supported by clear evidence and expert opinion—a definition that includes both false and unsubstantiated beliefs about the world.
That definition occurs in a section devoted to the case for redefining the term for purposes of the study (pp. 304-305).  I suggest to the researchers that a term redefined to reflect that "factual matters that are the subject of contemporary political debate are rarely as black and white as standard political knowledge questions" will increase the ambiguity of the finished results wherever the redefined term occurs.  In particular, if news reports seize on the term "misperception" without mentioning the spin Nyhan et al. placed on it, those reports will end up producing or reinforcing an unsubstantiated belief in the reader.  Which we might call a "misperception" regardless of whether it contradicts the expert opinions of Nyhan and Reifler.  Clear evidence, after all, is a requirement according to their definition.

The study provides fairly clear evidence that Nyhan and Reifler were "hoist with their own petard."


The basic aim of the study was to examine the effects of (clear) evidence that contradicted the established beliefs of political partisans.  Nyhan and Reifler hypothesized that partisans might be less likely to change their beliefs based on contradictory information.

The big problem?  The key findings were supposedly supported by cases where the existing beliefs were not clearly contradicted.  And the researchers apparently had difficulty discerning the problem.  Take, for example, the key exhibit of WMD in Iraq (from the Appendix):

Study 1 (WMD): News Text

Wilkes-Barre, PA, October 7, 2004 (AP)—President Bush delivered a hard-hitting speech here today that made his strategy for the remainder of the campaign crystal clear: a rousing, no-retreat defense of the Iraq war.

Bush maintained Wednesday that the war in Iraq was the right thing to do and that Iraq stood out as a place where terrorists might get weapons of mass destruction.

‘‘There was a risk, a real risk, that Saddam Hussein would pass weapons or materials or information to terrorist networks, and in the world after September the 11th, that was a risk we could not afford to take,’’ Bush said.

[Correction]

While Bush was making campaign stops in Pennsylvania, the Central Intelligence Agency released a report that concludes that Saddam Hussein did not possess stockpiles of illicit weapons at the time of the U.S. invasion in March 2003, nor was any program to produce them under way at the time. The report, authored by Charles Duelfer, who advises the director of central intelligence on Iraqi weapons, says Saddam made a decision sometime in the 1990s to destroy known stockpiles of chemical weapons. Duelfer also said that inspectors destroyed the nuclear program sometime after 1991.
The elements in the story prior to the correction make for a complex statement.  The author of the story establishes a relatively strong implication that Bush said Iraq possessed WMD, yet Bush made no direct reference to Iraqi possession of WMD, let alone to stockpiles of WMD.  In short, the news story leaves considerable ambiguity regarding the picture of Iraq's WMD.  The study describes what test subjects were asked to do with the information:
After reading the article, subjects were asked to state whether they agreed with this statement: ‘‘Immediately before the U.S. invasion, Iraq had an active weapons of mass destruction program, the ability to produce these weapons, and large stockpiles of WMD, but Saddam Hussein was able to hide or destroy these weapons right before U.S. forces arrived.’’ Responses were measured on a five-point Likert scale ranging from ‘‘strongly disagree’’ (1) to ‘‘strongly agree’’ (5).
Not only did the study's news story contain a complex statement, the study question asked subjects to rate their agreement with a different complex statement.  Suppose a subject believed that Iraq possessed an active program to produce weaponized microbes but had dissolved the program by the time of the invasion.  Nothing in the correction would contradict that belief, and that belief ought to count toward agreement on the rating scale test subjects were asked to complete.  Even the idea of large WMD stockpiles existing prior to the invasion resists contradiction in the text of the correction.
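
To make the objection concrete, here is a rough formalization of that wiggle room.  This is my own sketch, not anything from the study, and the belief labels are simply shorthand for the propositions quoted above.

# A belief state that satisfies the study's compound test statement while
# remaining consistent with everything the quoted correction actually asserts.
belief = {
    "program_immediately_before_invasion": True,     # e.g., the microbe-program example above
    "ability_to_produce_weapons": True,
    "stockpiles_immediately_before_invasion": True,
    "hidden_or_destroyed_right_before_arrival": True,
    "stockpiles_at_time_of_invasion": False,          # the correction denies this
    "program_underway_at_time_of_invasion": False,    # and this
}

# Agreement with the dependent-variable statement requires all of these:
agrees_with_test_statement = (
    belief["program_immediately_before_invasion"]
    and belief["ability_to_produce_weapons"]
    and belief["stockpiles_immediately_before_invasion"]
    and belief["hidden_or_destroyed_right_before_arrival"]
)

# Consistency with the correction requires only these two denials:
consistent_with_correction = (
    not belief["stockpiles_at_time_of_invasion"]
    and not belief["program_underway_at_time_of_invasion"]
)

print(agrees_with_test_statement, consistent_with_correction)  # True True

On that reading, a respondent could register agreement on the Likert scale without rejecting a word of the correction.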

To their credit, Nyhan and Reifler recognized many of the problems they faced in collecting legitimate data.  Nonetheless, they chose a correction that failed to correct in the case of Study 1, and that probably accounts more than anything for the disparity in results they noted with respect to Study 2:
Model 1 indicates that the WMD correction again fails to reduce overall misperceptions. However, we again add an interaction between the correction and ideology in Model 2 and find a statistically significant result. This time, however, the interaction term is negative—the opposite of the result from Study 1. Unlike the previous experiment, the correction made conservatives more likely to believe that Iraq did not have WMD.
(yellow highlights added)
The researchers added this comment following the observation above:  "It is unclear why the correction was effective for conservatives in this experiment."
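
For readers unfamiliar with the modeling language, here is a minimal sketch of the kind of analysis the passage describes.  Every name and number below is mine (simulated data, a plain linear regression rather than whatever estimator the authors actually used); the point is only to show what "adding an interaction between the correction and ideology" means and why its sign matters.

# Illustration only; not the authors' code or data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "correction": rng.integers(0, 2, n),   # 0/1: did the subject see the correction?
    "ideology": rng.integers(-3, 4, n),    # -3 (liberal) .. 3 (conservative)
})
# Simulated 1-5 misperception rating with a built-in negative correction x
# ideology effect, mimicking the Study 2 pattern described above.
df["misperception"] = (
    3 + 0.2 * df["ideology"]
    - 0.5 * df["correction"] * df["ideology"]
    + rng.normal(0, 1, n)
).clip(1, 5)

# Model 1: correction treatment only
model1 = smf.ols("misperception ~ correction", data=df).fit()
# Model 2: adds the correction x ideology interaction
model2 = smf.ols("misperception ~ correction * ideology", data=df).fit()

# A positive interaction coefficient corresponds to the Study 1 "backfire"
# pattern; a negative one, as in Study 2, means the correction reduces agreement
# with the misperception more as respondents grow more conservative.
print(model2.params["correction:ideology"])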

Is it really unclear?  As we did with the former "correction," let's examine whether this correction was specific to the belief the study was testing:
 
Study 2, Experiment 1 (WMD): News Text

[New York Times/FoxNews.com]
December 14, 2005

During a speech in Washington, DC on Wednesday, President Bush maintained that the war in Iraq was the right thing to do and that Iraq stood out as a place where terrorists might get weapons of mass destruction.

‘‘There was a risk, a real risk, that Saddam Hussein would pass weapons or materials or information to terrorist networks, and in the world after September the 11th, that was a risk we could not afford to take,’’ Bush said.

[Correction]

In 2004, the Central Intelligence Agency released a report that concludes that Saddam Hussein did not possess stockpiles of illicit weapons at the time of the U.S. invasion in March 2003, nor was any program to produce them under way at the time.

[All subjects]

The President travels to Ohio tomorrow to give another speech about Iraq.

Study 2, Experiment 1 (WMD): Dependent Variable

Immediately before the U.S. invasion, Iraq had an active weapons of mass destruction program and large stockpiles of WMD.
The last sentence represents the proposition with which respondents were to agree or disagree.  Even in this case the "correction" fails to contradict the test proposition completely, but it clearly comes much closer than the Study 1 version.  The logical wiggle room is scant, occurring mostly in the interpretation of "immediately."

The study included two other cases.

The second of the three featured a supposed correction of the notion that tax cuts increase tax revenue.  They do, in a way, but there is no good evidence that the revenue increases beyond what it would have been with a higher tax rate, at least in the U.S. context.  Note the supposed correction:
[Correction]

However, even with the recent increases, revenues in 2005 will remain well below previous projections from the Congressional Budget Office. The major tax cut of 2001 and further cuts in each of the last three years were followed by an unprecedented three-year decline in nominal tax revenues, from $2 trillion in 2000 to $1.8 trillion in 2003. Last year, revenues rebounded slightly to $1.9 trillion. But at 16.3 percent of the gross domestic product, last year’s revenue total, measured against the size of the economy, was the lowest level since 1959.
Not exactly a robust contradiction, is it?  Rather than contradicting the key proposition, the above allows for numerous factors aside from tax cuts to influence revenue.

The third of the three was the best contradiction of the lot, at least in my skewed estimation.  This one was intended to flip things around so that liberals would have the opportunity to reverse their beliefs.  Did Bush ban stem cell research?
[Correction]

However, experts pointed out that Bush’s action does not limit private funding of stem cell research. He is actually the first president to allow the use of federal funds to study human embryonic stem cells, but his policy limits federal support of such research to colonies derived from embryos already destroyed by August 2001.

Study 2, Experiment 3 (Stem Cell Research): Dependent Variable

President Bush has banned stem cell research in the United States.
Even in this case, it is possible for a liberal to plausibly reason that limiting federal funding is a type of ban.

The conclusion:
While our experiments focused on assessing the effectiveness of corrections, the results show that direct factual contradictions can actually strengthen ideologically grounded factual beliefs—an empirical finding with important theoretical implications.
The conclusion contains a germ of truth.

The study barely succeeded in providing any "factual contradictions" even in the best of the three cases.  But the "backfire effect," in which respondents confronted with information that requires adjustment of existing beliefs end up holding those beliefs more strongly, does qualify as an important finding.

I expect that the perceived success of resolving the intellectual challenge may result in increased confidence in the belief.

Finally, it cannot escape notice that Nyhan and Reifler's own conduct of the study constitutes an experiment somewhat parallel to the one they sought to run.  The researchers entered inquiry mode with a hypothesis to test, likely one that held a certain intellectual appeal for them.  They collected evidence--dubious evidence, that is--and more or less concluded that the evidence supported their hypothesis.

We are to suppose, no doubt, that "experts" are at least somewhat immune from having their ideology influence their goal-directed interpretation of information.

With a vigorous and diverse peer review process, they may be right.

But bring a skeptical eye.  Peer review is not always what it is supposed to be, and perhaps falls short of the ideal a good deal of the time.


Afters:

It occurred to me while pondering the Nyhan/Reifler re-imagining of "misperception" that belief in Iraq's lack of WMD may well have qualified as a "misperception" based on the assessment of intelligence professionals up through the start of the war.  It was, after all, a "slam dunk" that Iraq had the weapons.  At some point during the war in Iraq, things would have had to shift to the point where the belief in Iraq's possession of WMD became the misperception.

I support in principle the right of authors, including researchers, to define words however they please.  But using a definition at odds with a term's normal usage seems like a bad idea for research.  I can imagine that newspapers will have fun with stories about the study, using "misperception" without bothering to explain how the term was redefined for purposes of the research.  If that happens, then Nyhan and Reifler may end up making us all a shade more stupid with the help of the mainstream media.


July 17, 2010:  Clarified opening sentence with the addition of "more readily," supplied the "d" at the end of "increased," and added a missing "the."
July 22, 2010:  Removed a redundant phrase
