Scripps poll: One-third see U.S. hand in 9/11
Thomas Hargrove and Guido H. Stempel III, Scripps Howard News Service
August 2, 2006
"More than a third of the American public suspects that federal officials assisted in the Sept. 11 terrorist attacks or took no action to stop them so the United States could go to war in the Middle East, according to a new Scripps Howard/Ohio University poll.
"The national survey of 1,010 adults also found that anger against the federal government is at record levels, with 54 percent saying they "personally are more angry" at the government than they used to be.
"Widespread resentment and alienation toward the national government appears to be fueling a growing acceptance of conspiracy theories about the 2001 attacks on the World Trade Center and the Pentagon.
"Suspicions that the 9/11 attacks were 'an inside job' - the common phrase used by conspiracy theorists on the Internet - quickly have become nearly as popular as decades-old conspiracy theories that the federal government was responsible for President John F. Kennedy's assassination and that it has covered up proof of space aliens.
"Thirty-six percent of respondents overall said it is 'very likely' or 'somewhat likely' that federal officials either participated in the attacks on the World Trade Center and the Pentagon or took no action to stop them 'because they wanted the United States to go to war in the Middle East.'
So, this is essentially rough parity with the percentage of the populace who still support Dubya. But as far as I can tell they didn't ask the logical follow-up question: "Do you really care?"
Links to the whole story: One, Two, Three.
7 comments:
I think this is a very fishy poll. If the Thomas Hargrove who works for SHNS is the same guy who has written a ton of articles about the conspiracies to steal the elections of 2000 and 2004, then there is an obvious question of intent here. If you look at the SHNS website and check out the questions you'll see a classic example of railroading respondents. The sequence of questions leading up to the 9/11 conspiracy question in this survey clearly should have been expected to influence the way people answered. Was this incompetence, or is this another example of a survey whose main intent is to affect public opinion rather than glean it?
Well, here is a link to info about this Thomas Hargrove (a graduate of the University of Missouri), and yes, it does appear that he has written about election issues. However, Guido Stempel is a distinguished academic and director of the Scripps Survey Research Center at Ohio University. His bio doesn't read as if he has an agenda. In regard to "railroading respondents" and "affect[ing] public opinion," I am a lot more concerned about Rupert Murdoch than I am about Thomas Hargrove.
No need to get defensive about it, a bad poll is a bad poll. The point is that these aspects of this study undermine the case it makes. I was surprised to see an organization with the credibility and visibility of Scripps-Howard present poll results that are so misleading.
Didn't mean to sound defensive about it; your points seem well-measured and reasonable - I haven't been able to find a transcript of the questions as asked, which would certainly be helpful for evaluating any prospective bias - please post a URL if you found one. I was most interested in the results as a bellwether of sorts for a perceived (or imagined) mounting distrust of the current administration's unabashed freeform concept of honesty.
Here is the URL for the questions:
http://newspolls.org/survey.php?survey_id=23
I wrote to Dr. Stempel, and he confirmed that the questions were asked in the same order as they appear at the web site.
It's interesting to read the earlier questions, although the sequence issue would affect some of the responses to these questions, too.
I think more of an attempt to screen out such effects would be apparent if Scripps-Howard had been seriously interested in finding out people's attitudes with this survey.
OK, so I read the questions. I am not sure that one can take such a mechanistic perspective about those earlier questions creating a mindset that would influence the answers to the later questions. Do you have good evidence that this is such a common phenomenon in polls?
And the only take-home that I can extract from the questions regarding 9/11 is that for most of that 36 percent, it's not gov't. complicity that they believe in, but failure (perhaps deliberate - arguably a sort of complicity) to act. Look at the percentages for the other 9/11-related questions - less than 20% doubt the official story of both events in NYC & DC.
Any in-depth discussion of the use and abuse of polling data will make the point that the order or sequence of questions has an effect on how later questions are answered. This can happen for many reasons: because information is "seeded" in the mind of the respondent by earlier questions, because the issue is framed early in the survey, because euphemisms and words with strong negative or positive connotations are carefully chosen early in the survey, and so on.
One can find a helpful short and clear presentation of this and other issues in the form of twenty questions designed for use by journalists at: http://www.publicagenda.org/polling/polling_20q.cfm
Here is the relevant paragraph from that list of questions:
"14. In what order were the questions asked?
Sometimes the very order of the questions can have an impact on the results. Often that impact is intentional; sometimes, it is not. The impact of order can often be subtle. During troubled economic times, for example, if people are asked what they think of the economy before they are asked their opinion of the president, the presidential popularity rating will probably be lower than if you had reversed the questions. And in good economic times, the opposite is true. In political polls, campaign consultants often ask a series of questions about various issue positions of the candidates--or various things that could be said about the candidates. After these questions are asked, the horse-race question is asked, usually for the second time in the poll. This second horse-race question is then examined to see if the questions about issues and positions swayed any opinions. This may be a good way to test issues. It is a poor way to test the candidates' true standings in the public's mind. What is important here is whether the questions that went before the important question affect the results. If the poll asks questions about abortion just before a question about an abortion ballot measure, those previous questions could sway the results."
Finally, for a helpful review of this and other issues related to the use of polling data in political arguments see:
http://www.brookings.edu/press/review/summer2003/loomis.htm
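To make the order-effect point concrete, here is a rough sketch - with entirely made-up numbers, since Scripps-Howard published no split-sample data - of how a split-ballot experiment, the standard way pollsters check for order effects, could be analyzed: half the sample gets the anger/cynicism questions before the 9/11 question, half gets the 9/11 question first, and the two "likely" shares are compared against sampling noise.

```python
# Hypothetical split-ballot check for a question-order effect.
# All counts below are invented for illustration only.
from math import sqrt

def two_proportion_z(yes_a, n_a, yes_b, n_b):
    """Z statistic for the difference between two sample proportions."""
    p_a, p_b = yes_a / n_a, yes_b / n_b
    p_pool = (yes_a + yes_b) / (n_a + n_b)          # pooled proportion under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Imaginary split of 1,010 respondents into two question orderings:
# Form A (anger questions asked first): 200 of 505 answer "very/somewhat likely"
# Form B (9/11 question asked first):   165 of 505 answer "very/somewhat likely"
z = two_proportion_z(200, 505, 165, 505)
print(f"z = {z:.2f}")  # |z| > 1.96 would suggest a real order effect at the 5% level
```

If the gap between the two forms is larger than sampling error would explain, that is evidence the earlier questions are doing some of the work - which is exactly the possibility the single-form Scripps-Howard design cannot rule out.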
My interpretation of the survey at issue is that the earlier questions highlight anger and cynicism about Bush and government in general before asking a question about whether it is somewhat likely that the administration was somehow complicit in the 9/11 attacks, even if just by omission. It seems to me obvious that the way the question was asked had an influence on the way people tended to answer it.
You can keep defending the objectivity of this survey if you believe it is necessary to do so. Reasonable people can differ on matters of interpretation. My point is that people who take political issues seriously should do their best to avoid framing arguments critical of a government full of liars and misleaders with information that is carelessly gathered and possibly misleading or false. Surveys don't fall from the sky or occur naturally in the environment. They are put together by people, and many choices are made in that process. That's really all I have to say about the matter.