
Wednesday, May 16, 2012

The CBS/NY Times pollster reinterviewed (most of) the people interviewed in a poll in April. This time the poll shows a shocking non-Rasmussen Romney lead of 3%, not a boring tie. No evidence of an actual shift in any single person's actual opinion was presented at the CBS web page (unless I missed it).

I read the CBS write-up of the poll (including the click for full results) and didn't find the key datum: which candidate the May respondents supported in April. The re-interview appears to show a very small shift to Romney, from tied in the first interview in April to up 3% now (already so tiny that the news is "almost no news here"). But the May sample is, of course, smaller than the original April sample, as some April respondents couldn't be contacted and some who were contacted refused to participate a second time. It should be very easy to look at what those who participated again in May said in April. Is the shift to Romney in spite of more attrition of Romney supporters? Partly due to more attrition of Obama supporters? More than all due to greater attrition of Obama supporters? I sure would like to know (although 3% is still just 3% -- the April sample was clearly unusually good for Romney).
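To make the decomposition concrete, here is a minimal sketch with made-up illustrative counts (not the actual CBS/NYT numbers, which are exactly the datum the write-up doesn't report) of how an observed 3-point "shift" in a re-interviewed panel can split into genuine switching versus differential attrition.

```python
# Hypothetical illustration: decomposing a re-interview "shift" into
# panel attrition vs. actual switching. All counts are invented.

april_full = {"Obama": 500, "Romney": 500}               # April: tied, 1000 respondents

# Re-interviewed in May (attrition shrinks the panel, not necessarily evenly)
panel_april_pref = {"Obama": 390, "Romney": 405}         # what the re-interviewed panel said in April
switched_to_romney = 5                                   # Obama -> Romney switchers in May
switched_to_obama = 0                                    # Romney -> Obama switchers in May

panel_may_pref = {
    "Obama": panel_april_pref["Obama"] - switched_to_romney + switched_to_obama,
    "Romney": panel_april_pref["Romney"] + switched_to_romney - switched_to_obama,
}

def margin(counts):
    """Romney lead in percentage points."""
    total = sum(counts.values())
    return 100.0 * (counts["Romney"] - counts["Obama"]) / total

april_margin = margin(april_full)             # full April sample: 0.0 (tied)
panel_april_margin = margin(panel_april_pref) # re-interviewed panel, April answers
panel_may_margin = margin(panel_may_pref)     # re-interviewed panel, May answers

print(f"April margin (full sample):        {april_margin:+.1f}")
print(f"May margin (re-interviewed panel): {panel_may_margin:+.1f}")
print(f"  due to attrition alone:          {panel_april_margin - april_margin:+.1f}")
print(f"  due to actual switching:         {panel_may_margin - panel_april_margin:+.1f}")
```

With these invented numbers, roughly two thirds of the apparent 3-point Romney gain comes from who dropped out of the panel, not from anyone changing their mind. Reporting what the May respondents said in April would settle the question immediately.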

I might have missed the number I want in the write-up, of course.


My guess is that the number of people who say they changed their mind is tiny and the shift of 3% is almost exactly all due to not-100%-successful resampling (that is, panel attrition). This is embarrassing to pollsters for at least one of two reasons. First, if responses correspond to actual voting, it suggests that there is little news in new polls (rather, for normal polls, new noise from independent sampling). This would suggest that an average of recent polls is better than the latest poll, as it is. Not good that the news is just another number to average. But the alternative is much worse. Few people saying they changed their mind is a typical pattern called anchoring. If responses are not like the actual voting of the huge majority which is not polled, then pollsters are in deep trouble. If the interaction of respondent and canvasser matters (as it does), polls can be completely misleading. A low number of reported revisions of intentions would draw attention to two big problems for pollsters and show that at least one is very big. I am suspicious enough to suspect that the omission of the key info (again, if I didn't miss it) is related to pollster self-interest.
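As a rough back-of-the-envelope for the "average of recent polls" point: assuming independent samples of similar size in a roughly 50/50 race, the sampling noise in an average of several polls shrinks like one over the square root of the number of polls. The poll sizes and count below are assumptions for illustration, not figures from the CBS write-up.

```python
import math

# Back-of-the-envelope: sampling error of one poll vs. an average of polls.
# Assumes independent simple random samples of equal size n and a close race,
# so the standard error of a candidate's share is sqrt(p*(1-p)/n).

p = 0.5   # assumed true support share (close race)
n = 1000  # assumed respondents per poll
k = 5     # assumed number of recent polls to average

se_share = math.sqrt(p * (1 - p) / n)
se_margin_single = 2 * se_share                  # margin (Romney minus Obama) has ~2x the share's error
se_margin_average = se_margin_single / math.sqrt(k)

print(f"Std. error of the margin, one poll of {n}:      {100 * se_margin_single:.1f} points")
print(f"Std. error of the margin, average of {k} polls: {100 * se_margin_average:.1f} points")
```

Under these assumptions a single poll's margin has a standard error of about 3 points, so a 3% "lead" is about one standard error of noise on its own, while the average of five such polls gets that down to roughly 1.4 points.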
