- Special Projects
Hawaii’s political polls are regularly erratic and wrong. As a recent Washington Post headline put it, Hawaii is a place “where good polling goes to die.”
One glaring example: both the Civil Beat and Star-Advertiser polls far underestimated the size of David Ige’s 2014 gubernatorial primary victory over Neil Abercrombie, just as they had far underestimated the size of Abercrombie’s primary victory over Mufi Hannemann four years earlier.
Pollsters often describe their difficulties here in ethnic or cultural terms. Those doggone pesky local Japanese voters just won’t cooperate. Local people don’t like pollsters with Mainland accents. They don’t respond. That’s why things often go south after the final election poll. What can you do?
Hawaii is notoriously difficult to poll. That’s become a mantra.
But if your plumber tells you that she can’t fix your leak because the broken pipe is notoriously difficult to plumb, that is bad plumbing.
By the same token, saying over and over again that Hawaii is notoriously hard to survey is bad polling.
The back-and-forth arguments about the quality of Hawaii’s polls, Civil Beat’s versus the Star-Advertiser’s, have a stale, unproductive, old-school feel to them.
Matthew Fitch and Seth Rosenthal of Merriman River Group, the pollster Civil Beat uses, recently offered a very useful analysis of their recent polls, but that analysis approached Hawaii’s polling challenges in the same old ways.
It is time for some fresh thinking, and because of recent developments in polling and survey research there is plenty that’s fresh to think about. Start with something like this:
“Join YouGov today to take part in surveys like these and earn money…”
This is not some come-on for a rigged partisan push poll or a trivial PR gimmick. It’s the tease on the website for YouGov, a highly successful, nonpartisan polling operation that is on the cutting edge of political polling.
This kind of polling is called internet opt-in polling because all the respondents in a survey take the initiative to participate, and all the surveys are done on the web.
Its huge database (about 100,000 people) is composed entirely of people like me who have decided, hey, why not?
“Neal,” said a recent e-mail (I had opted in a few days before), “you’ve been selected for a special survey about the 2014 election … It’s very short.”
It went on to say the poll is being conducted for a national news program and offered me 500 points (to be redeemed as a gift) if I actually took the survey, which I did.
This is definitely not the conventional polling vibe. No one chose my name randomly in advance. No one interrupted my dinner with a phone call. Plus I got points toward a freebie, just like frequent flyer miles but without annoying blackout dates.
When I learned how to do surveys back in the days when the computer card-sorting machine in Jack Webb’s “Dragnet” was cutting edge, such self-selection meant sample bias, and sample bias was the poison pill that destroyed the integrity and objectivity of any survey.
Pollsters use probability sampling as their key weapon against self-selection and bias. The pollster, rather than some old guy like me with time on his hands, decides who will participate by using some kind of random selection.
But times have changed. It’s become much harder for conventional polls to get their randomly selected respondents to respond.
Furthermore, phone changes have complicated matters. Fewer people have landlines; it’s tricky to sample cellphone users, especially those who have no landlines; and cellphone users’ demographics differ from landline users’.
Typically what comes out of this mix is a small, biased, non-representative sample that is technically not self-selecting but in fact really is.
Polling companies use some pretty effective statistical methods and some elegant assumptions to adjust for this, but such manipulation of the data is still a far cry from the old gold standard of sampling.
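One of those statistical adjustments is post-stratification weighting: respondents from groups that answered less often are weighted up until the sample’s demographic mix matches the population’s. Here is a minimal sketch in Python; the age groups and numbers are hypothetical, invented only to illustrate the idea.

```python
def poststratify(sample_counts, population_shares):
    """Return one weight per group: population share / sample share."""
    n = sum(sample_counts.values())
    return {group: population_shares[group] / (sample_counts[group] / n)
            for group in sample_counts}

# Hypothetical example: young voters answered the phone far less often,
# so they make up 10% of respondents but 30% of the electorate.
sample = {"18-34": 100, "35-64": 500, "65+": 400}         # respondents
population = {"18-34": 0.30, "35-64": 0.45, "65+": 0.25}  # true shares

weights = poststratify(sample, population)
# Each 18-34 respondent now counts for 3 people; each 65+ respondent
# counts for less than one.
```

The elegance, and the risk, is in the assumption: weighting only works if the young voters who did respond resemble the ones who didn’t.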
YouGov of course has a big self-selection problem from the get-go because it does not even begin by generating a representative sample. It deals with this challenge in several ways.
First, YouGov assembles a huge panel of potential respondents.
Second, YouGov takes this huge panel and matches panelists to demographically similar respondents in a target sample drawn from the U.S. Census Bureau’s nationwide American Community Survey, which is indeed a rigorous probability sample.
So both old-school and new-school polling share the same gold standard, both encounter society-wide obstacles to achieving these standards, and both use elegant assumptions and statistical techniques to mitigate these obstacles as best as possible.
Yet the vibes and approaches accompanying the two kinds of polls are so different.
Conventional polls like those in Hawaii seem flawed yet solid and familiar, kind of like a longtime husband.
Opt-in polling, on the other hand, seems so, well, Facebook, kind of like the way that husband sees his daughter’s prom date.
Those differences plus the money and prestige involved get to the heart of the controversy between conventional pollsters and sites like YouGov.
When the New York Times decided to use YouGov in its election polling, the American Association for Public Opinion Research (AAPOR) accused the Times of disregarding polling standards and embracing “new approaches without an adequate understanding of the limits of these nascent methodologies.”
Nascent. Why can’t they be like we were, perfect in every way? What’s the matter with kids today?
But to the advocates of opt-in polling, it’s these old guard pollsters who use bad methods because these traditionalists continue to use approaches that simply do not work any more. These traditionalists are impeding progress by running on the same old hamster wheel.
There is good solid evidence that in fact the baby is doing just fine.
A recent study reported in Political Methodology, a highly respected journal not likely to be found anywhere you get your hair cut or your teeth cleaned, examined three identical national polls done at the same time using three different samples: a phone sample generated by traditional randomization methods; a mail survey, also based on a pre-designated representative sample; and an opt-in internet sample.
The results of the three samples were essentially the same. The opt-in poll did just as well as the others.
Other studies have found much the same thing.
So the New York Times decision is the right combination of innovation and prudence: use opt-in methods but also continue to use other forms of polling.
In that light, Hawaii’s polling needs to change in several ways.
First, there should be an internet opt-in poll for Hawaii, a mini-YouGov. Keep in mind that this kind of polling is cheap and quick. Let’s try it and see what happens.
Second, Hawaii’s pollsters have to do more experimentation and assessment. It’s an axiom of good research that the more ways you measure something, the more confident you can be in the results.
In that vein, the Political Methodology study is a gold standard.
But most of all, Hawaii’s conventional pollsters need to realize that society’s increasing reliance on social media and other new forms of communication has profoundly and permanently changed the rules.