If you got a letter in the mail or a call on the phone from someone who asked whether you “favor or oppose receiving a chocolate cake,” there’s a good chance you’d say, “I’d favor it.” Why? Because chocolate cake tastes good.
The same goes for a caller who wanted to know whether you wanted to receive a sports car, a trip to Bermuda, or, say, the construction of the Mark Clark Expressway along a particular route.
But if you were told that the chocolate cake would cost you $50, would you still be in favor of getting it?
This is just the kind of logic used in the recently announced survey results regarding extension of the Mark Clark Expressway. A survey backed by the state Department of Transportation (SCDOT) asked a simple question but left out a key component — that the roadwork would cost hundreds of millions of taxpayer dollars.
So it’s not surprising that 72 percent of respondents in a “random sample” of 5,000 households from West Ashley and Ravenel to the islands — James, Johns, Kiawah, Seabrook and Wadmalaw — said they’d be for extension of the highway. The hypothetical question they were asked was loaded to favor a positive answer!
The folks at the University of South Carolina who conducted the survey for the SCDOT provided a 19-page report on the methodology, explaining why it is a good survey. But quite simply, it is flawed because it didn’t ask any substantive follow-up questions. It doesn’t take a rocket scientist to know that:
- The two-page survey opened with a color map and a detailed 15-line description of the project’s proposed route. Would it have been so hard to add one line saying that it is expected to cost $558 million? News stories about the proposed highway, for example, routinely mention the cost. Shouldn’t a survey do the same?
- There should have been more than one question to provide insight into the answer to the first one. Certainly there was space. More than likely, if people had been asked to respond to a question acknowledging that the road would actually cost them money, the number of positive responses would have plummeted like a duck shot from the sky.
- In the dark ages when I took graduate-level statistics, mail surveys generally were thought to be imperfect tools for measuring public opinion because a good response rate was considered to be 5 percent of surveys mailed. In the survey for the SCDOT, researchers got a 39.8 percent return rate, which was boosted to 44.2 percent after they followed up with phone calls to people who wouldn’t fill out the paper survey. That unusually high response rate alone should be a clue that the survey is suspect.
This survey falls into the category of documents that purport to apply science to a matter of public opinion. But its foundation isn’t worth the paper it’s printed on. Sure, the math works out, and the 19 pages of logic and results sound good at first glance. But when you check under the hood, there’s more than enough to worry about.
Is this a case of elected officials pressing for a simple survey to get the results they wanted? Why didn’t the survey ask enough questions to get to the root of how people really feel about the proposed extension?
We’ll probably never know. For now, be careful about assuming the results of this survey are an accurate snapshot of what people who live where the road may be built really think.