Chris Acker, back on October 20, applied his knowledge and humor to a perplexing problem: how to conduct polls and get the results you very much want. Given how the election results compared to the polls, he turns out to have been precisely correct, which is why I’m temporarily moving this post back to the top of the heap. His observations were uncannily accurate as applied to the election and demonstrate that much of the media establishment was doing everything it could to suppress the Trump vote. Whether or not it worked, it is crystal clear the polling was outrageously bad in every respect.
Businesses, non-profits and political groups routinely seek information from target audiences. To this end they can use polling, surveys and focus groups to gather data. Results can be used objectively or to promote an agenda. Intentionally flawed methods will generate meaningless data that in turn can be used to lend an air of legitimacy to virtually any position.
In this piece I’ll outline how to manufacture both types of data. If you are not already skeptical of polls, you likely will be by the end of this article.
Corporations have every incentive to collect accurate data, especially in the area of product development. Poor research can lead to weak product demand and financial loss. When developing a consumer survey it is therefore essential to pose unbiased questions and to associate a cost with each feature or action described. This is true for opinion gathering in both the public and private sectors, more on which later.
In generic and abbreviated terms, here’s an example of a poor product survey:
“Omatron” is developing a new gizmo:
Would you like these features?
Sure! These features sound great. I’ll check “yes” for all, as each comes free.
A more meaningful survey will gauge what is important to potential buyers based on how much they would be willing to pay for each feature. Our improved questionnaire would look something like this:
“Omatron” is developing a new gizmo:
Would you consider purchasing these features at the costs below?
Results from this survey will be dramatically different from the first. Developers will be able to incorporate cost-effective features and offer a product that consumers will actually buy.
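The gap between the two surveys can be sketched with a toy simulation. Everything here is invented for illustration: the feature names, the per-feature costs, and the assumption that each respondent values a feature at some random dollar amount. It is not data from any real survey, just a sketch of why "free" approval rates are meaningless.

```python
import random

random.seed(42)

# Hypothetical features and their per-unit costs (invented numbers).
FEATURES = {"feature_a": 20.0, "feature_b": 50.0, "feature_c": 120.0}

def perceived_value():
    # Assumption for illustration: each respondent values a given
    # feature at a random amount between $0 and $80.
    return random.uniform(0, 80)

def run_survey(priced, n=1000):
    """Return the fraction of n simulated respondents answering 'yes' per feature."""
    results = {}
    for feature, cost in FEATURES.items():
        yes = 0
        for _ in range(n):
            if not priced:
                yes += 1                      # free feature: everyone says yes
            elif perceived_value() >= cost:
                yes += 1                      # priced: yes only if value covers cost
        results[feature] = yes / n
    return results

free = run_survey(priced=False)   # the first, "would you like it?" survey
paid = run_survey(priced=True)    # the improved, "would you pay for it?" survey
```

The free survey reports 100% approval for every feature, including the one priced beyond what anyone in this model would pay; only the priced survey separates features buyers would actually fund from features they merely like.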
Collecting objective, valid and meaningful data is challenging, time-consuming and expensive. It requires specialized skills which are gained by professionals through years of experience.
Now to the fun part – how to get the results you want to promote your favorite cause. An agenda poll, commonly referred to as a “push poll,” works for all – left wing/right wing, radicals/conservatives, fossil fuel haters/fossil fuel lovers. What makes it especially appealing is that you don’t need much, if any, experience to create one. Just follow these simple rules…
1. Ask questions you can’t disagree with:
Headline generated: Vast majority are in favor of clean air and water and against industrial pollution.
2. Begin questions with a false premise:
3. Begin questions with an alarming, false premise and no associated cost:
A more balanced question might ask:
Most energy-related push polls I’ve run across have come from the environmental segment. Headline results are picked up by the mainstream media with the objective of tilting general opinion away from fossil fuels and toward a “green revolution” of some sort. Not surprisingly, energy-industry research efforts have tended to be more balanced. Perhaps they feel more accountable to shareholders, regulators and the general public, not to mention litigious lawyers.
Not to be outdone, the “best” worthless surveys are generated by our political parties. I’m sure you’ve received them…
Your party is looking out for you and will make your life better by doing A, B, C. Despicable opponents want to inflict these awful things X, Y, Z.
Do you agree they are:
___ Terrible ___ Very Terrible ___ Extremely Terrible
Now send us $25, $50, $ Much More
So, there you have it in a nutshell. Good, meaningful data is difficult to extract. It relies on asking relevant, unbiased and well-crafted questions while clearly presenting associated costs. Agenda polls, on the other hand, are relatively easy to construct since they are intended to generate artificial results. This manufactured data, in turn, is used to validate predetermined answers and to bolster well-established positions.
Editor’s Note: I have done numerous citizen surveys over the years and everything I’ve learned supports Chris Acker’s observations. There is also an extreme danger of bias in who is selected, directly or indirectly, to answer polls. If you send out a survey asking, say, whether a community should have zoning, you run the risk of the following:
1. Advocates always being more motivated to answer.
2. Advocates being more accustomed to answering surveys (especially if they’re filled with jargon or are done online).
3. Advocates being self-selected as those who have time on their hands because they don’t work.
4. Advocates being second-home folks or retirees who no longer need work and want nothing to change.
5. Advocates being more knowledgeable of the ways of government.
I’ve found seniors (my age or older) are always inclined to dominate the respondents for all these reasons, and the result is seldom anything meaningful with regard to the opinion of the folks who have to make a living in the community. The entire concept of a survey or poll is inherently biased against blue-collar folks, who may be very knowledgeable about life but haven’t the time to engage in the theatrics of politics that consume nerds like me.
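The self-selection problem described above can be illustrated with a quick simulation. All the numbers here (the share of advocates in the community, the support level among everyone else, and the response rates) are invented assumptions, chosen only to show the mechanism: when one group answers surveys far more readily, the measured result drifts toward that group's opinion regardless of what the community actually thinks.

```python
import random

random.seed(7)

POP = 10_000

# Hypothetical community: 30% are zoning advocates (all in favor);
# among everyone else, only 25% favor zoning.
population = [
    ("advocate", True) if random.random() < 0.30
    else ("other", random.random() < 0.25)
    for _ in range(POP)
]

# Actual support across the whole community.
true_support = sum(favors for _, favors in population) / POP

# Assumed differential response rates: advocates return the survey
# 60% of the time, everyone else only 10% of the time.
RESPONSE_RATE = {"advocate": 0.60, "other": 0.10}
respondents = [
    (group, favors)
    for group, favors in population
    if random.random() < RESPONSE_RATE[group]
]

# What the survey "finds."
measured_support = sum(favors for _, favors in respondents) / len(respondents)
```

With these made-up inputs, true community support sits just under half, while the survey of self-selected respondents reports a large apparent majority in favor; nothing about the questions was biased, only who bothered to answer.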