Clarification of Invoke's survey methodologies, response rate

COMMENT | Associate professor Mohd Azizuddin Mohd Sani of Universiti Utara Malaysia offered a critique of our latest Invoke Centre for Policy Initiatives (I-CPI) survey, as reported in Malaysiakini on July 26, 2017. We would like to address his concerns about our response rate and the representativeness of our surveys.

Azizuddin cited the fact that many of those we contacted did not answer our survey questions as a key reason why the survey findings do not reflect the actual level of support for BN, Pakatan Harapan and PAS.

However, a low response rate does not necessarily make a survey inaccurate.

I-CPI would like to take this opportunity to share the nitty-gritty of conducting a poll, so that the public understands why we decided to publish three important pieces of information that no other poll conducted so far has disclosed.

For every survey finding that is made available to the public, I-CPI publishes the number of voters actually contacted, the number of voters who answered, and the number of voters who stayed on the line to answer all the questions.

For the latest survey that was disputed by Azizuddin, our proprietary computerised polling system called 2.5 million voters at random, of whom 160,761 actually picked up the call. Of those, 17,107 voters completed the survey, and their responses formed the data behind the finding that was shared with the public.

This gives a response rate (the proportion of those called who picked up the phone, rather than the call going straight to voicemail or going unanswered) of 6.54 percent, and a completion rate (the proportion of those who picked up the call who went on to answer all the questions) of 10.64 percent.
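As an illustration of how these two rates follow from the published figures, here is a minimal Python sketch. Note that the 2.5 million calls quoted above is a rounded figure, so the computed response rate lands near, rather than exactly on, the reported 6.54 percent.

```python
# Published I-CPI figures (the 2.5 million calls figure is rounded)
calls_made = 2_500_000   # voters dialled by the automated polling system
picked_up = 160_761      # voters who actually answered the call
completed = 17_107       # voters who stayed on the line for every question

# Response rate: share of calls that were answered at all
response_rate = picked_up / calls_made    # roughly 6.4% with the rounded total
# Completion rate: share of answered calls that finished the survey
completion_rate = completed / picked_up   # about 10.64%, matching the article

print(f"Response rate:   {response_rate:.2%}")
print(f"Completion rate: {completion_rate:.2%}")
```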

The reason we publish these population figures (the number of voters called, the number who responded, and the number who completed the survey) is so that the public can benchmark our findings against internationally recognised surveys by comparing response rates.

We hope other pollsters will emulate this practice in the future, in order to allow for comparability and reliability.

Therefore, for a poll that cites findings from 1,500 voters, the pollster would typically have to call between 100,000 and 150,000 randomly selected voters, given average response and completion rates (the completion rate is typically lower when the questions relate to voting preference). A rough calculation is sketched below.
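The following Python sketch shows that back-of-envelope calculation. The response and completion rates used here are hypothetical averages chosen only for illustration; actual rates vary by pollster and by question topic.

```python
def calls_needed(target_completes: int, response_rate: float, completion_rate: float) -> int:
    """Rough estimate of how many random calls a pollster must place
    to obtain a given number of completed interviews."""
    return round(target_completes / (response_rate * completion_rate))

# Hypothetical average rates, for illustration only
print(calls_needed(1_500, response_rate=0.10, completion_rate=0.15))   # ~100,000 calls
print(calls_needed(1_500, response_rate=0.065, completion_rate=0.15))  # ~154,000 calls
```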
