Sampling often seems to be an afterthought for clients: many simply state they want a ‘nationally representative sample.’ But what does the client mean by that? One client might think it means representation on age and gender only, while another might expect controls on additional variables such as region, income, and education.
A dozen years ago a debate raged in the marketing research community over the switch from probability sampling methods, such as telephone RDD, to nonprobability methods like those typical of online access panels. In the years since, most clients have moved to online samples, though some still cling to probability methods. Now, however, the quality of probability samples is itself being questioned because of low RDD response rates. In an interesting twist, the very techniques that nonprobability samples use to weight and model data must now often be applied to probability samples to account for nonresponse bias.
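To make the weighting idea concrete, here is a minimal sketch of post-stratification weighting on a single variable. The function name, cell labels, and population proportions are hypothetical illustrations, not any firm's actual method; real weighting schemes typically rake across several variables at once.

```python
from collections import Counter

def poststratify(respondents, population_props):
    """Weight each respondent's cell so sample shares match population shares.

    respondents: list of cell labels, one per respondent (e.g. age groups).
    population_props: dict mapping cell label -> known population proportion.
    Returns a dict mapping cell label -> weight for respondents in that cell.
    """
    n = len(respondents)
    sample_props = {cell: count / n for cell, count in Counter(respondents).items()}
    # weight = population share / sample share, so over-represented
    # cells are weighted down and under-represented cells weighted up
    return {cell: population_props[cell] / sample_props[cell]
            for cell in sample_props}

# Hypothetical sample skewed young, against a 50/50 population split:
weights = poststratify(["18-34"] * 70 + ["35+"] * 30,
                       {"18-34": 0.5, "35+": 0.5})
# Younger respondents get a weight below 1, older respondents above 1,
# and the weighted sample totals match the original sample size.
```

The same mechanics apply whether the sample is a nonprobability panel or an RDD sample with nonresponse: both adjust observed cell shares toward known population benchmarks.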
Research has consistently shown that not all panels are the same. Recruitment sources and management practices vary, and this can cause differences among panels. Beyond panels, there are other sources of online survey respondents, such as river, dynamic, and social media sources, and these can produce data that differ from one another as well as from panel data. Given the wide variety of sample sources, and their trade-offs in cost and quality, researchers often struggle with the question, “How can I blend in other sources without impacting my data?”
Many clients include quality checks in surveys to make sure respondents are engaged and answering honestly. However, many of these checks produce false positives, which often means valid, engaged respondents are thrown out of the sample. How can we reduce false positives?
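One common way to reduce false positives is to require multiple independent quality flags before removing anyone, rather than excluding on a single failed check. The sketch below assumes that approach; the flag names and the two-flag threshold are hypothetical, chosen only to illustrate the idea.

```python
def should_remove(flags, threshold=2):
    """Return True only when several quality flags fire together.

    flags: dict of flag name -> bool (True means the check fired).
    threshold: minimum number of fired flags that justifies removal.
    """
    return sum(flags.values()) >= threshold

# One failed trap question alone is not enough to exclude a respondent:
keep = should_remove({"speeder": False, "straightliner": False,
                      "failed_trap": True})
# Speeding plus straightlining together crosses the threshold:
remove = should_remove({"speeder": True, "straightliner": True,
                        "failed_trap": False})
```

Because any single check misfires occasionally, demanding agreement between checks trades a little sensitivity for far fewer wrongly discarded, genuinely engaged respondents.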
Do you think like a respondent?
Poor-quality survey design leads to low completion rates, high dropout rates, speeding, suspicious behavior, panel attrition and higher sample costs. Ultimately, poor design can lead to bad business decisions. Mobile may finally force better survey design and better-written questions.
The Marketing Research Shared Interest Group (SIG) of the Cincinnati American Marketing Association meets monthly to discuss industry issues, growing trends, techniques and methodologies. During the February meeting, Brian Lamar from EMI Research Services led a great discussion across multiple industry topics. One common thread across all key points: clients.
Everyone hates data transitions, but sometimes they are necessary. In most of the world, marketing research has already made the transition from telephone or face-to-face interviewing to online. When these transitions happen, we typically see data differences, some of which can be measured, calibrated and explained, while in other cases we are less able to pinpoint the root cause.
There is always debate about what makes a great researcher. Many focus on more traditional research skills such as questionnaire design, data analysis, and presentation skills. However, I think what truly turns a good researcher into a great one is critical thinking.
On a monthly basis, the Cincinnati AMA Marketing Research Shared Interest Group meets to discuss industry issues, trends, techniques and methodologies. During the January 2015 meeting, we debated the group’s burning research questions.
According to change management theory (Burke, 2014), the nature of change can be either evolutionary or revolutionary. Evolutionary change consists of incremental changes and doesn’t necessarily alter the whole structure or system. Revolutionary change is more radical and is often described as a jolt to the entire system.