In 2009, the year following the iPhone’s release, eBay conducted $600 million in mobile transactions. Out of the gate, eBay’s elegant app minimized friction throughout the acts of buying and selling. Ten years on, market research still struggles to keep pace: why are links to 30-minute, even hour-plus, surveys still being emailed to potential respondents? The glacial pace of market research’s migration to mobile, as many have called it, is not for lack of trying. Players new and old across the research service arc have started entire companies, released new products, published parallel-test findings, and worked diligently to develop tools that make taking a survey on a smartphone as easy as booking a ride on Lyft. Many brand trackers have been redesigned and shortened without any change to the valuable trend data and normative measures marketers use to make decisions.
As we know from part 3 of our Marketing Data Integration series, there’s an abundance of data available to marketers, from first-party, to second-party, to third-party. We know we can collect it from a variety of sources and connect these data for a more holistic view of our target customers. What we haven’t discussed yet is the integrity and quality of the mass of data at our researching fingertips, or the role data validation plays in the GDPR world we now live and work in. Though they have been in force for only a few short months, the guidelines implemented by GDPR have had a significant impact on the digital advertising, consumer insights, and market research industries.
We’ve all been there. You spend endless hours perfecting a survey and imagining your ideal audience fallout. Might as well set some quotas to ensure that desired outcome, right? But now, as you take a step back, those quotas are starting to look a bit unrealistic and a little overwhelming. Take a deep breath, because I’m here to offer solutions to some common pitfalls in setting quotas.
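One way to spot an unrealistic quota before fielding is to compare each cell’s target against what the panel’s natural incidence would deliver. The sketch below is a minimal illustration of that sanity check; the cell names, incidence rates, and targets are all hypothetical, not figures from any real study.

```python
# Minimal sketch: sanity-checking quota targets against expected natural
# fallout. All incidence rates and targets below are hypothetical.

def check_quotas(total_completes, quotas):
    """Flag quota cells whose target exceeds what natural incidence
    would deliver, i.e. cells that will be slow or costly to fill.

    quotas: dict of cell -> (target_count, expected_incidence 0..1)
    """
    report = {}
    for cell, (target, incidence) in quotas.items():
        expected = total_completes * incidence  # completes at natural fallout
        report[cell] = {
            "target": target,
            "expected": expected,
            "feasible": target <= expected,
        }
    return report

quotas = {
    "age 18-24": (300, 0.12),   # hypothetical panel incidence
    "age 25-54": (500, 0.55),
    "age 55+":   (200, 0.33),
}

for cell, r in check_quotas(1000, quotas).items():
    status = "OK" if r["feasible"] else "UNREALISTIC"
    print(f"{cell}: target {r['target']} vs ~{r['expected']:.0f} expected -> {status}")
```

A cell flagged UNREALISTIC isn’t necessarily wrong, but it tells you up front that filling it will require extra screening, extra cost, or a revised target.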
Bold, creative, fast insights require not only high-quality respondents, but effective and efficient quality control methods. If you use untruthful data, you or your clients will ultimately be misguided in the business decisions based on those facts and figures. So, what can we do to avoid this and steer quality in the right direction?
Do I want to know? Yes - you need to know!
Today’s researcher is all about gathering data to create insights that provide guidance and strengthen the core business, future-proofing decision making for their company. In truth, that only covers the ‘need-to-know’ part; there is a large share of ‘nice-to-know’ insights as well, gathered for various reasons (e.g., to verify general assumptions or simply to satisfy the board, directors, managers and other influential people within organizations). Need-to-know and nice-to-know insights are both based on data.
Open-ended questions generally serve a specific purpose:
- What products/brand/advertisements do you recall?
- You said X, why did you say that?
- Would you purchase/recommend X? Why/why not?
These open-ended questions generate a wider spectrum of codes than the usual standard close-ended questions, especially when the codeframe is kept to a small list, a key aspect of any mobile-first survey. However, open-ended questions can also produce less impactful data (sometimes termed gibberish data) when stock-standard question wording is used, as illustrated above. The key to making an open-ended question valuable is to frame the wording to be thought-provoking, making it meaningful for respondents so they want to share their views fully, in an environment where this is easy to do.
Many clients include quality checks in surveys to make sure respondents are engaged and are answering honestly. However, many of these checks identify false positives, which often means valid, engaged respondents are thrown out of the sample. How can we reduce false positives?
In any context there are many questions with factual answers that are nonetheless difficult to answer honestly: “have you considered an affair?,” “how many vegetables do you eat?,” “how often do you go to the gym?” and “have you lied to your boss?” to name just a few.
When it comes to a question like “how much do you drink?” it can be hard enough to be honest with ourselves, let alone with a researcher! Fortunately, the anonymity and context of online research put it in a unique position to secure honest answers to sensitive questions; however, this is no easy feat. When we ask a question, there are many hurdles to overcome before we reach an honest answer.
Real-time insights and predictive analytics build better strategies and better business performance. As we rewrite the rules of marketing research, data has become the digital fuel that delivers genuine insights. However, as industry stakeholders, we must capture data that is insightful, not invasive.
This year’s CASRO Digital Conference concentrated on the collective knowledge of research in the digital space, with a focus on three key areas: implementing Mobile First, the panelist experience, and the emerging importance of video.