Almost everything has been disrupted in some way by this global pandemic, even research. One of the most common questions I hear these days is, "What about the data?" It's an extremely pertinent question. After all, bad data isn't going to get us very far. I've found the question roughly falls into three buckets: Response Rates, Data Quality, and Data Validity. Let's look at each in turn.
Not long after the WHO declared COVID-19 a global pandemic on March 11th, governing bodies started to implement various shelter-in-place orders, and at that point, we collectively started to wonder whether research participants – for surveys and qualitative studies – would still be willing to respond. Early indications showed that people were not only willing but eager to participate in research. This trend has continued as we approach the six-week mark.
We know that people are responding, which is excellent news. But what about Data Quality? If you have trending data, for example, a brand tracker, will the data be different? This is worth exploring in a few ways. The first is whether the data has changed due to differences in sample composition. Are there discernible differences in the demographics or psychographics of those responding to the research compared to previous waves? If so, investigate whether weighting will address the changes. If not, proceed with caution.
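As a rough illustration of that first check, the sketch below compares the share of each demographic category across two waves. The field name `"age"`, the wave data, and the function name are all hypothetical; a real tracker would pull these from its sample file.

```python
from collections import Counter

def composition_shift(prev_wave, curr_wave, field):
    """Compare the share of each category of `field` across two waves.
    Returns {category: (prev_share, curr_share, difference)}."""
    prev = Counter(r[field] for r in prev_wave)
    curr = Counter(r[field] for r in curr_wave)
    n_prev, n_curr = len(prev_wave), len(curr_wave)
    categories = set(prev) | set(curr)
    return {
        c: (prev[c] / n_prev,
            curr[c] / n_curr,
            curr[c] / n_curr - prev[c] / n_prev)
        for c in categories
    }

# Hypothetical waves of 100 respondents each; note the jump in 18-34s.
wave_1 = ([{"age": "18-34"}] * 30 + [{"age": "35-54"}] * 40
          + [{"age": "55+"}] * 30)
wave_2 = ([{"age": "18-34"}] * 45 + [{"age": "35-54"}] * 35
          + [{"age": "55+"}] * 20)

for cat, (p, c, d) in sorted(composition_shift(wave_1, wave_2, "age").items()):
    print(f"{cat}: {p:.0%} -> {c:.0%} ({d:+.0%})")
```

A shift like the +15-point jump above is the kind of difference that might still be correctable with weighting; larger or multi-dimensional shifts are where caution comes in.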
Next, it's important to continue with data quality checks to identify speeders, straightliners, and nonsensical responses to open ends. Anecdotally, I can say I've been impressed by the data quality on two short pulse surveys that I recently ran; most notably, the open-ended data has been rich with longer, thoughtful responses.
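The first two checks are mechanical enough to sketch in code. The snippet below flags respondents who finish well below the median completion time (speeders) or give the same answer to every item in a rating grid (straightliners). The field names (`id`, `seconds`, `grid`) and the 0.4 speed ratio are assumptions for illustration, not a standard; open-end review still needs human eyes.

```python
from statistics import median

def flag_low_quality(responses, speed_ratio=0.4):
    """Flag speeders and straightliners in a list of responses.
    Each response is a dict with hypothetical keys:
      'id'      - respondent identifier
      'seconds' - completion time
      'grid'    - list of answers to a rating grid
    Returns {id: {'speeder': bool, 'straightliner': bool}}."""
    med = median(r["seconds"] for r in responses)
    flags = {}
    for r in responses:
        is_speeder = r["seconds"] < speed_ratio * med
        grid = r["grid"]
        is_straightliner = len(grid) > 1 and len(set(grid)) == 1
        if is_speeder or is_straightliner:
            flags[r["id"]] = {"speeder": is_speeder,
                              "straightliner": is_straightliner}
    return flags

# Hypothetical pulse-survey responses.
sample = [
    {"id": "r1", "seconds": 300, "grid": [4, 2, 5, 3]},  # clean
    {"id": "r2", "seconds": 95,  "grid": [3, 3, 3, 3]},  # speeder + straightliner
    {"id": "r3", "seconds": 280, "grid": [5, 5, 5, 5]},  # straightliner
]
print(flag_low_quality(sample))
```

Flagged cases would then be reviewed alongside their open-ended answers before any removal decision.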
Lastly, this brings us to understanding whether the data is different because times are different. Most people are feeling, or beginning to feel, the strains of shelter-in-place. There are also real health and financial challenges facing large groups of the population. In that case, while the data may be different, the quality is intact.
This leads us to questions of Data Validity. If people are responding and the data is sound but different, is this data valid? Absolutely. These data changes are reflecting real changes in your participants’ lives and offer valuable insights at a time when there is no playbook.
Perhaps a better question to ask is, how long does the data remain fresh? How long will our findings from this month, or even this week, hold true? Like many questions, the answer is: it depends, mainly on what you are asking and whether it is something that often fluctuates. It is fair to say that yes, people's attitudes and opinions will continue to shift throughout the coming weeks and months. It's for this reason that I firmly believe (and am seeing some clients make this shift) that quick-turn pulse surveys are a great way to keep abreast of what's going on with your customers, allowing you to adapt and iterate as the landscape changes.