Interactive questions have been part of the researcher’s toolkit for some time, and over the years many lessons have been learned about how to use them to benefit research while avoiding the pitfalls that come with them. During our recent webinar, we received a number of great questions, which we’ve tackled below.
Is the respondent’s age a factor when using interactive questions? For example, is a drag and drop ranking too techy for some OR are they finding that it has made it simpler?
This is a great question and not one that we’ve studied in particular. Nor am I aware of any research that addresses it. At FocusVision, we do have a team of UX designers who research and thoroughly test the interactive questions we employ for ease of use across desktop, tablet, and smartphone devices. We make every attempt to develop question types that are user-friendly for all. Some interactive questions, e.g. the virtual magazine/page turner, are by their nature not going to be suitable for everyone in all cases, nor are they smartphone friendly.
Can you have too many interactive questions in a survey?
The trick here is to employ interactive questions in a way that doesn’t overly tax the respondent compared to the standard HTML-based question types. We want to minimize the amount of effort required to complete a survey task. In our work with sliders, a survey consisting entirely of sliders was less enjoyable for desktop respondents. Dragging a slider is slightly more difficult than a simple point-and-click task. A slider here or there would likely have been fine, but the extra effort adds up, and a survey made up solely of sliders was a bit much.
For your card sort option (a mobile-friendly alternative for a grid formatted question), since one attribute at a time is displayed, how does that impact interview length?
This question type does take somewhat longer to complete than a standard grid formatted question. But we also found that it reduces dropout rates, so we feel this question type has a net benefit. We also hope the extra time spent on the card sort means respondents are paying a little closer attention when considering each attribute, since they can now focus on them one at a time!
Are surveys automatically scripted for mobile and desktop or does that need to be specified from the beginning?
Our interactive questions have been designed to be mobile friendly and mobile compatible unless otherwise indicated. In fact, a key benefit of interactive questions is that we now have the technology and flexibility to custom design a question type that is compatible and user-friendly for smartphones, tablets, and desktop computers alike. Our surveys are capable of detecting which device the respondent is using and then automatically render the survey and the question type in an appropriate and user-friendly way for that device and screen size.
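To make the idea of device-aware rendering concrete, here is a minimal sketch of the kind of logic involved. The function name, breakpoint value, and layout labels are illustrative assumptions, not FocusVision’s actual implementation.

```javascript
// Hypothetical sketch: pick a question layout from the viewport width.
// The 768px breakpoint and the layout names are illustrative only.
function layoutForWidth(widthPx) {
  // Narrow screens get a one-attribute-at-a-time card sort;
  // wider screens can show the full grid.
  return widthPx < 768 ? "card-sort" : "grid";
}

// In a browser, this might be driven by the live viewport size:
// const layout = layoutForWidth(window.innerWidth);
```

In practice a survey platform would combine a check like this with responsive styling so the same question definition renders appropriately on each device.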
What’s the best color to use to avoid bias?
Avoid using colors as emotional signifiers. For example, don’t use red to indicate “stop” or something negative, or green to indicate something positive. That can introduce an unexpected bias, as people interpret colors in different ways. Keep colors uniform and balanced. Use colors to create light/dark contrast. For example, a light background and dark foreground can help scale labels stand out, or changing the state of a button to a darker color can indicate a selected item.
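The advice above — signal selection through contrast rather than hue — can be sketched as a tiny helper. The palette values and function name here are illustrative assumptions, not a prescribed standard.

```javascript
// Hypothetical sketch: a neutral palette where the selected state differs
// only in lightness, not hue, so no option reads as "good" or "bad".
const PALETTE = {
  base: "#e8e8e8",     // light neutral for unselected buttons
  selected: "#4a4a4a", // darker neutral marks the selected item
};

function buttonColor(isSelected) {
  return isSelected ? PALETTE.selected : PALETTE.base;
}
```

Keeping both states on the same neutral hue preserves the light/dark contrast cue without attaching emotional meaning to any particular answer.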
In one example, you show a button form with a checkbox symbol on it to denote a multi-select question. Is this question type available to me?
Yes, look for the “Button Multi-select” and “Button Single-select” option. These are pre-formed question types that you can choose from when building out your survey. Once you’ve added all your questions, don’t forget to check out our “Themes” section as well. There you will find pre-configured options for customizing the look, feel, and color of your buttons.
What was the most surprising finding from your research on interactive questions?
I sometimes see a tendency to overuse interactive questions to “jazz up” a survey and make it look fancy or visually stimulating. No doubt, interactive questions look great. But as I showed in the webinar, even the simplest of things can lead to unintended data bias or confusion among respondents. That was certainly the case in some of the initial development work we did with our interactive questions, and it took several rounds of testing to get them right. Not so much a ‘surprise’ per se, but an important thing to keep in mind when using interactive questions.