August 3, 2022
One of the bigger issues that we, as a supplier, have been focused on over the past 18 months is respondent engagement. For any of you who have participated in online research webinars or had the pleasure of attending in-person events over the past year and a bit, you have likely seen our series on data degradation as it relates to respondent engagement.
The genesis of this research on research came from the occasional, yet somewhat inevitable, conversations I was having with clients about trend data. Tracking data, in some instances, was simply unreliable, or notably different from wave to wave. One pattern that stood out in many of these conversations over the years was the placement of the question, well, in question. The issues tended to involve data collected later in the survey. Almost inevitably so.
It is something that has pulled at me for a long time. It was clear that data collected later in a survey was less accurate, perhaps even less valuable, than data collected earlier. Why? That was the question. Of course, one could increase sample size and rotate questions, or simply stack priorities earlier. I get that, but it doesn't answer the fundamental question of what is going on.
So, we finally decided to tackle it. And what we have found so far has been tremendous. Our initial process has been simple: respondent engagement is the measurement. Essentially, we measure the time a respondent puts into answering a particular question type at various points in the survey. No, it's not perfect, but it is compelling enough to lead us somewhere.
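To make the idea concrete, here is a minimal sketch of that kind of measurement: time spent per question, grouped by where the question sits in the survey. The data, bucket boundaries, and field names below are all invented for illustration; this is not the author's actual instrument or dataset.

```python
# Hypothetical sketch: engagement measured as time spent per question,
# bucketed by how far into the survey the question appears.
# All numbers here are made up for illustration only.
from statistics import median

# Each record: (minutes into the survey, seconds spent on that question)
responses = [
    (1, 42), (2, 38), (4, 35), (6, 19), (8, 14), (11, 9), (14, 7),
]

def bucket(minutes_in: int) -> str:
    """Crude position buckets: early (under 5 min), mid (5-10), late (10+)."""
    if minutes_in < 5:
        return "early"
    if minutes_in < 10:
        return "mid"
    return "late"

# Group the per-question times by survey position
by_position: dict[str, list[int]] = {}
for minutes_in, seconds in responses:
    by_position.setdefault(bucket(minutes_in), []).append(seconds)

# Median effort per position; with data like this, effort falls off
# sharply after the early bucket
for position in ("early", "mid", "late"):
    print(position, median(by_position[position]))
```

Even on toy numbers, the shape of the finding shows up: the median seconds-per-question collapses once you leave the first few minutes.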
Spoiler alert: for the first three to five minutes, the respondent is completely engaged, answering everything from open ends to rank sorts with effort and thought. Yes, that means there is drop-off later, and it is significant. Respondent engagement falls sharply after the three-to-five-minute range and keeps burning down after that point. Almost a coasting effect.
And no, it’s not familiarity bias. This isn’t exactly a state secret, but panelists do a lot of surveys and are part of multiple panels. They probably just finished a survey right before the one they are now doing for you. They are already familiar with rank sorts and open ends. From where I sit in the research chain, it’s engagement that matters.
What does this all mean to you, the researcher (or maybe the research aficionado)? Well, for one, please do consider stacking your most important data points earlier in the survey. Second, stop with the unnecessary screening questions, and please, please stop with warmups. I can't count how many surveys we see where the first two to three minutes are taken up with unnecessary screening, 'warm up,' or 'gotcha' questions, wasting prime engagement time.
Third, consider removing open ends from later in the survey. Our data shows respondents are simply not putting any effort into them, and anecdotally, we feel that asking open ends later in a survey even lessens engagement for the questions around them.
Our end game here is simple. We are working with academia (stay tuned) to construct some form of data degradation factor that you can apply to your research, and we are working with our clients to help them understand this phenomenon and how it affects their data. We aren't dumping on long surveys; you hear enough about that. What we are saying is that there is a difference in data collected at various points of the survey due to engagement, and here is the factor you can apply to your analysis. We hope to get there soon!
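The factor itself is still being developed, but as a purely hypothetical illustration, applying one could be as simple as down-weighting responses collected later in the survey when aggregating a score. The weights below are invented; they are not the factor described above.

```python
# Purely hypothetical illustration of applying a "degradation factor":
# responses collected later in the survey get less weight in an average.
# The weights and data are made up, not a real calibrated factor.

def degradation_factor(minutes_in: float) -> float:
    """Invented weights: full credit early, reduced weight later."""
    if minutes_in <= 5:
        return 1.0
    if minutes_in <= 10:
        return 0.8
    return 0.6

# Each record: (minutes into the survey, rating on a 1-10 scale)
ratings = [(2, 8), (7, 6), (12, 4)]

# Weighted mean that discounts later, lower-engagement answers
weights = [degradation_factor(m) for m, _ in ratings]
weighted_mean = sum(w * r for w, (_, r) in zip(weights, ratings)) / sum(weights)
print(round(weighted_mean, 2))
```

Here the unweighted mean would be 6.0, while the weighted mean comes out slightly higher because the late, low-effort rating counts for less.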
For more, find the data degradation blog at questmindshare.com/blog and of course, I’m happy to chat and answer any questions as we continue down our road to understanding respondent engagement.
Disclaimer
The views, opinions, data, and methodologies expressed above are those of the contributor(s) and do not necessarily reflect or represent the official policies, positions, or beliefs of Greenbook.