Why We Need to Avoid Long Surveys


Battling Long Surveys

 

Writing short surveys is an uphill battle with many clients. Whenever word gets out that a survey will be conducted, everybody close to the subject, whether the product team, senior management, or operations, wants to add questions. The thinking is, “since we are doing a survey, let’s get as much as possible out of it.”

 

Unfortunately, the only thing you get out of very long surveys is bad-quality data. Why?

 

NON-RESPONSE & ABANDONMENT

 

As survey length increases, so do non-response bias and the abandonment rate. Simply put, respondents won’t spend long answering questions. Many won’t even start if they know the survey’s length (it is a best practice to announce the length of the survey in the invitation).

 

Survey Length and Response Rate

For those who think they can get away with it by not announcing how long the survey will be, think again. Respondents can often gauge the length from the progress bar, and even when no progress bar is shown, they will drop out mid-survey if they perceive it as too long. High abandonment and non-response rates hurt sample representativeness.

 

In an experiment by Galesic and Bosnjak (2003) designed to test this point, 3,472 respondents were divided into three groups and assigned online surveys of different announced lengths (10, 20, and 30 minutes). The chart above shows how the number of respondents who started and completed the survey declined as the survey length increased.
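One intuition for why completion falls with length: even a small, steady chance of abandoning at each minute compounds over a long survey. The sketch below illustrates that compounding with a made-up per-minute dropout rate; the numbers are purely hypothetical and are not the figures from the Galesic and Bosnjak study.

```python
def completion_rate(minutes, per_minute_dropout=0.03):
    """Expected share of starters who finish, assuming a constant,
    independent per-minute abandonment probability (an assumption
    for illustration, not a finding from the study)."""
    return (1 - per_minute_dropout) ** minutes

# The three announced lengths from the experiment, with illustrative dropout:
for length in (10, 20, 30):
    print(f"{length}-minute survey: {completion_rate(length):.0%} complete")
```

Under this toy model, tripling the survey length roughly halves the completion rate, which matches the general direction of the experiment's results even though the model itself is a simplification.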

 

DATA QUALITY

 

Respondents who are willing to endure a long survey are at high risk of experiencing heavy burden and becoming “satisficers.”

 

Satisficing occurs when respondents select answer options without giving them much thought. They settle for the least mental effort that satisfies the question’s requirements rather than working to find the answers that best represent their opinions. Respondents may start selecting the first choice in every question, straight-lining in grid questions (selecting the same answer across all rows), or simply picking random choices without much consideration. This type of behavior renders the data worthless.
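Straight-lining is one of the easier satisficing behaviors to screen for in collected data. As a minimal sketch (the function name and sample data are hypothetical, not from any real survey tool), a respondent who gives the identical rating to every row of a grid question can be flagged for review:

```python
def is_straightliner(grid_answers):
    """True if a respondent gave the same answer to every item in a grid
    question -- a common red flag for satisficing."""
    return len(set(grid_answers)) == 1

# Hypothetical grid responses (ratings for five items per respondent):
responses = {
    "r1": [4, 4, 4, 4, 4],   # identical rating everywhere -> suspect
    "r2": [5, 3, 4, 2, 4],   # varied ratings -> looks considered
}
flagged = [rid for rid, answers in responses.items() if is_straightliner(answers)]
print(flagged)  # -> ['r1']
```

In practice, a single straight-lined grid is not proof of satisficing (the respondent may genuinely feel the same about every item), so such checks are usually combined with other indicators such as response time.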

 

The same experiment by Galesic and Bosnjak also tested the impact of survey length on data quality, measured with a variety of indicators including response times, item response rate, length of answers to open-ended questions, and variability of answers to grid questions.

 

Of all the indicators, item response rate (defined as the percentage of questions answered out of all questions presented in a block) was the only one that seemed unaffected by survey length; however, it is unclear whether the survey was programmed to force respondents to answer before moving forward. For the other indicators, the results strongly suggest that survey length hurts quality.

 

Survey Length and Data Quality

 

There are powerful reasons that push clients and research vendors to launch long surveys: budget, time constraints, and competing agendas from internal groups, among others. However, when surveys start getting too long, clients and research vendors should take a minute to think about the implications. After all, if we get bad data, we have wasted the little time and money we had to begin with.

Comments

Jason Anderson Posted: May 6, 2011

In addition to the data quality impact of long surveys, there are the consequences of having an extremely large pile of data. Analysis takes longer, reports bloat, presentations drag out into unmanageable lengths. Just as the survey-taker loses interest in participating in the survey, clients lose interest in deciphering the results.

Michaela Mora Posted: May 6, 2011

Excellent point Jason!

Giles Posted: May 9, 2011

While I agree that long surveys are often bad news, using a study into online survey behaviour from 2003 to inform you about today’s respondent behaviour is a little risky. An awful lot has changed since then… Maybe you’d be better off analysing some of your own more recent projects and seeing if the results are similar? I’ve read a lot on this topic, and there are other more recent studies out there (forgive me, I don’t have the details to hand and don’t have time to find them right now) which indicate that you *can* get good data from long surveys, and have low drop-out rates, but *only* if the survey is well-designed, interesting and relevant. Which not many are.
I’d still prefer to keep any given survey under 15 minutes, but I can’t see many people buying into that anytime soon. So an emphasis on good survey design is just as important as keeping it short.

Michaela Mora Posted: May 11, 2011

Giles,
I totally agree with you about the importance of good survey design and its impact on the perception of survey length. For example, when we do Adaptive Choice-Based Conjoint (ACBC) studies, which take longer than traditional Choice-Based Conjoint (CBC) surveys, respondents find them more interesting due to task variety, and response rates are not affected too terribly. However, the basic tenet about the negative relationship between survey length and both response rate and data quality, as shown in the mentioned study, still holds. First, there is a limit to how long you can make a survey. Respondents get tired, are short on time, etc. Second, as you said yourself, there is a big “only if” condition related to good survey design, which I don’t see much of lately. Unfortunately, with the advent of so many cheap online survey tools, there are many inexperienced people writing long surveys who don’t know about survey design or are unaware of the impact of survey length.

Annie Pettit Posted: May 9, 2011

There are always excellent solutions for battling long surveys. Think about what you ‘want’ versus what you ‘need’ and get the ‘want’ somewhere else. Social media research is one great place to get the ‘want’ and you can even get way more than you had ever planned on.

Anand Gijare Posted: June 1, 2011

Hi,

I too agree with this point. My experience with long online surveys is that the survey keeps confirming the respondent’s information and the overview of his company or background; only after about 40% of the survey does it start routing to the “want” questions, and only after 60% completion does it reach the “need” and most-needed answers. By then, as Michaela said, most of the interest has already dropped and the answers one gets to the key questions are absurd. In such cases the crux of the information is lost or of lower quality. Most questionnaires have open-ended questions at the end, which saps the respondent’s remaining interest, and the whole purpose of the survey becomes meaningless.
What we do is ask the client which key questions he needs answered and design a short questionnaire around them, then add an annex with another set of questions, inviting the respondent to continue if interested. This provides all the necessary information, and the respondent can come back later, or continue right away, with the remaining answers.
