Why We Need to Avoid Long Surveys


Long surveys are a plague. Writing short surveys is an uphill battle with many clients. Whenever there is a chance to do a survey, everybody close to the subject wants to add questions. The thought is, “since we are doing a survey, let’s get as much as possible out of it.”

Unfortunately, the only thing you get out of a very long survey is bad-quality data. Why?

Non-Response & Abandonment

As survey length increases, so do non-response bias and the abandonment rate. In other words, respondents won’t stay long answering questions. Many won’t even start if they know how long the survey is. At the same time, it is a best practice to announce the length of the survey in the invitation.

If you try to trick respondents by omitting how long the survey will be, think again. Respondents can always figure it out from the progress bar, and they will leave in the middle of the survey if they perceive it as too long (even if no progress bar is shown). In the end, high abandonment and non-response rates negatively affect sample representativeness.

In an experiment conducted by Galesic and Bosnjak (2003) to prove this point, 3,472 respondents were divided into three groups, each given an online survey of a different length (10, 20, or 30 minutes). The chart shows how the number of respondents who completed the survey declined as survey length increased.

Survey Length and Response Rate

Data Quality

While some respondents are willing to endure a long survey, they are at high risk of feeling overburdened and becoming “satisficers.”

Satisficing occurs when respondents select answer options without giving them much thought. They go for the least effortful mental activity that satisfies the question’s requirements. The mental work of finding optimal answers that best represent an opinion becomes exhausting after a few minutes.

Consequently, respondents may start selecting the first choice in every question, straight-lining in grid questions (selecting the same answer across all items), or simply picking random choices without much consideration. This type of behavior renders the data worthless.
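
These patterns are easy to screen for once the data comes back. Below is a minimal sketch (not from the article) of how one might flag straight-liners and speeders; the column names, the five-item grid, and the speed threshold are all hypothetical:

```python
# Illustrative sketch only: flagging possible satisficers in survey data.
# Column names, the 5-item grid, and the speed threshold are hypothetical.
import pandas as pd

GRID_COLS = ["q10_a", "q10_b", "q10_c", "q10_d", "q10_e"]  # one grid block

def flag_straightliners(responses: pd.DataFrame) -> pd.Series:
    """True for respondents who gave the same rating to every item in the grid."""
    return responses[GRID_COLS].nunique(axis=1) == 1

def flag_speeders(responses: pd.DataFrame, min_seconds: float = 120.0) -> pd.Series:
    """True for respondents who finished implausibly fast (threshold is arbitrary)."""
    return responses["duration_seconds"] < min_seconds

# Made-up example data
df = pd.DataFrame({
    "q10_a": [3, 5, 4], "q10_b": [3, 2, 4], "q10_c": [3, 4, 4],
    "q10_d": [3, 1, 4], "q10_e": [3, 5, 4],
    "duration_seconds": [90, 600, 480],
})
suspect = flag_straightliners(df) | flag_speeders(df)
print(df[suspect])  # respondents worth reviewing or excluding
```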

The same experiment by Galesic and Bosnjak was also set up to test the impact of survey length on data quality. The researchers used a variety of indicators to measure the impact, including response times, item response rate, length of answers to open-ended questions, and variability of answers to grid questions.

Of all the indicators, the only one that seemed unaffected by survey length was item response rate (defined as the percentage of questions answered out of all the questions presented in a block).

However, it is unclear whether the survey forced respondents to answer each question before moving forward. For the other indicators, the results strongly suggest that survey length affects quality.
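
For readers who want to compute this indicator on their own data, here is a minimal sketch, assuming a pandas DataFrame with hypothetical question columns where a missing value marks a skipped question:

```python
# Illustrative sketch only: item response rate per respondent for one block.
# Column names are hypothetical; NaN marks a skipped question.
import pandas as pd

BLOCK_COLS = ["q1", "q2", "q3", "q4"]  # questions presented in one block

def item_response_rate(responses: pd.DataFrame) -> pd.Series:
    """Share of the block's questions each respondent actually answered."""
    return responses[BLOCK_COLS].notna().sum(axis=1) / len(BLOCK_COLS)

df = pd.DataFrame({
    "q1": [4, 2, None], "q2": [5, None, None],
    "q3": [3, 3, 1],    "q4": [2, None, None],
})
print(item_response_rate(df))  # 1.00, 0.50, 0.25
```

Of course, if the survey software forces an answer before letting the respondent move on, this indicator sits at 100% for everyone and says nothing about quality, which is exactly the caveat above.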

Survey Length and Data Quality

Why Do We Write Long Surveys?

Powerful forces push clients, and with them research vendors, to launch long surveys: budgets, time constraints, and competing agendas from internal groups, among others.

However, when surveys start getting too long, clients and research vendors should take a minute to think about the implications. After all, if we get bad data, we have wasted the little time and money we started with.

Comments

Jason Anderson Posted: May 6, 2011

In addition to the data quality impact of long surveys, there are the consequences of having an extremely large pile of data. Analysis takes longer, reports bloat, presentations drag out into unmanageable lengths. Just as the survey-taker loses interest in participating in the survey, clients lose interest in deciphering the results.

Michaela Mora Posted: May 6, 2011

Excellent point Jason!

Giles Posted: May 9, 2011

While I agree that long surveys are often bad news, using a study into online survey behaviour from 2003 to inform you about today’s respondent behaviour is a little risky. An awful lot has changed since then… Maybe you’d be better off analysing some of your own more recent projects and seeing if the results are similar? I’ve read a lot on this topic, and there are other more recent studies out there (forgive me, I don’t have the details to hand and don’t have time to find them right now) which indicate that you *can* get good data from long surveys, and have low drop-out rates, but *only* if the survey is well-designed, interesting and relevant. Which not many are.
I’d still prefer to keep any given survey under 15 minutes, but I can’t see many people buying into that anytime soon. So an emphasis on good survey design is just as important as keeping it short.

Michaela Mora Posted: May 11, 2011

Giles,
I totally agree with you about the importance of good survey design and its impact on the perception of survey length. For example, when we do Adaptive Choice-Based Conjoint (ACBC) studies, which take longer than traditional Choice-Based Conjoint (CBC) surveys, respondents find them more interesting due to the task variety, and response rates are not affected too terribly. However, the basic tenet about the negative relationship between survey length and response rate and data quality, as shown in the mentioned study, still holds. First, there is a limit to how long you can make a survey. Respondents get tired, are short on time, etc. Second, as you said yourself, there is a big “only if” condition related to good survey design, which I don’t see much of lately. Unfortunately, with the advent of so many cheap online survey tools, there are many inexperienced people writing long surveys who don’t know about survey design or are unaware of the impact of survey length.

Annie Pettit Posted: May 9, 2011

There are always excellent solutions for battling long surveys. Think about what you ‘want’ versus what you ‘need’ and get the ‘want’ somewhere else. Social media research is one great place to get the ‘want’ and you can even get way more than you had ever planned on.

Anand Gijare Posted: June 1, 2011

Hi,

I too agree with the point. My experience with long online surveys is that they often spend the first part confirming the respondent’s information and an overview of his company or background; only after about 40% of the survey does the routing reach the “want” questions, and only after 60% completion does it get to the “need” and most important questions. By then, as Michaela said, most of the interest has already been lost, and the answers one gets to the key questions are absurd. In such cases the crux, the key information, is lost or of lower quality. Most questionnaires also place open-ended questions at the end, which eats up whatever interest the respondent has left, and the whole purpose of the survey becomes meaningless.
What we do is ask the client what key questions he needs answered and design a short questionnaire around them. We then add an annexure with another set of questions, inviting the respondent to go further if interested. This provides all the necessary information, and the respondent can come back later, or continue right away, with the remaining answers.

john Posted: June 2, 2014

Hi.
There is something I want to make sure of. In your opinion, can I run a survey at different times? In my case, I need to collect a sample of 200. I need to interview pedestrians to gather data about their feelings toward drivers and some of their demographic information. It’s hard to get 200 in one day, so can I collect responses on different days and combine the data? Is that valid?
