Mixed Data Collection Modes – Round-Up

Summary: Mixed data collection modes are being used to reach low-incidence consumer groups. However, combining data collection modes introduces measurement error.

By Michaela Mora on June 6, 2011
Topics: Analysis Techniques, Market Research, Survey Design


One of the workshop sessions I attended on the first day of the 2011 Marketing Research Association (MRA) Annual Conference, in Washington, DC, was about mixed data collection modes, one of the hottest topics in our industry today.

Round I

Annie Petit from Research Now showed examples of how online surveys, text messaging, and social media research can be used in combination to get the best out of each. I totally agreed with Petit when she said there are no perfect methods; we have to use all the methods available to us and play to each one's strengths.

For example, surveys are great for frequencies and representative demographics; text messaging is good for capturing live behavior on the spot, checking what people are seeing, eating, or drinking right now; and “social media research is like doing a 10-hour survey,” which can give us tons of data, from thousands or even millions of people, on any one topic. We can use them all together. Examples:

  • Use social media research to identify the best options to put on a grid question (e.g. list of the most popular restaurants)
  • Share social media research findings with respondents as an incentive
  • Combine social media and survey with diaries via text messaging (SMS) to understand behavior
  • Invite people to an SMS survey from an online survey
  • Use social media research to do a deep-dive into the results from an online survey or an SMS survey

Round II

In the same spirit, Stephen Murrill, from Meta Research, presented a case study of the mixed-mode approach used for the California State Fair, which was interested in understanding:

  • Satisfaction drivers
  • Barriers to attendance
  • How to motivate families to maintain the tradition of coming to the Fair
  • What activities may attract teenagers

The Fair usually used exit surveys to measure satisfaction with the event, but now it needed to talk to teenagers and non-Fair-goers, for whom exit surveys were not an option. The solution was to combine on-site data collection using iPads to get feedback from Fair-goers, SMS surveys to reach teenagers who came to the Fair, and phone and online surveys to reach non-Fair-goers.

Round III

Sima Vasa from Paradigm Sample and Leslie Townsend, from Kinesis Survey Technologies (now part of Focus Vision), joined forces to present a case study where the primary data collection mode was mobile surveys. The problem came from a CPG client wanting to gain insights about consumers of convenience stores (C-stores), particularly Millennials (18–34). The goal was to create a consumer panel of C-store shoppers and keep them engaged in order to learn more about these consumers over time.

Since this age group often doesn’t participate in online surveys, the solution was to develop a customized mobile panel application, using the Kinesis platform, that users would download to their cell phones. This application made it possible to send surveys and capture this age group’s C-store experiences. It was optimized for many different devices and detected which device was being used, which allowed the surveys to be targeted appropriately: if a longer survey was sent, users could be advised to take it on a PC instead of on their cell phones.

Round IV

Finally, Rick Kelly from Opinionology/Survey Sampling talked about a study done for the Tour of Utah cycling race to understand the impact of advertising during the race on purchase intent.

Participants were given the chance to take part in the study using one of three modes: a mobile survey (SMS), an online survey, or an IVR survey. The mobile survey had 12 questions, while the other two modes had 20 questions. The survey was open for a whole month, achieving a 10% response rate.

Among the results presented, we learned that the SMS survey was the most common mode among those answering the survey at the event, who were also the younger respondents. Most of those who answered the survey online did so within 24 hours, and participants over 55 were overrepresented among IVR survey respondents. The results also showed an increase in purchase intent over the time the survey was open, for which there was no clear explanation.

Of the four presentations, this one left me wanting more, in order to understand how the different modes actually compared.

Key Takeaways

Although I didn’t learn anything totally new from this workshop, the speakers made a good case for using mixed data collection modes. However, we should first define what we need to know and whom we need to reach, and then decide which data collection modes are most appropriate for the research objectives.

Petit showed good examples of how we can incorporate social media research to support more traditional research methods: use it to refine survey questions, dig deeper into survey results, recruit respondents, or provide incentives.

Murrill showed a great example of how different modes can be used to reach different target markets, while Vasa and Townsend showed how new technologies can help us gain insights about hard-to-reach market segments.

What I missed was a discussion of the major issue with combining data collection modes: the measurement error introduced because different modes affect how people answer questions. Maybe next year.