Mixed Data Collection Modes – Round-Up

One of the workshop sessions I attended on the first day of the 2011 Marketing Research Association (MRA) Annual Conference in Washington, DC, was about mixed data collection modes, one of the hottest topics in our industry today.

ROUND I

Annie Pettit from Research Now showed examples of how online surveys, text messaging, and social media research can be used in combination to get the best out of each. I totally agreed with Pettit when she said there are no perfect methods; we have to use all the methods that are available and get the best out of each.

For example, surveys are great for frequencies and representative demographics; text messaging is good for on-the-spot, live behavior, checking what people are seeing, eating, or drinking right now; and “social media research is like doing a 10-hour survey,” which can give us tons of data from thousands or even millions of people on any one topic. We can use them all together. Examples:

  • Use social media research to identify the best options to include in a grid question (e.g., a list of the most popular restaurants)
  • Use sharing of social media research findings with respondents as an incentive
  • Combine social media research and surveys with diaries via text messaging (SMS) to understand behavior
  • Invite people to an SMS survey from an online survey
  • Use social media research to do a deep dive into the results of an online or SMS survey

ROUND II

In the same spirit, Stephen Murrill from Meta Research presented a case study of the mixed-mode approach used for the California State Fair, which was interested in understanding:

  • Satisfaction drivers
  • Barriers to attendance
  • How to motivate families to maintain the tradition of coming to the Fair
  • What activities may attract teenagers

The Fair usually used exit surveys to measure satisfaction with the event, but now it needed to talk to teenagers and non-Fair-goers, for whom exit surveys were not an option. The solution was to combine on-site data collection using iPads to get feedback from Fair-goers, SMS surveys to reach teenagers who came to the Fair, and phone and online surveys to reach non-Fair-goers.

ROUND III

Sima Vasa from Paradigm Sample and Leslie Townsend from Kinesis Survey Technologies (now part of FocusVision) joined forces to present a case study where the primary data collection mode was mobile surveys. The problem was presented by a CPG client wanting to gain insights about consumers at convenience stores (C-stores), particularly Millennials (18–34). The goal was to create a consumer panel of C-store shoppers and keep them engaged in order to learn more about these consumers over time.

Since this age group often doesn’t participate in online surveys, the solution was to develop a customized mobile panel application, built on the Kinesis platform, that users would download to their cell phones. The application made it possible to send surveys and capture the C-store experience of this age group. It was optimized for many different devices and detected which device was being used, which allowed the surveys to be targeted appropriately: if a longer survey was sent, users could be advised to take it on a PC instead of on their cell phones.
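
The Kinesis platform’s actual implementation isn’t public, so purely as an illustration of the device-aware routing idea described above, here is a minimal Python sketch. The user-agent keywords, the 15-question cutoff, and the function names are all assumptions made up for this example.

    # Hypothetical sketch of device-aware survey routing; not the Kinesis API.
    MAX_MOBILE_QUESTIONS = 15  # assumed cutoff for a comfortable mobile survey

    def is_mobile(user_agent: str) -> bool:
        """Rough device detection based on user-agent keywords (assumed list)."""
        ua = user_agent.lower()
        return any(token in ua for token in ("iphone", "android", "ipad", "mobile"))

    def route_survey(user_agent: str, question_count: int) -> str:
        """Serve the survey, or advise switching to a PC for long surveys."""
        if is_mobile(user_agent) and question_count > MAX_MOBILE_QUESTIONS:
            # Long surveys are easier on a larger screen, so advise switching.
            return "advise_pc"
        return "serve_survey"

    # Example: a 20-question survey requested from an iPhone triggers the advice.
    print(route_survey("Mozilla/5.0 (iPhone; CPU iPhone OS 14_0)", 20))  # advise_pc

The appeal of this kind of check is that the same survey invitation can go to everyone, with the experience adapted per device at the moment the survey is opened.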

ROUND IV

Finally, Rick Kelly from Opinionology/Survey Sampling talked about a study done for the Tour of Utah cycling race to understand the impact of advertising during the race on purchase intent. Participants were given the chance to take the survey in one of three modes: a mobile (SMS) survey, an online survey, or an IVR survey. The mobile survey had 12 questions, while the other two modes had 20 questions. The survey was open for a whole month and achieved a 10% response rate. Among the results presented, we learned that SMS was the most common mode among those answering the survey at the event, who were also the younger respondents. Most of those who answered the survey online did so within 24 hours, and participants over 55 were overrepresented among IVR respondents. The results also showed an increase in purchase intent over the time the survey was open, for which there was no clear explanation. Of the four presentations, this one left me wanting more in order to understand how the different modes actually compared.

KEY TAKEAWAYS

Although I didn’t learn anything totally new from this workshop, the speakers made a good case in favor of using mixed data collection modes. However, we should first define what we need to know and whom we need to reach, and then decide which data collection modes are most appropriate for the research objectives.

Pettit showed good examples of how we can incorporate social media research to support more traditional research methods: use it to refine survey questions, dig deeper into survey results, recruit respondents, or provide incentives.

Murrill showed a great example of how different modes can be used to reach different target markets, while Vasa and Townsend showed how new technologies can help us gain insights about hard-to-reach market segments.

What I missed was a discussion of the major issue that combining different data collection modes has, namely, the measurement error introduced by the modes themselves, since the mode has an impact on how people answer questions. Maybe next year.
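
To make that concern concrete, here is a toy simulation in Python. All of the numbers (the true agreement rate, the 10-point answering bias for phone, the sample sizes, the 50/50 mode mix) are invented purely for illustration; the point is simply that when the mode itself shifts answers, estimates differ across modes even though respondents hold the same opinion, and any pooled figure depends on the mode mix.

    # Toy simulation of a mode effect on survey estimates; all parameters assumed.
    import random

    random.seed(42)

    TRUE_AGREE_RATE = 0.50                        # assumed true share who agree
    MODE_BIAS = {"online": 0.00, "phone": 0.10}   # assumed answering bias by mode

    def simulated_agree_share(mode: str, n: int) -> float:
        """Observed share who 'agree' in a mode: true rate plus mode bias."""
        p = TRUE_AGREE_RATE + MODE_BIAS[mode]
        return sum(random.random() < p for _ in range(n)) / n

    online = simulated_agree_share("online", 1000)
    phone = simulated_agree_share("phone", 1000)
    pooled = 0.5 * online + 0.5 * phone           # depends on the assumed mode mix

    print(f"online: {online:.2f}  phone: {phone:.2f}  pooled: {pooled:.2f}")
    # The two modes disagree even though the underlying opinion is identical.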

Comments

Steve Marks Posted: June 7, 2011

Michaela…you hit the proverbial nail on the head. Getting all the information on ways to utilize mixed-mode approaches is wonderful, as are really interesting case studies. The crux of the matter (at least to MR practitioners) is how you can (or cannot) synthesize data collected in various modes. Glad to see the MRA is making strides to look into issues like this, but it needs to look at how to make sense of mixed-mode survey data.
