Survey Design: A Primer

May 21, 2018

Executives at a hotel chain facing slumping sales believed the problem was a lack of customer awareness of the brand. They invested millions in out-of-the-box promotions, yet revenue continued to drop. Desperate, they reached out to customers for feedback. Loyal customers responded that they had abandoned the chain after it eliminated perks it assumed guests weren't using, like free breakfast. The changes weren't significant enough to prompt vocal complaints to the hotel; only through surveys was this insight discovered. The chain reverted the changes, and within weeks bookings at its properties picked up.

 

As designers and UX practitioners, we often fly in the dark, relying on hunches about what people think of our product and how they'll use it. Robust surveys give our customers a rapid, inexpensive way to communicate this vital data. It's no surprise, then, that surveys are deployed everywhere, whether while we wait on hold with technical support or in an email after getting our car serviced at the dealership. Yet surveys often do more harm than good, yielding contradictory and unreliable data.

 

Long surveys, confusing or irrelevant questions, and simply being asked to complete yet another survey are common gripes today. When was the last time you completed a survey, let alone enjoyed doing one? While we have largely embraced surveys, little thought is put into their design. Effective survey design can be the difference between no meaningful insights and a strategic plan that addresses customers' key pain points.

 

Let's consider the key questions of survey design and the role each plays in a survey's effectiveness.

 

Why? 

 

Participants won't complete surveys unless there is something in it for them. Prizes are by far the most common incentive organizations use to convince people to complete surveys. Studies have shown that something as simple as a $100 Amazon gift card draw can have a huge impact on response rates. Sharing how customer feedback has improved your product or service is another effective way to prove you're serious about surveys.

 

Regardless of the approach you use, these incentives need to be communicated concisely to participants throughout the survey journey:

  • Inside the email inviting them to the survey

  • Between a set of survey questions if the survey must be long

  • At the end of the survey, to reinforce the value of the time they've spent so they'll answer surveys in the future

 

Remember, potential participants may have only seconds to review your email invitation while skimming their inbox!

 

What? 

 

Focus on the top 5 questions you want to ask customers. Although you may have dozens of things you want to know, you'll lose participants who balk at page after page of questions with no sense of how long the survey will take. One approach to consider is splitting the questions shown, so that some participants see one set of questions and other participants see another. So long as you have enough participants and a high enough response rate, you'll still find the answers you're looking for.
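As a rough sketch of how such a split might work (the question text and the numeric participant IDs below are hypothetical), you could assign question sets deterministically so each participant always sees the same questions, even when reopening the survey from a reminder:

    # A minimal sketch of splitting questions across participants.
    # Question text and the numeric participant_id scheme are hypothetical.
    CORE = ["How satisfied are you with the product overall?"]
    SET_A = ["How easy was it to get started?", "Which feature do you use most?"]
    SET_B = ["How responsive is our support team?", "Would you recommend us to a colleague?"]

    def questions_for(participant_id: int) -> list[str]:
        # Deterministic assignment: the same participant always gets the
        # same rotating set, on top of a shared core set.
        rotating = SET_A if participant_id % 2 == 0 else SET_B
        return CORE + rotating

    print(questions_for(42))  # CORE + SET_A
    print(questions_for(43))  # CORE + SET_B

Keeping a shared core set means every participant still answers the questions you care about most.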

 

Once you've determined the questions that need to be asked, pilot the survey with people in your organization, measuring how long it takes them to complete it. This will also give you an opportunity to spot confusing phrasing in the questions or the provided responses. Research has shown that only 10% of participants will complete a survey that takes 15 minutes or more. Hence, you should also honestly communicate the time you estimate participants will spend completing the survey.

 

Who? 

 

Identify the audience for your survey. Questions that make sense for one audience won't for another. Adobe Illustrator has several user groups, including graphic designers, industrial designers and UI designers. The needs of an industrial designer, who often creates highly technical illustrations, are quite different from those of a UI designer, who is more concerned with creating "pixel perfect" layouts for websites and applications. Similarly, people who have a close relationship with your organization or product will have different requirements than those who have just started using your product. The insights you're looking for from those groups will be quite different (e.g. how did they become a customer versus how do we keep them loyal?). New customers will also likely need stronger incentives to complete a survey, since you haven't yet established trust with them.

 

How? 

 

The survey design toolbox contains a variety of ways of eliciting answers from customers. The tool you'll use depends on the type of question you're looking to answer (e.g. what versus why?), how you would like to analyze the data and, most importantly, design constraints (some question types are easier to answer than others). Each tool has its own costs and benefits, which we'll now discuss.

 

Binary

 

Nothing is simpler than asking participants a yes/no question. They don't need to elaborate on why they've answered a certain way or assign a subjective rating. These questions are great when you need to determine the presence of some behaviour in a participant. However, they will never answer the why behind the behaviour.
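Analysis is correspondingly simple. As a minimal sketch (the question and responses below are invented), the share of participants reporting a behaviour is just a proportion:

    # Hypothetical yes/no responses to "Did you use the free breakfast?"
    responses = [True, False, True, True, False, True, True, False]

    # True counts as 1, so the sum is the number of "yes" answers.
    yes_rate = sum(responses) / len(responses)
    print(f"{yes_rate:.0%} of participants answered yes")  # 62%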

 

 

 

Multiple Choice

 

Presenting participants with a set of choices to choose from is slightly more complicated, particularly when the number of choices is large. Watch out for choices that overlap in some way, as participants may be confused about which one is most appropriate in their circumstances. This style makes sense when you have a small set of distinct possibilities and you want to find which hypothesis is most accurate. In other words, you are converting multiple hypotheses into one.

 

 

 

Check All that Apply

 

A variant of multiple choice is the question that prompts the participant to choose all options that apply to them. These questions are great when you have an exhaustive list of ideas or choices and you want to narrow that list down to something manageable. Typically, the result of this type of question is a top-5 list of the most commonly selected choices. If those top 5 results are too close, you can convert the question into a multiple choice question to force participants to select one option over another. In any event, you'll have a sense of participants' priorities on a given question.
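As a minimal sketch of that tally (the perk names below are invented for illustration), counting every selection and taking the most common entries produces the top-5 list:

    from collections import Counter

    # Hypothetical "check all that apply" data: each participant's selections.
    responses = [
        ["free breakfast", "late checkout", "wifi"],
        ["wifi", "parking"],
        ["free breakfast", "wifi"],
        ["late checkout", "wifi", "parking"],
    ]

    # Flatten all selections into one stream and count each choice.
    counts = Counter(choice for selections in responses for choice in selections)
    for choice, n in counts.most_common(5):
        print(f"{choice}: selected by {n} of {len(responses)} participants")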

 

 

 

Likert Scale

 

When you want to assess how strongly a participant agrees with a belief your organization holds, use a Likert scale. A Likert scale is similar to a multiple choice question in that the participant can only select one response. However, unlike multiple choice, the participant is shown a statement and asked how strongly they agree with it. The most common scales are 5-point (Strongly Disagree, Disagree, Neither Agree nor Disagree, Agree, Strongly Agree) and 7-point. However, as you'll see below, the scale can be fine-tuned to the needs of the survey.

 

To measure the results of this type of survey, you'll assign a score to each of the possible responses. For instance, on a 5-point scale you could have Strongly Disagree be a 0, Disagree be a 1, Neither Agree nor Disagree be a 2, and so on. After you've assigned a numeric value to each response, find the mean of the responses (i.e. add up all the numeric responses, then divide by the number of responses) and you'll be left with a number between 0 and 4. This mean value will give you a sense of the "average" person's sentiment towards a given statement.
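In code, this scoring reduces to a lookup and a mean. A minimal sketch, assuming the 5-point scale and the 0-4 scoring just described (the responses are invented):

    # 5-point Likert scoring, 0 = Strongly Disagree through 4 = Strongly Agree,
    # matching the scheme described above. The responses are hypothetical.
    SCORES = {
        "Strongly Disagree": 0,
        "Disagree": 1,
        "Neither Agree nor Disagree": 2,
        "Agree": 3,
        "Strongly Agree": 4,
    }

    responses = ["Agree", "Strongly Agree", "Neither Agree nor Disagree", "Agree"]

    # Map each response to its score, then average.
    mean_score = sum(SCORES[r] for r in responses) / len(responses)
    print(f"Mean sentiment: {mean_score:.2f} out of 4")  # 3.00

A mean alone can hide polarization, so it's worth also looking at the distribution of responses.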

 

 

 

Open Ended

 

Often neglected, and for good reason, open-ended questions allow you to get detailed feedback from participants on a particular topic that can't simply be answered via multiple choice. They can be broad, or focused on a specific topic (e.g. How do you use Illustrator to create a wireframe?). They are great when you want to understand the why behind a user's behaviour, or when you don't yet know what the answers could be.

 

On the other hand, open-ended questions pose their own challenges. They generally take much longer for participants to answer, since they have to decide what to write and then type out phrases or sentences rather than merely selecting an option from a list. They are also less likely to be answered at all. Hence, these questions aren't well suited to smartphones with their cramped keyboards. Analyzing the results is another concern. Unlike the other question types, the result of an open-ended question is words, not numbers. A way around this is to assign responses to categories (called coding) and count how frequently each category is cited across responses. Some responses can also be valuable by themselves, particularly when they are compelling and uncover an insight no other data point has revealed.
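As a rough sketch of that counting step (in practice a human reviewer usually assigns the codes; the keyword lists and answers below are purely illustrative):

    from collections import Counter

    # Purely illustrative: a naive keyword pass standing in for manual coding.
    CODES = {
        "pricing": ["price", "cost", "expensive"],
        "support": ["support", "help", "agent"],
    }

    answers = [
        "The price is too high for what you get.",
        "Support took days to reply.",
        "Too expensive, and the help agent was rude.",
    ]

    # Count how many answers mention each category at least once.
    counts = Counter()
    for answer in answers:
        text = answer.lower()
        for code, keywords in CODES.items():
            if any(keyword in text for keyword in keywords):
                counts[code] += 1

    for code, n in counts.most_common():
        print(f"{code}: {n} of {len(answers)} responses")

Real coding is more nuanced than keyword matching, but the output, category frequencies, turns free-text answers back into numbers you can compare.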

 

 

When? 

 

Timing is everything when it comes to surveys. You've only got a limited number of opportunities to engage participants before they get annoyed. Studies have shown that participants are more likely to respond to a survey received at the beginning of the day and much less likely to respond during holidays. Similarly, if they've received a survey reminder on their iPhone, they likely won't have much time to complete the survey. Hence such a survey should be kept very short (e.g. 1-2 questions).

 

Where?

 

Where is the customer and what are they doing? If they're calling your technical support line and are on hold, ask them questions relating to the issue they're facing. It will also keep them engaged on the phone, distracting them from the time they're wasting waiting for service. Tools like customer journey maps and service blueprints can uncover additional opportunities to use tailored surveys to uncover customer insights.

 

Summary

Once you've thought through these questions carefully, you'll end up with dozens of short surveys that can provide a 360-degree view of your organization, allowing you to understand all your stakeholders and the touchpoints they'll encounter in their journey.

 

A company like Adobe might deploy dozens of surveys for Illustrator alone to cover the following types of contexts:

  • Expertise of User

    • Beginner

    • Novice

    • Intermediate

    • Advanced

    • Expert

  • Profession

    • Graphic Designer

    • Industrial Designer

    • UI Designer

  • Where Survey is Presented

    • On Smartphone

    • On Desktop

    • Tech Support

      • In Product

      • Phone

      • Online Chat

    • Inside Product

      • On First Use

      • When Feature is Used for First Time

      • When a Set of Features are Used Together

      • When Product is Being Uninstalled

      • At Beginning/End of Trial

 

In the coming days we'll deconstruct several surveys I’ve received in the past couple of weeks to illustrate the common mistakes even major companies make when designing surveys. Finally, we’ll use our process to design a better survey so that you can see this strategy put into action.
