On Monday we discussed the key elements that make for a great survey. Arguably one of the most feedback-driven industries on the planet is hospitality. Reviews on sites like TripAdvisor or Expedia can literally make or break a property, which is why surveys have become a common feedback tool for hotels. You would think that with the investments they are making in customer feedback programs, the surveys they design would be effective. As you’ll see, even the biggest brands seem to treat survey design as a foreign concept.
Asking Too Much
The most common mistake made when designing surveys is asking far too many questions. As UX researchers we always have more questions about people and how they use our products. Couple that with market research questions from marketing and product management, and before you know it you have a spreadsheet with 50 or 100 questions. Companies then eagerly deploy these comprehensive surveys that seek to learn everything about their customers. Unfortunately, this approach doesn’t work well when the customer is waiting in line on their phone and only has a couple of minutes to spend filling out the survey.
Several weeks ago I received a survey from a hotel I had stayed at recently. The survey contained around 30 questions covering all aspects of the hotel experience, from check-in to the loyalty program, and required stepping through a wizard with page after long page of marketing questions. Here’s what their mobile survey looks like:
Think this is a hotel specific problem? Think again! Here’s a survey I recently received from Marriott, another leader in the hospitality space:
Even Hilton is in the business of complex drawn out mobile surveys:
This phenomenon impacts more than just hotels! I recently attended ACM’s CHI 2018, a major conference in the field of Human-Computer Interaction. You would think a conference focused on methods of user research and user experience would put together a survey that was user friendly. However, that wasn’t the case:
This survey contained a whopping 52 questions covering everything from meal preferences to issues of diversity and gender equality at CHI conferences. Don’t get me wrong, those are all valid questions to consider within your community. But at some point you have to draw the line between what’s critical data for organizing next year’s conference and details that can be investigated in subsequent studies.
How we communicate with customers is another value many organizations espouse. It’s especially critical for user research, since participants can’t answer what they don’t understand. Equally important, they won’t participate in studies where the invitation is not clear.
This is the email I received from the ACM inviting me to complete its survey for the CHI conference:
I wonder how many people actually responded to the survey, given how poorly thought out the invitation was.
When I did manage to find the link to complete the survey I came across this gem:
This section of the survey asks the same question twice. The only difference is that the first time the question is posed, it wants me to consider it regardless of whether I use the services listed; the second iteration wants my personal experience using those services. Confusing enough for you? Initially I thought this section had a typo in asking the same question twice, and I erroneously answered the first question as though it were about my own use of the services listed.
The takeaway from this is clear: always review your work on multiple devices to ensure the message and content are clear and concise.
Taking Things for Granted
Another way surveys often fall flat on their face is by demanding a lot from participants, particularly when trust hasn’t been built with them. The most common violation of this principle is asking participants to publicly review your product or service on TripAdvisor or Google. What’s crazy about this ask is that you’re forcing customers to put their name on the line in a public review of your hotel just for the privilege of providing feedback in the form of a survey. Sadly, this practice is common amongst the hotel brands:
Demographic data is the research data most often sought by marketers who want to understand the audience they’re targeting. However, in light of recent privacy breaches (e.g., Cambridge Analytica), participants aren’t as comfortable providing personally identifiable information about themselves. Yet we still see surveys collecting that data:
Organizations eager to see a high completion rate on their survey often send the same survey request out over and over again, hoping that at some point you’ll respond.
Wrong Survey at the Wrong Time
A foundation of UX research is understanding people in the moment and context in which they experience a product or service. People are much better at recalling the nuances of an experience immediately after it happens. Yet surveys are often sent out days or even weeks later. For instance, Expedia contacted me three weeks after I stayed at a hotel to request a review:
A corollary to this principle is not to ask participants to recall behaviour that spans a long time frame. Clearly this hotel hasn’t received the memo:
Bad survey design is all around us, from excessively long surveys that should “only take 5 minutes to complete” to confusing questions that mislead participants and provide researchers with inaccurate or even contradictory data. Now that we’ve developed an understanding of the principles of good survey design and looked at some examples of particularly egregious surveys, it’s time to put this theory into action. On Friday we’ll design a survey system for ourselves, applying what we’ve learned so far.