Last updated Aug 2020
Complete guide to UX research
Josh Zak,
Founding Partner at Turtle Design
For nearly a decade, Josh has designed world-class experiences for tech companies. A co-founder and managing director at Turtle, he uses UX strategies to design winning digital products.

How to Conduct UX Research Surveys & the Best Questions to Ask

Conducting user research is one of the most important things you can do when designing or improving a product or service. When designing a UX survey, there’s no way to be perfect – in my experience, there will always be learnings and ways to improve. That said, let’s dive into some pointers, tips, and tricks I’ve picked up along the way in my UX research career.

Keep reading to learn more about UX surveys and how to craft unbiased questions for them.

What is a UX research survey?

A UX research survey is a set of questions, sent to a targeted group of users, that probes their attitudes and preferences. Surveys can be a quick, easy, and inexpensive way to obtain data – but the quality of that data depends on the questions you ask. A poorly designed survey won’t yield valuable insights.

Crafting questions for your survey will depend on what you’re looking to achieve – there’s no magic list of questions that can be copied and pasted. To form good questions, you must first understand the customer’s pain points, and know how to phrase questions so they won’t bias your data. Here are a few tips to help you:

1. Start with the customer problem

Understanding your customer’s problem is an important first step before deciding which questions to ask. Sometimes there are so many problems, it’s hard to know where to start. Here’s an activity created by Julia Cowing, a Senior User Researcher at MailChimp, to help define customer problems and align on priorities.

2. Understand question types

When it’s time to form the questions for your customer survey, not all questions are created equal. To avoid biasing your data, consider the following types of questions.

Good question types:

  • Task-driven feedback questions (e.g., “Tell me about your experience using your current Banking App.”)

  • Open-ended questions about expectations or impressions (e.g., “What is your favorite feature?”)

  • Follow-up questions (e.g., “How would you rate your experience with the app?”)

Bad question types:

  • Yes or no questions. If they can be answered with a “yes” or “no,” they’re closed questions – meaning you can’t really probe for more information. These aren’t very helpful when you’re trying to dig deeper into a customer’s mindset.

  • Assumptive questions. Specifically, we mean questions that assume a positive or negative experience, like: “What did you hate most about this feature?”

  • Leading questions. Going back to bias, this is the kind of question that encourages a desired answer. Like, “If you enjoyed this product, should we create more like it?”

  • Funneling questions. These are general questions that drill down to a specific point – kind of like how detectives question a witness by asking for more and more detail about one specific thing.

Example questions

Let’s say a bank’s design team is looking to redesign the bank’s online and mobile app, where users access their banking information, check their bank balances, make payments on their cards, and so on. Before they start designing, they want to identify areas for improvement in the app. They start by asking customers what they think of the current bank app. Throughout the interview process, they want to get the most honest answers possible while avoiding false validation.

  • Example 1: "What do you like about the current Banking App?" This is a bad question because it assumes a positive experience. A better question would be: "Tell me about your experience using your current Banking App."

  • Example 2: "Was using the app for the first time easy?" This is bad because it's a yes or no question and assumes a positive experience. A better question would be: "What were your impressions of the onboarding experience within the app?"

  • Example 3: "Would you rate the usability of the app as good? Why or why not?" Again, this assumes the customer has had a positive experience. A good alternative would be: "How would you rate the usability of the app? Why?"

  • Example 4: "Do you use (x) feature?" This question invites a yes or no answer and funnels the user toward one specific feature. A better question to ask would be: "What features do you use the most on the app?"

  • Example 5: "Was this feature confusing?" This example assumes a negative experience. Try something more open-ended, like: "What does this feature mean to you?"

Best practices for conducting UX surveys & interviews

The questions you ask in a UX research survey depend on what you’re trying to discover. But there are some best practices you can follow that will maximize your chance of success.

1. Keep things short and simple

You want to make it as easy and uncomplicated as possible to fill out your survey, so be brief and clear. Only ask questions that you plan to analyze and that are necessary for what you’re trying to discover, and don’t include any irrelevant background information, unless respondents will need it to understand the question at hand.

2. Be clear

Avoid jargon, acronyms, and other language that could cause confusion among your audience. Also, pre-test your survey to make sure each question is asking about one thing only – it’s a common mistake to inadvertently ask about multiple features in the same question (a “double-barreled” question), which makes it hard for the respondent to answer accurately. You may want to show your questions to multiple test users or stakeholders before you send out the actual survey, to make sure everything is clear.

3. Check for bias

Bias can sneak into even the most carefully written survey. For example, you might ask, “How difficult is this product to use?” This phrasing subtly pushes the reader towards the idea of difficulty and is considered a leading question. Instead, you could ask, “Is this product easy or difficult to use?” and provide a balanced range of options to choose from.
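
Here’s a minimal sketch in Python of what that balanced range of options might look like – the question wording and option labels are hypothetical, not taken from any particular survey tool:

    # A balanced scale pairs each positive option with a negative one
    # around a neutral midpoint, so the answer choices themselves don't
    # push respondents in either direction.
    question = "Is this product easy or difficult to use?"
    options = [
        "Very easy",
        "Somewhat easy",
        "Neither easy nor difficult",
        "Somewhat difficult",
        "Very difficult",
    ]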

Here are just a few of the most common biases you should be aware of:

  • Confirmation bias: Basically, this is when you only ask questions that confirm your own hypothesis.

  • Framing effect: The way a question is framed – for example, in terms of gains versus losses – can influence responses.

  • Hindsight bias: AKA the tendency for people to think events that have occurred were more predictable than they actually were.

  • Serial position effect: People tend to favor things that are at the beginning or end of a list. Randomizing the order of answer options helps, as the sketch after this list shows.

  • Illusion of transparency: This refers to how people overestimate the extent to which they know what the other person is thinking.

  • Clustering illusion: This means finding patterns in random data when there aren’t any.

  • Implicit bias: This is a big one that’s hard to check. It refers to the unconscious associations people hold about different groups and their behavior.

  • Fundamental attribution error: This is a cognitive bias where people assume a person’s actions depend on what “kind” of person they are, rather than considering social and environmental factors.
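
One practical counter to the serial position effect is to randomize the order of answer options for each respondent – for unordered multiple-choice options, not for rating scales, where the order carries meaning. Here’s a minimal sketch in Python; the feature list is hypothetical:

    import random

    # Hypothetical options for a question like
    # "Which features of the app do you use?"
    features = [
        "Check balance",
        "Make payments",
        "Transfer funds",
        "View statements",
        "Contact support",
    ]

    def options_for_respondent(options):
        # Shuffle a copy so each respondent sees a different order,
        # spreading out any advantage of appearing first or last.
        shuffled = list(options)
        random.shuffle(shuffled)
        return shuffled

    print(options_for_respondent(features))

Many survey tools offer a built-in setting that randomizes option order in the same way.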

4. Structure the survey well

One common way to structure a survey is to ask the simplest questions first and work up to more complex questions near the end. Going back to bias, also make sure the questions early in your survey don’t influence the answers to questions later in the survey.

5. Provide incentives

Without financial incentives, you run the risk of having no-shows at your interview. But how do you provide incentives? How much and for whom? I recommend doing a few things:

  • Separate cohorts of users into tiers. You may have a tier for general users, where you’re just looking for demographic or behavioral information, and then more specific tiers for users with specialized knowledge in a particular area.

  • Provide financial incentives for each tier. Perhaps it’s more valuable for you to dig deeper into a specific cohort of users, in which case you can provide a greater financial incentive. Many companies will offer much higher incentives to secure specific user participants.

  • You can offer compensation in a variety of ways. Direct payments via PayPal or gift cards for places like Amazon are often the most popular forms.

6. Gather insights as a team

Miro provides our team with a great way to form user interview questions, then capture feedback from live interviews in real time. Try out this template to get started.

Learn which tools you'll need to conduct UX research

Wondering which tools to use to conduct your UX research? Check out the next chapter of our guide>>


What do remote UX teams love doing in Miro?

  • Creating affinity maps, personas, and customer journey maps

  • Brainstorming and collaborating on projects

  • Running remote design sprints

  • Sketching out or iterating prototypes

  • Documenting everything together

  • Presenting their work

Learn more about Miro’s free online whiteboard tool>>
