UXR Do's and Don'ts

Questions like “Do you understand X?” and “Is this clear to you?” drive me crazy!

 

We don’t ask “Do you understand X?”

We don’t ask “Is X clear to you?”

 

First, these are both leading questions: the wording suggests a particular answer and pressures participants to respond in a certain way, producing an inaccurate picture of their true thoughts or feelings. In UXR, avoiding leading questions is essential to gathering unbiased, uninfluenced feedback.

Second, your participant’s interpretation of X may differ from your intention, so even if they respond “yes,” you still don’t have a clear reading. Why? Because neither question can ascertain whether a participant understands something “as intended.”

To gather feedback on “understanding/interpretation”, consider this instead:

 

Multiple options to gather feedback

 

Comprehension questions

These are questions that help reveal whether participants have a clear understanding of the intended function or purpose. They could include:

  • "What does X do?"

  • "How would you describe X to a friend or colleague?"


Verification tasks

These are tasks participants perform that relate to X. You can observe whether they complete the task correctly, which indicates understanding.

Learning whether there is a gap between participants’ interpretation and the intent, and what that gap is, is KEY for the team to discover which aspects are clear, which are not, AND WHY. For example:

  • Is X unclear because it’s not communicated with a simple description?

  • Is X unclear because the name is unfamiliar/misleading?

  • Is X unclear because it isn’t listed where the participant expected to see it?

  • Is X not accessible, available or prominently featured (enough)?

  • Does the X name or description not align with their existing mental model?

  • Something else?

Words, names, and messages matter, especially in UX.

NOTE: To verify is “to establish the truth, accuracy, or reality” whereas to validate is “to recognize, establish, or illustrate the worthiness.” Verification and validation are different!


Other ways to ask open questions

User research focused on literal content is one way to evaluate the success of our written materials.

 
 

Here are other ways to gather input on comprehension, clarity, preference, naming, and/or navigation.

I am going to describe a [concept/feature/function] to you. Please tell me in your own words…

  • What, if anything, does it remind you of

  • The first thing that comes to mind is

  • If you’ve seen or heard of anything similar and if so, what

  • What you might call this

  • Whether anything resonates with you or not

 

Quality questions

  • Are asked at the right time

  • Are presented in a neutral manner

  • Use simple, natural language (no jargon or acronyms)

  • Take your research approach and data collection method into primary consideration

  • Relate directly to your research goals and objectives

  • Take “open and closed” question approaches into consideration

  • Are actionable, approachable, and easily understood by your audience

 

I’m going to share the name of a [concept/feature/function] with you. Please tell me in your own words…

  • What you think it does

  • The benefits it might offer

  • How it might work

  • How it’s similar or different to _____

I’m going to share a [blurb/message/name] with you.

  • What do you think it is trying to [communicate/convey]?

  • How do you interpret it?

  • How do you think it is related to ____, if at all?

  • How closely does this compare with your expectations?


Here are some more question "no-nos" when striving to create an environment for participants to respond with unbiased, uninfluenced feedback.

 

We don’t ask “Is this something you expected to see?”

We don’t ask “How willing would you be to purchase on the site?”

We don’t ask “Has your image of X changed based on what you saw?”

 

Upcoming Events

 
 
  • February 15th: "Democratizing User Research: Risks & realities from the front lines" webinar sponsored by HeyMarvin. Register here. 10 - 11 AM PST.

  • February 15th: "Edit UXR Video Like A Pro" workshop led by Miles Hunter (Ask Like A Pro Alumni Event open to the public). Register on Eventbrite. 2 - 4 PM PST.

  • February 20th: "UX Research for Product Messages" event led by Jen Havice (Ask Like A Pro Alumni Event open to the public). Register on Eventbrite. 4 - 5:30 PM PST.

  • February 21st: "Is Ask Like A Pro Right for Me?" Q&A with Alumni (Improve your UX Research Skills with "Ask Like A Pro"). Register on Eventbrite. 5 - 6:30 PM PST.


Speak up, get involved, and share the love!


And that’s a wrap!

We try to alternate between a theme and UX/UXR jobs, events, classes, articles, and other happenings every few weeks. Thank you for all of the feedback. Feedback is a gift, and we continue to receive very actionable input on how to make Fuel Your Curiosity more meaningful to you.

What do you think? We're constantly iterating and would love to hear your input.

Stay curious,

- Michele and the Curiosity Tank team

PS: We're only offering two public Ask Like A Pro cohorts this year, and the next cohort starts February 26th! If you’re ready to register, click here to grab your spot! Need a little more info? Join our Live Info Session with Alumni on February 21st.


