By Blink Staff
The ingredients are simple: two people, a quiet place to sit and talk, and a video camera to record the session. Still, getting the most out of an interview requires careful planning and a thoughtful technique. Here I share some insights that guide my own approach to interviewing.
Why a user interview?
Interviewing is often preferred over other self-report measures, such as questionnaires, when the goal is to understand issues that are difficult for people to articulate in a concise response (e.g., “How do you define effective teamwork?”). In my experience, verbal responses are typically longer, more spontaneous, and include more specific examples than written ones. The flexibility of the give-and-take user interview format allows the researcher to probe more deeply on issues that resonate and follow up with additional questions to fill in missing details. Interviews are especially useful as a supplement to quantitative measures, such as preference ratings: The “whys” behind ratings are usually much more informative than the ratings themselves.
We typically incorporate some form of interviewing into all of our user studies. In usability studies, pre-session interviews allow us to gather background information about users to provide a frame of reference for interpreting study results, and post-session interviews provide a forum for participants to reflect on the system they just tested. In user research, contextual interviews pair one-on-one interviews with field observations to generate insights about how users function in their natural environments, and what goals, priorities, and perspectives they bring to key tasks.
There are some kinds of issues that user interviews are not well suited to capturing. For example, specific usability issues are best documented with detailed observational data, as users are not always able to report on and interpret patterns in their own behavior – for example, they may fail to notice errors or inefficiencies in their actions.
The researcher’s role
The researcher’s role in an interview goes beyond reading out a list of questions. A thoughtful interviewer:
- Sets the stage: Participants feel more comfortable and provide better information when they understand the focus of the study and how the information they are providing will be used. The researcher should describe the basic goals of the interview, set expectations about what questions will be asked, promise confidentiality of responses, and establish the tone by building a comfortable rapport.
- Gathers all the necessary information: The interview is designed around a set of key research questions, and it is the researcher’s job to ensure that he or she gathers information about all the important topics – rephrasing, repeating questions, and probing further as necessary.
- Gets participants to express themselves fully: Some participants are more articulate and expressive than others, but the goal is always to help each participant find ways to communicate the important details of their experience as fully and clearly as possible. Probing for examples, adjectives, and feelings, and asking people to “walk me through” an experience are all good strategies for eliciting detailed accounts. Following up on non sequiturs, hesitations, and subtle facial expressions can also reveal unexpected, rich content.
- Follows, doesn’t lead: Well-designed user interview questions are requests for information (e.g., “Tell me about your experiences with X…”); they don’t lead people to particular responses or put words in their mouths. On the contrary, the researcher should be following the respondent’s lead, taking cues from their mood and language as a way to build trust and shared meaning. For example, I like to repeat back a person’s own words when I ask follow-up questions for clarification or elaboration – e.g., “You said you’re ‘not really a bells-and-whistles sort of guy.’ Can you say more about that?”
- Respects and empathizes with the participant’s perspective: The researcher aligns him- or herself with the participant and communicates genuine concern and interest in what the participant has to say, for better or worse, making the interview a “safe space” to broach sensitive topics, reveal “embarrassing” misconceptions, and speak freely about likes and dislikes. The researcher treats the participant as the expert: User interview questions are legitimate requests to be educated, not demands for information.
- Rolls with the punches: The researcher must balance the need to get key questions answered with the need to let people tell their stories in a natural, spontaneous way. A flexible interview protocol and a conversational style allow the researcher to move smoothly between questions as specific topics arise, to gently redirect off-topic conversation, and still ensure coverage of the important issues.
Identifying and reporting findings
I think about data analysis as both a “top-down” and a “bottom-up” process. In the standard top-down approach, the study’s original research questions are the starting point. The researcher “interrogates” the interview data for evidence bearing on each key question, typically categorizing participant responses by question. In the bottom-up approach, the raw interview data serve as the starting point. The goal is not to interrogate the data, but rather to listen and receive the messages the data are sending. These messages (or themes) may map onto specific research questions, or they may reflect additional findings that “grow up between the cracks” of the study. Taking both approaches means that key research questions are prioritized while still making room for new questions and issues to emerge.
Good interview data pay off in final reports, where participant quotes and video clips give substance and personality to study findings. Through quotes and video, users speak directly to product teams, in their own voices, about who they are and what they want to see.
Recommended reading:
- Learning from Strangers: The Art and Method of Qualitative Interview Studies by Robert S. Weiss
- Discourse Analysis: Theory and Method by James Paul Gee