
Written by Blink Staff

The ingredients are simple: two people, a quiet place to sit and talk, and a video camera to record the session. Still, getting the most out of an interview requires careful planning and a thoughtful technique. Here I share some insights that guide my own approach to interviewing.
Interviewing is often preferred over other self-report measures, such as questionnaires, when the goal is to understand issues that are difficult for people to articulate in a concise response (e.g., “How do you define effective teamwork?”). In my experience, verbal responses are typically longer, more spontaneous, and include more specific examples than written ones. The flexibility of the give-and-take interview format allows the researcher to probe more deeply on issues that resonate and to follow up with additional questions to fill in missing details. Interviews are especially useful as a supplement to quantitative measures, such as preference ratings: The “whys” behind ratings are usually much more informative than the ratings themselves.
We typically incorporate some form of interviewing into all of our user studies. In usability studies, pre-session interviews allow us to gather background information about users to provide a frame of reference for interpreting study results, and post-session interviews provide a forum for participants to reflect on the system they just tested. In user research, contextual interviews pair one-on-one interviews with field observations to generate insights about how users function in their natural environments, and what goals, priorities, and perspectives they bring to key tasks.
There are some kinds of issues that interviews are not well suited to capturing. Specific usability issues, for example, are best documented with detailed observational data, because users are not always able to report on and interpret patterns in their own behavior – they may fail to notice errors or inefficiencies in their actions.
The researcher’s role in an interview goes beyond reading out a list of questions; a thoughtful interviewer actively shapes the conversation.
I think about data analysis as both a “top-down” and a “bottom-up” process. In the standard top-down approach, the study’s original research questions are the starting point. The researcher “interrogates” the interview data for evidence bearing on each key question, typically categorizing participant responses by question. In the bottom-up approach, the raw interview data serve as the starting point. The goal is not to interrogate the data, but rather to listen and receive the messages the data are sending. These messages (or themes) may map onto specific research questions, or they may reflect additional findings that “grow up between the cracks” of the study. Taking both approaches means that key research questions are prioritized while still making room for new questions and issues to emerge.
Good interview data pay off in final reports, where participant quotes and video clips give substance and personality to study findings. Through quotes and video, users speak directly to product teams, in their own voices, about who they are and what they want to see.
Learning from Strangers: The Art and Method of Qualitative Interview Studies by Robert S. Weiss
Discourse Analysis: Theory and Method by James Paul Gee