You have nowhere to go. I am here to protect you.
Forty-six years ago, George Lucas directed his first film. No, not Star Wars, but a futuristic, dystopian thriller titled THX-1138. There is a scene in which a chrome-faced robot pursues the protagonist, declaring: “You have nowhere to go. I am here to protect you.” The high-tension scene that follows (The White Void) was playing in my head as I observed my first structured usability test.
We used an outsourced usability testing company that prepared a cold, white, sterile test lab that looked like a raised-floor server room, maybe left over from previous tenants. There was an observation gallery with a one-way mirror, complete with stadium seating, which gave everyone in the audience an unobstructed view of the execution … ‘er, I mean user test.
The user test subjects were nervous; they always are when you bring them into a lab environment. Not nervous as in ‘fearing for my safety’, but nervous as in ‘I am being evaluated’… imagine test anxiety. Not ideal if you want to really know what they are thinking as you walk them through a proctored test sequence.
If you’ve never seen THX-1138, I highly recommend it to gain an appreciation of how much science fiction has changed from early modern examples to, say, Rogue One. Similarly, over my twenty years of user testing and observation, the practice has changed from a science-like endeavor, conducted with labs and one-way mirrors or closed-circuit TV, to something more akin to a sales and marketing exercise. For many UX practitioners, dedicated user test labs have been replaced with on-line tools for continuous user tracking, video conferencing and on-line surveys for broad coverage of the user base.
Tools like Mixpanel or SurveyMonkey provide insight into what end-users are doing within your app or service. These tools may even help gauge overall customer satisfaction, but that’s not the critical part of the story. A well-executed user study will drive impactful user experience design (UXD), leading to beautiful things that engage your customers and create competitive advantage.
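To make concrete what these analytics tools do, here is a minimal sketch of event tracking and a drop-off funnel. The `EventLog` class, event names and users are made-up illustrations, not any real product’s instrumentation or Mixpanel’s actual API:

```python
class EventLog:
    """Toy stand-in for an analytics service; illustrative assumptions only."""

    def __init__(self):
        self.events = []

    def track(self, user_id, event, properties=None):
        # Record what the user did; real tools also attach timestamps,
        # device info, session IDs and so on.
        self.events.append({"user": user_id, "event": event,
                            "props": properties or {}})

    def funnel(self, steps):
        # Count how many distinct users reached each step.
        counts = []
        for step in steps:
            users = {e["user"] for e in self.events if e["event"] == step}
            counts.append(len(users))
        return counts


log = EventLog()
log.track("u1", "open_app")
log.track("u1", "start_signup")
log.track("u1", "finish_signup")
log.track("u2", "open_app")
log.track("u2", "start_signup")  # u2 drops out before finishing

print(log.funnel(["open_app", "start_signup", "finish_signup"]))  # → [2, 2, 1]
```

A funnel like this tells you *where* users drop off, but never *why* — which is exactly the gap a well-executed user study fills.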
It’s The User, Stupid
I had some time between client projects and spent some of it reading blogs on all things UX. There is a lot of emphasis on user interface componentry and animation, especially on tools and techniques. Some of it is neat stuff, like this, that and the other thing as representative examples. Other UX writing … makes me shake my head.
As a reminder: the first word in UX is U-S-E-R (/ˈjuː.zɚ/).
We are designing for, and testing against, a product’s or service’s unique audience. Empirical discovery is the only effective way to get to know your audience, and to characterize the people who will use your product or service, you need to go out and meet them … in the field. There are no users underneath those Post-It® notes on your team’s whiteboard.
What Happens In The Lab, Stays In The Lab
My most memorable moments as a UX practitioner happened in the field, while either interviewing customers pre-design or observing user test subjects, post prototype or implementation.
Here are some examples of what I’ve observed in the field that don’t happen in lab environments:
- Subject opens a box, glances at the quick start guide, laughs and throws the placard on the floor.
- Subject calls a co-worker and asks: “…do you know how this works or how to set this thing up?”
- Proctor is bitten by a test subject’s dog during the interview (yeah, this happened to me).
- Subject’s small child participates in the user test (see photo below; most stressful test I’ve ever experienced).
- Subject glares at you with a furrowed brow and tells you “…this fucking system makes me feel stupid …”
I tell my clients that usability is like a ledger: there are credits and debits. Good experiences, especially unexpectedly positive ones (e.g. Easter eggs), add to the credit column. Bad experiences, like something crashing or a chaotic visual layout, accumulate in the debit column. When the test subject is operating in the red, so to speak, they will tend to blame the product or service for elements that don’t work as expected or are confusing. Conversely, when operating in the black, test subjects tend to assign blame to themselves when the app behaves in an unexpected manner – Stockholm syndrome, if you will. Too far into the red and the test is effectively over – and so is your app.
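The ledger metaphor can be sketched as a running balance. The event weights and the abandonment threshold below are invented for illustration; real sessions are not this tidy:

```python
class UsabilityLedger:
    """Toy model of the credit/debit metaphor; weights and threshold are assumptions."""

    def __init__(self, abandon_threshold=-3):
        self.balance = 0
        self.abandon_threshold = abandon_threshold

    def credit(self, amount=1):
        # Good experience, e.g. a delightful Easter egg.
        self.balance += amount

    def debit(self, amount=1):
        # Bad experience, e.g. a crash or chaotic layout.
        self.balance -= amount

    def blames_self(self):
        # In the black, users blame themselves for surprises;
        # in the red, they blame the product.
        return self.balance >= 0

    def test_is_over(self):
        # Too far into the red and the session is effectively lost.
        return self.balance <= self.abandon_threshold


session = UsabilityLedger()
session.credit()   # pleasant onboarding
session.debit(2)   # confusing layout
print(session.blames_self())  # → False (balance is -1: in the red)
```

The point of the model is the asymmetry: the same unexpected behavior gets attributed to the user or to the product depending on which side of zero the balance sits.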
Here’s the important point: in a lab setting, “…this fucking system…” will translate into: “I must have done something wrong” or “let me try something different”. Both are meaningful bits of feedback, but they leave observers with the impression that the test subject is still in the black when, in fact, they are deep in the red.
By the way, I thought about redacting the “…fucking…” part from that last anecdote but, to be honest, when the test subject blurted out that line during a user study, I was so caught off-guard that I considered ending the interview. This guy was that unhappy, and I struggled to bring the interview back on-track. Finally, I abandoned the outline and just let him rant; I wish I had that one on video.
On a more recent project, designing and user testing a social media app, I thought I was being clever with an in-house environment we put together for user testing. The idea was to bring subjects into a conference room with a large video panel, open wireless network, relaxed lighting and quiet surroundings.
Then, after establishing a rapport with the interview subject, start leading them through a sequence of pseudo-tasks and see how things unfold with the app being tested.
Technically, it was OK, as we had the user broadcast their phone screen onto the video panel behind them: AirPlay for iOS devices, Miracast for Android. We used a standard video camera to capture the phone interactions along with the subject’s voice and body language. And it was fun for the team and the subjects.
But, it was still a lab environment – not their home, workplace, coffee shop, whatever. After a couple of these, I started wondering about the feedback we were collecting. One of our test subjects was recruited through a visual designer on the team – friends and family, right?
So, after that person went through the test process, I asked the designer to circle back and find out what the test subject really thought of what she’d just experienced. Her candid feedback wasn’t night-and-day different, but it marked a transition from black to red.
Six Tips from my Experiences in the Field
Here’s a short list of things to consider before embarking on your own user observations in the field.
1. Observe Subjects One-on-One
Avoid bringing a posse to user tests. Bring one assistant to help with the video camera and to observe and take notes. If the organization that is sponsoring the test is new to this sort of thing, several people will want to attend (“I just want to watch, I won’t say a word”). Stakeholders will see something they don’t like and start speaking directly to the interview subject. Soon, there is little time spent observing the test subjects using the app or product and the session mutates into a marketing focus group. I’ve seen this happen and the “feedback” is of little value.
As a corollary, and as any marketing or ad executive will tell you, avoid group interviews or focus groups. Looking at my raw notes from a pre-observation group interview, conducted over the phone with a large group of end-users, I noticed that the agitated user test subject from my anecdote (“…this fucking system…”) was self-described as an expert in using the current system. Not surprisingly, this same individual was dominant in the group conversation.
2. Learn From Other People’s Experiences
One test subject was fun to observe; she was one of the most proficient users of an app/system observed over a two-week period. She was very fluid with her computer, effortlessly switching between applications. She had a background in paralegal work, which was obvious from watching her work. It’s important to ask about and take note of the background of test subjects. It gives context to help understand what you are observing. I sometimes refer to this in my notes as whether a subject has appropriate DNA for the task.
Similarly, if an interview subject wants to go off-course and show you something, let them. I’m often surprised during user tests or interviews when a subject tells or shows me something that had never occurred to me – this is when eureka design moments happen.
3. Record Your Thoughts, ASAP
Make sure to write up your raw notes as soon as possible after the interview and/or observation, even if you recorded the session with video. There is no substitute for putting your inner thoughts or observations down on paper before they are lost to time or smeared by a follow-on observation.
4. Be a Good Observer
Don’t try to analyze at the same moment you are listening to commentary from test subjects – really LISTEN and focus on understanding what the subject may be trying to tell you that isn’t coming out clearly (I have a blog post on this topic alone: 5 TRAITS SHARED BY EFFECTIVE UX DESIGNERS & SALES PEOPLE).
5. User Testing ≠ Transplanting Kidneys
Don't be dogmatic when user testing. Keep yourself and the interview subject relaxed and having a good time – you’re not performing life-saving surgery, so don’t take any of it too seriously. If a test subject gets stuck, just help them along. Make note of where things went poorly and move on.
6. You Don’t Work For Free; Your Interview Subjects Shouldn’t Either
Pay the subject, somehow. Remember that you are asking an interview or test subject to take time out from their busy day to try to use your app/system/product and provide meaningful feedback. I have found that a $25.00 Visa gift card is enough to get test subjects to endure just about anything – even me whining about their dog nipping at my leg.
I once conducted a user study that included interviewing health care practitioners who, themselves, interviewed others as part of their job. Their role was to follow up with patients who had undergone medical treatment of some kind to make sure the patient was following their health care directives and to assess their progress.
The breakthrough for me in that project was listening to commentary from those Registered Nurses on how important it was to keep the conversation friendly, natural, supportive, smooth and on-point. Their interviews are conducted over the phone, not in-person; nonetheless, every RN told me that patients could sense when the interviewer was ‘reading from a script’ or simply following a checklist. When that happens, patients often become nervous and hesitant; they stop proactively offering additional information and answer tersely with a short yes or no. That realization broke ground for the application’s main success scenario in the design work that followed.
Those RNs have a tough job and I am forever indebted to them for teaching me more about UX interviewing and observing in those two weeks than in the years that came before.
Now, grab your video camera, a tripod, a clipboard and …
If you enjoyed the post, please click the thumbs up icon and let me know!
Ken Krutsch is Managing Principal of KRUTSCH Associates, a digital strategy and design firm, specializing in product vision and realization, including customer research, roadmap development and full-stack user experience design.