One of our clients recently came to us with an issue that we immediately diagnosed as a friction problem. A product had been conceived and built that should have been selling well, but uptake was slow. There was nothing functionally wrong with the product: it did what it claimed to do, and it filled an unmet need in the market. However, the product and its distribution had a lot of rough edges. It didn’t help that, like so many products today, it was also a service. Not only did the customer need to purchase and configure the product, but they also had to continually interface with the company and its distribution partners to “refill” it.
Complexities in the customer lifecycle, from buying and using to upgrading and maintaining the product, led to slow adoption.
First, the product telegraphed complexity. The front page touted features that didn’t scream “problem-solving.” It was unclear how the product could be purchased. The product copy and imagery were flat-out boring. But perhaps most damningly, it was unclear exactly how a customer would actually use the product.
We recommended that instead of attempting to attack these prima facie problems one by one, they take another step back and watch customers attempt to buy, use, upgrade, and maintain the product. At first, the client didn’t want to do “more research”; they’d already done lots of focus groups, quantitative surveys, market sizing, and the like. But I explained that what we were going to do was simply watch. The technical term for this kind of research is ethnography; the name comes from anthropology, where researchers study cultures through long-term, direct observation.
Anthropologists embed themselves with cultures for months or years, and they do not ask questions or interfere; their goal is not to pollute the research with their own framing. This can be hard for business people and marketers: they want to know the answer! However, when customers are asked direct questions, acquiescence bias too often distorts the results. Acquiescence bias is well known to salespeople, memorialized in the phrase “buyers are liars.” Human beings are nice, and they don’t want to make a researcher or a salesperson feel bad, so they tell that person what they think that person wants to hear. They will tell you how they should, or would like to, use or buy a product. They might not even know they aren’t presenting an accurate account of the issue.
I told the client that by going to their customers’ (and prospects’) offices, across a wide range of industries, and watching people do their jobs and attempt to purchase the product, the friction points would become clear. The skepticism was palpable: “not actionable,” “high risk,” “unclear what we’ll get out of this.” I countered that several million dollars of sales and renewals had already been lost to friction. We embarked on the research.
1) An ethnographic study uncovers friction points.
The most important thing we did in this project was to select the companies for observation. We conducted thirty two-hour observations over several weeks, split between 15 companies that had purchased the product and 15 that had not. Each group spanned the same five industries, and all of the companies had between 25 and 250 employees—the target segment for the product. We recruited the companies directly through LinkedIn, targeting three cities; in each city we had five industries, with one company per industry that had the product and one that hadn’t. The screener ensured that both the company and the user were good fits: we wanted companies with a lively culture, a clear fit for the product, and a primary research subject who was engaging and fairly outgoing. Yes, this biases the research a bit, but ethnographies don’t work with reticent introverts and dull companies.
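For concreteness, the recruiting design above works out to a simple grid. The sketch below (city and industry names are placeholders, not details from the actual study) shows how the 3 × 5 × 2 cross of cities, industries, and purchase status yields the thirty sessions:

```python
from itertools import product

# Illustrative recruiting grid: 3 cities x 5 industries x 2 groups
# (companies that own the product vs. companies that don't).
# All names are placeholders; the real cities and industries
# were not disclosed in the study.
cities = ["City A", "City B", "City C"]
industries = ["Industry 1", "Industry 2", "Industry 3",
              "Industry 4", "Industry 5"]
groups = ["owns product", "does not own"]

sessions = [
    {"city": c, "industry": i, "group": g}
    for c, i, g in product(cities, industries, groups)
]

print(len(sessions))      # 30 two-hour sessions
print(len(sessions) * 2)  # 60 total observation hours
```

Laying the design out this way makes the balance explicit: every city contributes one owner and one non-owner per industry, so no single market or vertical dominates the findings.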
In preparation for our visit, we asked that key users of the solution focus on the relevant task for the two hours we were there. That was the extent of our pre-visit meddling: we didn’t ask them to do anything else, to use any specific product, or to do any homework.
On the day of the visit, we arrived with a team of three people, no more: one client representative, a facilitator, and a videographer. Upon arrival, we simply asked the individual to start doing the task the product was meant to enable. Every once in a while, we’d ask a question. This went on for about an hour. We then asked the individuals who had the product to attempt a “refill,” and watched them; those who did not have the product, we asked to shop for it. Other than that, we didn’t interfere. This can be hard for the client: all they want to do is ask questions, but that isn’t the point. The point is to observe, in the customer’s “native habitat.” For the last 30 minutes, it was time for questions. Why did you do what you did? What did we miss? Can you show us something we didn’t see?
While at the company, we looked at plenty of other things, too. How were the supplies the product was meant to complement or replace stored? What applications did the customer and the various influencers use? How was the business laid out? What catalogs were on the desk?
Between sessions, we worked together to outline the sources of go-to-market friction we noticed. Because we had many breaks between visits, our picture of the sources of the friction became very detailed. “Did you notice that she couldn’t remember that URL?” “It took him ten minutes to figure out how to scan that contract.” “She couldn’t find any of those emails.” Etc. Whatever the product or service is, no detail is too small.
At this point, we had the “voice of the customer,” but we had more than that; we had lived in the customers’ shoes for 60 hours. There is a huge difference. Steve Jobs is often quoted as saying that no one knew they needed an iPod until they saw one. Whatever the exact words, Jobs was a careful observer of people; he built a product no one knew they needed by watching people live their lives.
2) Friction points are prioritized, categorized, and dissolved in agile sprints.
When we completed the research, we had a list of over 200 sources of friction, spanning everything from new customer acquisition to account retention (in this case, the ongoing replenishment of services required to operate the product). We first synthesized this list down to around 100 mutually exclusive items. We then grouped the items by theme:
- Learning / Research
- Purchase Mechanics
- Buyer / Influencer Mechanics
- Competitors for Time
- Logistics / Fulfillment
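The synthesis step above can be sketched in miniature. In this illustration (the friction notes are invented examples, not items from the actual study), each observation is tagged with one of the five themes and the items are then grouped and counted so the team can see where friction concentrates:

```python
from collections import defaultdict

# Invented example friction notes, each tagged with one of the
# five themes from the study. The real list had ~100 items.
raw_observations = [
    ("couldn't remember the replenishment URL", "Purchase Mechanics"),
    ("password reset needed on every visit", "Purchase Mechanics"),
    ("unclear who signs off on the order", "Buyer / Influencer Mechanics"),
    ("supplies stored far from the workstation", "Logistics / Fulfillment"),
    ("front page doesn't explain the use case", "Learning / Research"),
    ("the old manual process felt rewarding", "Competitors for Time"),
]

# Group notes under their theme.
by_theme = defaultdict(list)
for note, theme in raw_observations:
    by_theme[theme].append(note)

# Print themes ordered by item count, biggest first.
for theme, notes in sorted(by_theme.items(), key=lambda kv: -len(kv[1])):
    print(f"{theme}: {len(notes)} item(s)")
```

Even this toy version shows the value of the grouping: a ranked view of themes gives the agile team an obvious place to start its first sprint.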
At this point, we transitioned into the mode of an agile product team. We now had roughly 100 rough edges that needed sanding down. Instead of throwing out the go-to-market process, we started hacking away at the friction points in small sprints.
One of the most serious problems we identified was finding the web page for product replenishment and, once there, remembering credentials. Several individuals we observed searched for the site for almost a minute, then did a “forgot password” reset every time they arrived. Visiting the site visibly irritated them and was clearly an emotional deterrent to continued usage. We hit that problem first.
Another key observation was a clear emotional attachment to a “competing” product (not really a product, just a way of doing things). The client’s product forced the customer to abandon another process that they found rewarding. We needed to replace that sense of emotional reward. Harder, but doable. And so on.
Increased annual retention rate by 5 points.
Nothing changes overnight, but some of the first changes enacted, after prioritizing the friction points, led to startling results. Fixing the password and website memorability problem drove password resets down by over 50% and logins up by 10%, which in turn lifted the annual retention rate by around 5 points.
The emotional fix was harder to measure, but we heard good things from the sales force, and retention has steadily increased in the roughly one year since the research was completed. The best part is that the changes keep coming, as the go-to-market “agile product team” continues to work through the insights the research surfaced.
Within the company, this has led to a new way of thinking about the roles of marketing, sales, and product. Instead of working in silos, teams have started seeing things holistically through the eyes of the customer, asking “how would this feel?” rather than simply doing things to move a number.
Watch your customers (and prospects).
At MarketBridge, we talk about “outside kids” and “inside kids.” “Inside kids” are content to sit at their desks and play with data. Data is great; I am a data scientist. But there is no substitute for seeing things with your own eyes. Qualitative research is sometimes disparaged by arrogant direct marketers; I’ve heard plenty of those comments. And yet Google spends tens of thousands of hours watching people interact with a single screen. Do you think they might know something others don’t?