How do you work with professional services, consulting, field engineers, etc., to make your product better? Do you just treat their inputs as yet another channel for feature requests, or do you engage them as an incredibly potent market-sensing capability?

Conversation Starter
I received an excellent and insightful question from one of my former students in DIT’s product management degree program (enrollment for the next cohort closes in a month). This student is now a VP of product, and kicked off a conversation with me about best practices for establishing a workflow for product managers to collaborate with professional services teams to improve the product. I’ve seen several companies try different ways to make this work, with one attribute consistent across all of the approaches – none of them were visibly expensive.
Two nights ago I was chatting with another colleague about how his team has been tasked with delivering a set of features, and not a solution to the underlying problem. As a result, he’s concerned about potential mis-investment of resources and the possibility of not genuinely solving the problem once the team is done with their tasks.
Combining the two conversations, I realized that there’s a common theme. When I look at how I’ve engaged with professional services folks, I found I’ve had success with a particular approach (which would also help my colleague).
First, let’s unpack a couple typical ways I’ve seen companies engage “the field” to get market data, and think through why a different approach could be better.

Just Ingest
One team I worked with managed their product creation process (discover, design, develop) within Atlassian’s Confluence (wiki) and JIRA (ticketing) systems. Product managers and owners would manage the backlog items as JIRA tickets. Bugs were submitted as JIRA tickets and triaged alongside feature requests. There was a place where anyone (deployment engineers, for example) could submit feature requests based on what they were seeing on-site with customers. Product managers would then “go fishing” within that pool of tickets looking for the next big idea. This process did not have a lot of visible overhead, but it suffered from a “throw it over the wall” dynamic, a lack of collaboration, and a well-established pattern (not just a risk) of good ideas lying fallow in the “pool,” waiting to be discovered, evaluated, and implemented.
From the product team’s perspective, going fishing was looking for needles in a haystack. The cognitive effort required to parse through low-value tickets and duplicates puts you in a mode where it is hard to apply critical thinking to any given idea. So in addition to the good ideas that were never discovered, many were touched but passed over.
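To make the “fishing in the pool” dynamic concrete, here is a minimal sketch of what that triage amounts to in practice. Everything in it is hypothetical – the ticket fields, the vote counts, and the `min_votes` threshold are invented for illustration, not a real JIRA schema – but it shows how any mechanical filter over a flat pool silently drops the quiet-but-good ideas.

```python
from collections import defaultdict

def fish_the_pool(tickets, min_votes=2):
    """Naively surface candidate ideas from a flat ticket pool:
    collapse near-duplicate summaries, then rank by total votes.
    Anything below min_votes is dropped -- which is exactly how
    good-but-quiet ideas lie fallow."""
    groups = defaultdict(list)
    for t in tickets:
        key = t["summary"].strip().lower()  # crude duplicate detection
        groups[key].append(t)
    candidates = []
    for key, dupes in groups.items():
        votes = sum(t.get("votes", 0) for t in dupes)
        if votes >= min_votes:
            candidates.append({"summary": key, "votes": votes, "dupes": len(dupes)})
    return sorted(candidates, key=lambda c: c["votes"], reverse=True)

# Hypothetical pool: duplicates inflate one idea, a low-vote idea vanishes.
pool = [
    {"summary": "Bulk export to CSV", "votes": 3},
    {"summary": "bulk export to csv ", "votes": 1},
    {"summary": "Dark mode", "votes": 1},  # below threshold: never surfaces
    {"summary": "SSO for admin console", "votes": 4},
]
print(fish_the_pool(pool))
```

The filter is doing data processing, not insight generation – it can tell you which tickets are loud, but not which problems matter.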
This is certainly better than “no information from the field,” but it emphasizes data and minimizes insight.

High Fidelity Connections
One team I worked with had a product owner who formerly worked as a field support engineer. This product owner reached out to her colleagues in the field regularly both socially (cultivating her network, and maintaining genuine connections with friends) and professionally – asking about trends, keeping her experience “current by proxy” as she realized her direct experience would grow stale with time.
This narrow-aperture channel was very high fidelity, but low in volume and limited in breadth of coverage.
Each idea that came in received thoughtful consideration, and the good ones informed product decisions. The weakness of this approach was lack of scale; it suffered from the danger of extrapolating “market sensing” from a narrow view of a subset of the market. Because this “just happened” within the way the product owner did her work, it appeared to accounting to be “free.” Many good ideas were missed, presumably, because they didn’t happen to come to the attention of this product owner’s network.
I put this in the bucket of good (and better than just ingesting), but still falling short of the objective of a product manager.

A product manager’s goal is to develop market insights, not collect market data.
- The first approach, while easy to institutionalize, had so much noise that you couldn’t find the signals.
- The second approach had a great (data) signal-to-noise ratio, but the signal was constrained by limited bandwidth, and only worked because of the product manager’s unique background, approach, and interpersonal skills.
There is another approach I’ve used with a few teams to effectively generate product insights from the real-world experience of professional services team members. The challenge with this approach is that the expense is visible – you’re pulling people out of the field and off their accounts for a day or two. On one sufficiently high-profile project, a workshop date was scheduled ~5 weeks in advance and people were “told” to come. They planned travel, managed customer commitments, etc. We booked a large room for two days and rolled up our sleeves. On another project, we opportunistically scheduled a half-day session the day after an all-hands quarterly meeting that brought everyone into the office anyway. The cost of the “extra day” was a lot lower than the cost of a standalone event.

I’ve run two types of workshops that were very effective for this. The first frames problems in a broader context, and the second explores alternatives and opportunities in a more targeted exercise. Ironically, the tighter targeting leverages divergent thinking as well as convergent thinking, while the broader framing is purely convergent.

The first workshop is a co-opted customer journey mapping exercise. I say co-opted because while I go through very many of the same steps, I am not attempting to improve the experience; I’m attempting to understand the nature of – and the relative importance of solving – the problems a customer faces through the course of doing what they do while interacting with our product. Without going into the specifics of running the workshop, the high level looks like the following:
- Start out with a straw-man of what you believe the customer’s journey looks like – a storyboard is a good tool for making a visceral, engaging touchpoint for each step in the journey. Review with the team and update the steps (add missing steps, re-order as appropriate, remove irrelevant and tag optional steps).
- [Might not be needed, but worked when I did it] Start with key personas identified, representing the customers for whom we are building the product. Workshop participants will capture their perspectives on the relative importance of problems from the point of view of those personas.
- Within each step, elicit from the field all of the problems a customer faces.
- Have the workshop participants prioritize the relative importance of each problem within each step (the 20/20 innovation game works great for this).
- Have the participants prioritize the relative importance of “improving any particular step” relative to improving any other step. (Fibonacci story-pointing works well for this)
- Record / take notes of the conversations – particularly the discussions where the participants are arguing about relative priority / relative importance. Those conversations will uncover significant learnings that influence your thinking, and establish focused questions to which you will want answers later. Before the workshop, you didn’t know which questions you needed to ask.
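The two prioritization passes above can be combined after the workshop into a single cross-step ranking of problems. This is a minimal sketch of one way to do it – the step names, Fibonacci points, and problem scores below are hypothetical workshop output, and weighting by simple multiplication is my assumption, not a prescribed method:

```python
# Combine the two prioritization passes: per-step Fibonacci points
# (relative importance of improving each step) weight the per-problem
# scores gathered within that step (e.g. from a 20/20-style exercise).
# All names and numbers are hypothetical workshop output.

step_points = {"install": 8, "configure": 13, "upgrade": 5}

problem_scores = {
    "install":   {"unclear prerequisites": 5, "slow download": 2},
    "configure": {"too many required fields": 8, "no validation": 6},
    "upgrade":   {"downtime during upgrade": 7},
}

def rank_problems(step_points, problem_scores):
    """Weight each problem's in-step score by its step's points,
    producing one cross-step list of problems to investigate first."""
    ranked = [
        (step, problem, score * step_points[step])
        for step, problems in problem_scores.items()
        for problem, score in problems.items()
    ]
    return sorted(ranked, key=lambda r: r[2], reverse=True)

for step, problem, weight in rank_problems(step_points, problem_scores):
    print(f"{weight:4d}  {step}: {problem}")
```

The ranking itself is mechanical; the value comes from pairing it with the notes from the arguments about relative priority, which explain why the numbers came out the way they did.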
Professional services folks have massive amounts of customer data and insight – they only lack the (product management) skills to transform that insight into something usable by a product team.
The best way I’ve found to get value from that insight, in a repeatable way across teams and individuals, is to run workshops that force teams to articulate what the customers are doing with the product (their goals and their challenges).
When asking the questions this way, you get the answers you need. By doing it in a collaborative workshop, you get more and better contributions from each of the team members than you would get through a series of interviews.