Resources And FAQs From Dublin Tech Talks June 2023
Updated: Aug 2
Slides, How-To Articles, And FAQ Responses
I led a session at Dublin Tech Talks on 21st June 2023. Several attendees engaged with me and the content during and after the presentation. I loved having follow-up discussions with professionals from different fields. I promised written answers to a few questions. Here they are.
I’ll share my learnings from the experience in another post.
Slide Deck PDF
You can download a PDF of the slide deck here.
Although I did not record the presentation at Dublin Tech Talks, I recorded an older version of this presentation at Userpilot’s Product Drive Summit.
Writing Things Down: In-Depth Guidance
The first case study focused on measuring customer experience. Here is a more detailed explanation: How to measure customer experience by merging product analytics, surveys, and interviews.
The second case study focused on measuring success by finding leading indicators. Here is a more detailed explanation: How to use metrics and signals to track customer experience.
The third case study showed the importance of process redesign to improve customer experience. I do not have a more detailed write-up for it yet.
The fourth case study showed the usage of a checklist to boost the success of your launch. Here is a detailed explanation, including some templates: How To Successfully Communicate With Your B2B Customers In 5 Steps.
Discussions With Attendees
Q1 - How do you prioritize features? How do you prioritize the right problem to solve?
I use two steps to find the priority of a feature or customer problem, combining the Bottom-Up and Top-Down approaches. See the illustration from Smartsheet.
In the Bottom-Up approach, review your customer interviews, support requests, and product analytics to identify the quick wins and the big problems. Group them into themes. Within each theme, rank the roadmap items by impact. I created a video about impact assessment as a PM skill here. But how much time do you spend quantifying every problem in every planning cycle? How can you run more frequent planning cycles and work in an Agile manner without spending all day, every day, on planning quantification?
The Top-Down approach is the unglamorous part of working in an organization. Your leadership team, one or several levels above you, sets a high-level direction for the organization. You prioritize items on your roadmap that advance those goals. These goals are handed to your team, and you follow them even if your customer research did not surface the same needs.
Merging these two approaches helps you decide which themes of problems to prioritize. With the merger, you can rely on your understanding of customer needs to prioritize roadmap items. Some subjectivity remains, because some items improve short-term revenue, some drive long-term exponential growth, some improve retention, and others acquisition.
Q2 - How do you find the data to quantify the profit impact of all roadmap items? How can we compare them against each other?
I’ll continue the previous response.
The most popular PM books will tell you that a principled Product Manager will always look at customer problems, make data-driven decisions, and not listen to mandates from other business teams. I mentioned the books that influenced me in this article, this webinar, this analysis, and this curation. I have worked towards this ideal world. It took nights and weekends; it could not be done in 40 hours a week. I have also realized it is not the real world.
Yes, in an ideal world, you want to compare all ideas in a data-driven approach against each other. Yes, you want to keep your customers in mind. And only customers in mind.
The ideal way to compare any two ideas is their NPV impact. How much investment do you need to make? Over how many months? How much money will you make out of this? Will the revenue come in the short term or the long term? These questions can quantify any business decision as a single number and bring all decisions onto one axis. However, calculating this number is very expensive.
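To make the NPV idea concrete, here is a minimal sketch. All figures (build costs, monthly revenue, the 10% annual discount rate) are made-up illustrative numbers, not data from the talk.

```python
# Rough NPV comparison of two roadmap ideas (illustrative numbers only).
# NPV = discounted stream of monthly revenue minus the upfront build cost.

def npv(build_cost, monthly_revenue, months, annual_rate=0.10):
    """Net present value of an idea, assuming a flat monthly revenue stream."""
    monthly_rate = (1 + annual_rate) ** (1 / 12) - 1
    discounted = sum(
        monthly_revenue / (1 + monthly_rate) ** m for m in range(1, months + 1)
    )
    return discounted - build_cost

# Idea A: cheap to build, short-term revenue.
# Idea B: expensive to build, larger long-term revenue.
idea_a = npv(build_cost=20_000, monthly_revenue=3_000, months=12)
idea_b = npv(build_cost=80_000, monthly_revenue=6_000, months=36)
print(f"Idea A NPV: {idea_a:,.0f}")
print(f"Idea B NPV: {idea_b:,.0f}")
```

The hard part in practice is not this arithmetic but estimating the inputs, which is exactly why the calculation is expensive.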
So let's think of mental shortcuts.
For example, group all items that improve acquisition into one bucket. For each item, look at the expected impact on acquisition and the cost to build. What is the likelihood that each will succeed versus fail? What mix of low-risk and high-risk items do you want on your roadmap?
You can similarly group items that improve your retention, enable short-term revenue through sales deals, or make it easier to build and maintain your product.
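The bucketing shortcut above can be sketched in a few lines. The item names, scores, and the impact × likelihood ÷ cost formula are all illustrative assumptions, not a prescribed scoring model.

```python
# Bucket roadmap items by theme, then rank within each bucket by a
# simple expected-value score: impact * likelihood_of_success / cost.
# All items and numbers below are made up for illustration.
from collections import defaultdict

items = [
    {"name": "SEO landing pages", "theme": "acquisition", "impact": 8, "cost": 3, "p_success": 0.6},
    {"name": "Referral program", "theme": "acquisition", "impact": 9, "cost": 8, "p_success": 0.3},
    {"name": "Onboarding checklist", "theme": "retention", "impact": 7, "cost": 2, "p_success": 0.7},
    {"name": "Usage digests", "theme": "retention", "impact": 5, "cost": 4, "p_success": 0.5},
]

buckets = defaultdict(list)
for item in items:
    buckets[item["theme"]].append(item)

for theme, group in buckets.items():
    group.sort(key=lambda i: i["impact"] * i["p_success"] / i["cost"], reverse=True)
    print(theme, "->", [i["name"] for i in group])
```

Comparing items within a bucket this way is far cheaper than a full NPV comparison across all items, which is the point of the mental shortcut.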
This is the way of a pragmatic Product Manager. You have the best intentions, but your work is not a fairy tale.
Q3 - What is the ownership of the product manager? Is it the complete success of a product, or is it building and launching a feature?
The best books for principled product managers tell you a Product Manager owns the end-to-end success of a product. The product manager is the CEO of the product.
Yes, a PM should keep the E2E product lifecycle in mind when building the product. But, many colleagues (who are just as talented at their roles as the Product Manager) own other parts of the business. If you own the E2E of a product, how do you divide ownership and responsibilities in a modern organization with marketing, design, sales, or support counterparts?
Amazon and other companies conceptualize this into input and output metrics. I’ve written more about it here. You can also use the RACI framework for this question. Discuss with your cross-functional colleagues and decide on the north star goals. Decide on the output metrics you want to influence. Then identify the input metrics under your control.
You can only be accountable for things you can control.
Q4 - Do you consider a launch a success if you have built and launched a feature, but nobody is using it?
Let’s assume the purpose of the launch is not an experiment but to drive adoption. The launch of a feature was not successful if no one used it.
Which teams work together to make the launch a success? Do sales need to evangelize the feature with some customer segments? Does the sales strategy team need to change the sales commission structure to align their incentives around the new product? Does the product marketing team need to create sales collateral to enable the sales team? Does a technical writer need to write support documentation?
As you can see, a simple question can get complicated when you have an organization with functions split across teams.
If you have agreed that the engineering team’s success for this sprint is launching the feature, then it is a success for their team in this timeframe. Even if no one used it.
Q5 - How can we use new advancements in LLM to process customer feedback?
I presented a few examples of reading 1,000+ customer emails to understand customer problems, prioritize, and solve the right ones. In the right way. You can read more here.
My colleagues and I read each support request email thread and categorized each into one of 160 new categories of customer problems.
Can we automate this using LLM tools like ChatGPT or models from Hugging Face?
Yes and no.
Two words: training data.
If you use an off-the-shelf LLM, it might classify support tickets, survey text responses, and forum posts into happy-sad, feature-bug, or desire-need groupings. But can you use that as-is? Which team should look at which group of information? Many customers report bugs. Many will request features. But how do you tell whether the bugs relate to your product's features or another team's?
So, no, an LLM cannot do customer research for you.
To train an LLM on the text from your users, you will need labeled training data. This might mean thousands of labeled tickets. Does your support team label each ticket as they solve it? Does each forum thread get tagged with a flair?
But here is a process you can run manually with an LLM.
1. Read 100 tickets and tag each to a category. Create the taxonomy as you go.
2. Use an LLM to extract the key attributes of each category.
3. Review the attributes. Edit manually as required.
4. Ask the LLM to use the edited attributes to tag 500 more tickets.
5. Review a sample of the tagged tickets. Create new categories. Redo steps 2 to 5.
6. Create a hierarchy of the taxonomy based on your organizational structure.
7. Ask the LLM to map your categories to this hierarchy, or to group the categories into super-categories.
8. Review. Edit the grouping as required.
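The tagging loop above can be sketched as follows. Note that `llm_classify` here is a hypothetical stand-in: in practice, that step would prompt a real LLM API with the category attributes and the ticket text. The taxonomy, attributes, and tickets are placeholders.

```python
# Sketch of the manual-plus-LLM triage loop. `llm_classify` is a
# hypothetical placeholder for an LLM call; here it is simulated with
# simple keyword matching against the extracted category attributes.

def llm_classify(ticket: str, taxonomy: dict) -> str:
    """Placeholder: a real implementation would prompt an LLM with the
    category attributes and the ticket text, and parse its answer."""
    for category, attributes in taxonomy.items():
        if any(word in ticket.lower() for word in attributes):
            return category
    return "uncategorized"

# Step 1: manually tag a seed batch and build the taxonomy as you go.
# Steps 2-3: extract and review the key attributes of each category.
taxonomy = {
    "billing": ["invoice", "charge", "refund"],
    "login": ["password", "sign in", "2fa"],
}

tickets = [
    "I was charged twice on my invoice",
    "Cannot sign in after enabling 2fa",
    "The export button does nothing",
]

# Steps 4-5: tag the next batch, then review the results.
for ticket in tickets:
    category = llm_classify(ticket, taxonomy)
    print(f"{category:>15}: {ticket}")
    # Tickets landing in "uncategorized" suggest a new category is needed,
    # which is the cue to redo steps 2 to 5.
```

The grouping into super-categories (steps 6 to 8) would apply the same review-and-edit pattern one level up, on the taxonomy itself rather than on individual tickets.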
Now you have an LLM-built system you can use to triage incoming tickets. I experimented with a smaller version of the above steps to synthesize 40 expert interviews here. I’ll write about that in an upcoming article.
You can read another expert perspective from a B2B SaaS Customer Experience Manager here.
Debrief: 3 Stages To The Speaking Engagement
I am reusing the 3 stages described in a previous article here.
Since this became a long article, I cover the debrief in another article.