By Harshal

How to Analyze Email Usage and Offline Customer Experience - Part 5

Updated: Apr 28, 2023

In the previous article, we looked at using a “how-to” article to inform our customers about the self-service feature and discovered a way to track customer behavior using the article’s page views as indicators.


In this article, we will look at customer interactions with our emails and identify which links in the emails are being clicked. Then we can decide how to act on the click-through rate (CTR) report from our analytics software. If you've missed the earlier parts, here is the introductory article.


A woman analyzing email interactions with customers and two graphs

Thumbnail credit: Designed by stories / Freepik.


Analyzing Email Activity


Emails sent by our system are often referenced or forwarded by customers to create support tickets, so we wanted to understand how customers interact with our email notifications. I talked to my engineering team to understand the internal code utility (an email library) our services use to send emails, then followed the trail upwards to find the engineering team that made it. From there, I found out which email API provider we use and reached out to the vendor, which got me access to the email provider's analytics software.


Data flow diagram to analyze email activity

Now that we had access to the tool, we looked at the different data points an email tracking software could provide. The visual above illustrates the steps in the life of an email.


Based on this, we already had the number of clicks on the “how-to” link in the email via the analytics software, but we didn’t know the number of clicks on the feature link. We could find that out through the email tracking software and compare both counts with the number of emails sent or opened.


Graph showing customer interaction with emails

First, the number of opens suggests that many customers open the email multiple times. It seems they find it easier to retrieve the report by searching their inbox than by logging into the web portal.


Second, the number of clicks to the how-to or feature pages is still very low: only about 2-3% of opened emails result in a click. This suggests that the rest of the time, customers interact with our product through the PDF report attached to the email rather than the web portal. Usage of a PDF report is even harder to track.
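To make these numbers concrete, here is a minimal sketch of how open and click metrics can be derived from an email provider's event export. The event log, field names, and counts are illustrative assumptions; real export formats vary by provider.

```python
# Minimal sketch: derive open and click metrics from a hypothetical
# email-event export (one row per tracking event).
from collections import Counter

events = [
    {"email_id": "e1", "recipient": "a@x.com", "event": "open"},
    {"email_id": "e1", "recipient": "a@x.com", "event": "open"},
    {"email_id": "e1", "recipient": "a@x.com", "event": "click",
     "url": "https://example.com/how-to"},
    {"email_id": "e2", "recipient": "b@x.com", "event": "open"},
]
emails_sent = 2  # taken from the provider's send log

opens = [e for e in events if e["event"] == "open"]
openers = {e["recipient"] for e in opens}
clicks_by_url = Counter(e["url"] for e in events if e["event"] == "click")

print("open rate:", len(openers) / emails_sent)        # unique opens / sent
print("opens per opener:", len(opens) / len(openers))  # repeat-open signal
for url, n in clicks_by_url.items():
    # clicks relative to opened emails, i.e., the ~2-3% figure above
    print(url, "CTR vs. opens:", n / len(openers))
```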


Problem: not only do customers interact with our product via email, but they also rely heavily on a static email attachment. How can we understand customer behavior when they use a static PDF report?


Testing Customer Behavior with the PDF Using Link CTR


So far, we have looked at the tool we used to analyze email activity.

Now we also want to analyze how customers read a PDF document, where very little interactivity, and hence very little click tracking, is possible.


I worked with my design and engineering teams to do this in two steps.


five pages showing customer behavior with PDF


1 - We collated support how-to links in the report and used clicks on those links as a proxy to figure out which pages customers visit more or less often.

  • We put how-to links on different pages of the report: some on the front page, some on the summary page, some on the details page, some on the appendices, and so on (see the screenshot above).

  • Clicks on a page = (number of customers who opened that page of the PDF) * (% of customers confused by that page)

  • If we assume the (% confusion per page) is similar across pages, then the number of clicks on a page is a proxy for its number of page views (see the sketch after this list).

2 - We interviewed customers to understand their behaviors and usage patterns.

  • We worked with a product designer to build a research plan.

  • We reached out to customer success and sales colleagues from go-to-market (GTM) teams and, through them, to their customers who had complaints or were vocal about the product.

  • We talked to the customers. They not only shared their problems but also answered further questions that helped us understand why, how, and when they use the PDF, open emails, and use the web portal.
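To make the step-1 proxy concrete, here is a minimal sketch of the calculation, assuming per-link click counts can be exported from the link tracker. The page names and click counts are illustrative.

```python
# Minimal sketch of the step-1 proxy: if the confusion rate per page is
# roughly constant, relative link clicks approximate relative page views.
clicks_per_page = {
    "front": 120,
    "summary": 90,
    "details": 45,
    "appendices": 15,
}

total_clicks = sum(clicks_per_page.values())
for page, clicks in clicks_per_page.items():
    # Share of clicks ~ share of page views under the
    # constant-confusion assumption.
    print(f"{page}: ~{clicks / total_clicks:.0%} of estimated page views")
```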

We used this research as a starting point to kick off and launch a redesigned PDF report. This also helped reduce support tickets; I will skip the details of the redesign's impact for now, and we can discuss the process, reasoning, and impact measurement in a dedicated article.


Problem: although we had done more analysis and had leading metrics, the lagging metrics still suggested the support team was getting many questions on this topic.


Saving Support Time


Since some tickets were still coming to support, I wanted to redirect those customers to the self-service feature. This way:

  1. We empower the support team to significantly reduce the time spent on these tickets and free them up for more complicated ones.

  2. Customers who repeatedly reached out to support with the same question will now ask it once and attempt self-service for their future needs.

But how do we do this in a scalable way for multiple functionalities, multiple types of questions, and a large support team?


To help support redirect tickets, I worked with a support specialist whose priority was to help the customer support team by liaising between support and the product managers. We created shortcuts in the tool, like macro instructions, which the support team could easily search and send to customers with a single click. The response would include instructions on how to use the self-service feature. Instead of waiting a few days for the process to complete, a customer would have a response within a few seconds.


To enable such shortcuts, the specialist and I reviewed every feature built in the past few months and created macros for each variant of question that might be asked about them. After a few months, we reviewed the statistics.
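For illustration, here is a minimal sketch of how such searchable shortcuts could be modeled. Most support tools ship macros natively, so this is only a conceptual sketch; the macro names and canned texts are hypothetical.

```python
# Minimal sketch: searchable canned responses ("macros") keyed by tags.
# All names and reply texts are hypothetical.
from dataclasses import dataclass

@dataclass
class Macro:
    name: str
    tags: set   # keywords agents search by
    body: str   # canned reply sent to the customer

MACROS = [
    Macro("self-service-report", {"report", "download", "export"},
          "You can download the report yourself: <link to how-to article>"),
    Macro("portal-login", {"login", "password", "access"},
          "Here is how to access the web portal: <link to how-to article>"),
]

def search_macros(query):
    """Return macros whose tags overlap with the words in the query."""
    words = set(query.lower().split())
    return [m for m in MACROS if m.tags & words]

# An agent types a couple of keywords and sends the match with one click.
for macro in search_macros("customer cannot download report"):
    print(macro.name, "->", macro.body)
```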


Graph showing customer responses

Around 30% of all tickets were solved using shortcuts, both in the year before these changes and after them. How could this be, despite our creating almost 50 new shortcuts?


This probably means the support team was underutilizing the shortcuts. Alternatively, customers steadily adopted new features over time, which reduced the need to redirect them to those features and confined the need for redirection to only the latest features at any given point.


These hypotheses need further testing that I had not done back then. Let's brainstorm potential next steps that could follow from such a finding.


Hypothesis 1 - the support team is underutilizing shortcuts.

Potential tests:

  1. Review tickets individually to see whether they are answered with shortcuts wherever possible. If not, interview support agents to understand how they are notified about newly created shortcuts and how they search for them on a per-ticket basis.

  2. Review the stats of individual shortcuts over time, potentially normalizing each shortcut's usage by the number of tickets for that feature.

Metric = (usage of a shortcut) / (number of tickets that could have been answered by the shortcut). Does this metric gradually increase for each shortcut or feature? If so, the team's education is gradually improving over the months.
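A minimal sketch of tracking that metric over time; the monthly counts are illustrative placeholders:

```python
# Minimal sketch: shortcut utilization per month.
# utilization = (times the shortcut was used) /
#               (tickets the shortcut could have answered)
monthly_stats = {
    "2021-08": (12, 60),
    "2021-09": (25, 55),
    "2021-10": (38, 50),
}

for month, (used, answerable) in sorted(monthly_stats.items()):
    print(f"{month}: utilization = {used / answerable:.0%}")
# A steadily rising utilization would suggest agent education is improving.
```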

Potential solutions:

  1. Regularly educate the support team about the latest shortcuts and features.

  2. Use a machine-learning (ML) based recommendation engine to suggest, inside the support tool, the shortcuts most relevant to a given question.
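As a sketch of what such a recommender could look like, the snippet below ranks macros by TF-IDF text similarity to an incoming ticket. A production system would likely learn from agent feedback; the macro texts here are hypothetical.

```python
# Minimal sketch: rank macros by text similarity to the incoming ticket.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

macro_texts = [
    "how to download the report from the web portal",
    "how to reset your password for the web portal",
    "how to schedule the emailed PDF report",
]

vectorizer = TfidfVectorizer()
macro_vectors = vectorizer.fit_transform(macro_texts)

def suggest(ticket_text, top_k=2):
    """Return the top_k macros ranked by cosine similarity to the ticket."""
    scores = cosine_similarity(vectorizer.transform([ticket_text]),
                               macro_vectors)[0]
    ranked = sorted(zip(macro_texts, scores), key=lambda p: p[1], reverse=True)
    return ranked[:top_k]

print(suggest("I can't find where to download my monthly report"))
```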

Hypothesis 2 - the adoption of new features follows a typical time delay. Hence, at any given time, customers are unaware only of recently launched features and send in questions related only to those features.

Potential tests:

  1. Review the time-delayed adoption of features using not just support tickets but other metrics as well.

  2. Check whether the number of support tickets for a feature goes down over time without any intervention, such as support shortcuts (see the sketch after this list).
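A minimal sketch of such a test, grouping one feature's tickets by weeks since launch to see whether volume decays on its own. The dates are illustrative:

```python
# Minimal sketch: do tickets for a feature decline with weeks since launch?
from collections import Counter
from datetime import date

launch = date(2021, 6, 1)
ticket_dates = [date(2021, 6, 3), date(2021, 6, 10), date(2021, 6, 12),
                date(2021, 7, 5), date(2021, 8, 20)]

tickets_per_week = Counter((d - launch).days // 7 for d in ticket_dates)
for week in sorted(tickets_per_week):
    print(f"week {week}: {tickets_per_week[week]} tickets")
# A monotonic decline, without any intervention, would support the
# time-delayed-adoption hypothesis.
```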

Potential solutions:

  1. This goes back to increasing the adoption of features sooner, which we will discuss towards the end by combining all the learnings.

Problem: we helped reduce support time, but can we move some of the automated responses to the web portal?


Error Validations


We added additional validation to the self-service feature in the web portal to provide real-time feedback for any invalid information entered. The feedback included a short description of the error and pointed to articles where users could read more.

For example, three different types of errors previously resulted in the same error message, telling the customer to contact customer support. After the changes and error-message mapping, two of these three errors gave customers actionable prompts to fix the problem on their own. You can see an example of an actionable prompt from Stack Exchange here.
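A minimal sketch of this kind of error-message mapping; the error types, messages, and article URLs are hypothetical:

```python
# Minimal sketch: map internal error types to actionable messages plus a
# link to a how-to article where a self-service fix exists.
ERROR_MESSAGES = {
    "invalid_date_range": (
        "The end date must be after the start date.",
        "https://example.com/help/choosing-a-date-range",
    ),
    "unsupported_format": (
        "Only CSV and PDF exports are supported.",
        "https://example.com/help/export-formats",
    ),
    # No self-service fix is known here, so it still routes to support.
    "internal_error": (
        "Something went wrong on our side. Please contact customer support.",
        None,
    ),
}

def feedback_for(error_type):
    """Return the real-time message shown next to the invalid input."""
    message, article = ERROR_MESSAGES[error_type]
    return f"{message} Read more: {article}" if article else message

print(feedback_for("invalid_date_range"))
```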


Image showing valid and invalid input fields

Problem: we made information available to customers and customer support, but go-to-market (GTM) teams still asked questions on behalf of customers. Customers continued creating tickets and refused to use the self-service feature.

Another problem: large-enterprise customers created a lot of tickets and were not open to change, or so it seemed.


Next Up…


Next, we will look at reaching out to a few large customers to convince them to log in to the web portal and use the self-service feature. Is this a one-off, an experiment, or a sustainable approach to changing customer behavior?

Originally published at https://harshalpatil.substack.com on Nov 30, 2021
