Giving customers control of their spending

V1 of a dashboard for monitoring usage and costs


Intercom’s new pricing model incorporates a usage-based component: customers pay based on how much they use certain features, measured by metrics such as emails sent, messages sent, AI bot resolutions, and others.

The problem

Despite the simplification of the pricing model compared to previous versions, customers lacked the tools to control their usage and spending effectively. They were unable to answer questions such as:

  • How is my usage trending this month?
  • When do bill spikes occur?
  • How close am I to reaching my included or contracted amount?

Without an easy way to monitor how their usage affected their bill, customers only discovered their spending when the invoice arrived. This led them to contact our support teams, and each case required investigation, consuming valuable time. The experience was frustrating for customers and burdensome for our internal teams.

Moreover, this lack of visibility hindered customers' ability to budget accurately and take corrective actions when necessary.

Alignment with company strategy

To align with the company’s strategic initiative of delivering an outstanding billing experience with transparency at its core, our goal was to create a set of tools that not only provide users with insightful information but also empower them to make informed decisions about their usage and optimise costs effectively.

The solution

A comprehensive view of usage across different metrics, starting with a dashboard, that makes it easy to monitor usage trends, provides full details of the events contributing to that usage, and offers a breakdown of usage costs. This enables bill managers to proactively manage their usage and spending.

This solution is composed of:

  • Key usage and cost information: Provides customers with information about their current total usage and costs per metric.
  • Cost Breakdown: Offers detailed usage costs by metric, allowing users to identify every aspect of their spending, including discounts, tier pricing, and overages.
  • Usage Trends: Presents charts visualising usage trends over time within a billing cycle, enabling users to identify their usage trajectory and make decisions sooner.
  • Usage Thresholds: Allows users to see usage thresholds (contracted or included amounts), enabling them to understand how close they are to reaching them and adjust accordingly.
Usage overview V1
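To make the cost-breakdown logic above concrete, here is a minimal sketch of how a single metric’s charge might be itemised from an included amount, tiered overage pricing, and a percentage discount. The function name, tier boundaries, and rates are illustrative assumptions, not Intercom’s actual pricing model.

```python
# Illustrative sketch of a per-metric cost breakdown: an included amount,
# tiered overage pricing, and a percentage discount.
# All numbers are hypothetical, not Intercom's real pricing.

def cost_breakdown(usage, included, tiers, discount=0.0):
    """Itemise how `usage` units of a metric translate into cost.

    tiers: list of (units_in_tier, price_per_unit); the last tier may
    use float('inf') for an unbounded size.
    """
    overage = max(0, usage - included)   # only usage beyond the inclusion is billed
    remaining = overage
    gross = 0.0
    for size, price in tiers:            # walk the pricing tiers in order
        billed = min(remaining, size)
        gross += billed * price
        remaining -= billed
        if remaining <= 0:
            break
    discount_amount = gross * discount
    return {
        "usage": usage,
        "included": included,
        "overage": overage,
        "gross_cost": round(gross, 2),
        "discount": round(discount_amount, 2),
        "total": round(gross - discount_amount, 2),
    }

# Example: 1,200 emails sent, 500 included, two pricing tiers, 10% discount.
breakdown = cost_breakdown(
    usage=1200,
    included=500,
    tiers=[(500, 0.10), (float("inf"), 0.08)],
    discount=0.10,
)
```

A “how close am I to my included or contracted amount” indicator falls out of the same numbers (e.g. `usage / included`), which is essentially what the Usage Thresholds component surfaces.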

Definition of success

Together with the PM, we defined success along two dimensions:

  • Adoption & engagement: Are they using it?
  • Satisfaction: Is this helpful? Does it help prevent the problem of unexpected bill increases?

I created a learning plan to measure both aspects after release.


Challenges

  • Multiple metrics and scenarios: Intercom’s model includes more than 7 metrics, some with specific pricing, inclusions, and cost-related details that had to be accurately represented even in the first version. The challenge here was to find reusable information patterns across metrics so the information is displayed in a way that is easy to understand.
  • Differences between contracted and non-contracted customers: Usage for customers with a contract (sales-led) works differently from usage for customers who use a metric without a contract (e.g. different discounts per metric, contracted limits, etc.). I needed to find ways to visually represent these differences.
    • I overcame these challenges by being very thorough with the different scenarios to be considered for each metric and each type of customer, including all potential edge cases. This is essential as we’re dealing with our customers’ usage and spending information.
Details of usage and costs (example for one of the 7 metrics)
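As a rough illustration of the contracted vs. non-contracted difference described above, the same usage can produce quite different charges depending on whether the customer has a negotiated price and a contracted limit. The prices and limits below are purely hypothetical:

```python
# Hypothetical sketch: the same usage charged under a contracted plan
# (negotiated per-unit price up to a contracted limit) vs. pay-as-you-go
# (list price, no contracted limit). Numbers are illustrative only.

LIST_PRICE = 0.10  # hypothetical list price per unit

def charge(usage, contracted_limit=None, contracted_price=None):
    """Return (cost, over_limit) for a metric's usage."""
    if contracted_limit is not None:
        # Sales-led: usage up to the contracted limit is billed at the
        # negotiated price; anything beyond is overage at list price.
        within = min(usage, contracted_limit)
        overage = max(0, usage - contracted_limit)
        return within * contracted_price + overage * LIST_PRICE, overage > 0
    # Self-serve: everything at list price, and no contracted limit to track.
    return usage * LIST_PRICE, False

contracted_cost, over = charge(1000, contracted_limit=800, contracted_price=0.07)
self_serve_cost, _ = charge(1000)
```

The design had to communicate both cases from one layout: the self-serve view has no “contracted amount” concept at all, while the sales-led view needs the limit, the proximity to it, and any overage made explicit.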
  • Dependency for detailed usage data: The full version of this solution includes a full breakdown of the events that contributed to the usage of each metric so customers can make informed decisions. For example, for emails: which emails were sent, who sent them, the cost of each email, whether it will continue accruing costs, etc. However, we didn’t have the infrastructure in place to access this daily usage activity yet.
    • Because of this, we decided to release a first version without this detailed data.

The design process


I began by conducting desk research: examining existing information, surveys, and past interactions with customers, and interviewing sales and support representatives. From there, I created a journey map to illustrate our customers’ current experience, identifying the gaps and challenges they encountered in their billing journey. I shared this map with the team and stakeholders during our ideation workshop.

As a result of this research, I also developed a set of distinct proto-personas representing customers’ needs, experiences, behaviours, and goals.

Journey map

We then identified the problems to solve and prioritised addressing the lack of information needed to understand their spending, based on its impact. This challenge was also particularly significant for our support and billing teams, as they had to invest time and effort into investigating each case.

It was also important to understand the different components (actors, systems, and areas involved) that interact in this situation, as it’s a problem related to different areas of Intercom. To achieve this, I designed a system view for the team to comprehend the bigger picture. This allowed us to identify potential areas of the product that would be impacted, collaboration needed with other teams, and any data inconsistencies.

System view to understand components and relationships


  • Once we identified the problem to solve, I conducted an ideation workshop with key stakeholders from different areas (billing, product, design). From this, we were able to identify a set of potential solutions and conduct an initial impact/effort assessment.
  • I also conducted a quick competitive analysis to understand how others are addressing this problem.
  • Following this, I diverged by sketching different options. I shared these options with stakeholders (including sales teammates), gathered their feedback, and concluded with a recommended direction.
Some initial concepts


  • After we decided on a direction, I set up a series of interviews with customers to test our concept. I created a research plan, defining research questions together with stakeholders, and conducted some of the interviews myself.
  • Additionally, as the concept took more shape (detailed design), I conducted a series of unmoderated usability tests, iterating after a couple of rounds.
  • The goal of this research was to build confidence as we moved from initial concept to detailed design.

Scoping down to a V1

While designing this project, we encountered a significant dependency on the infrastructure team to obtain the data needed to display the details of what was contributing to a customer’s usage. Mid-project, we realised that we wouldn’t have the necessary usage data in time to provide a breakdown, so we shifted our focus to rethink a V1 that we could ship sooner.

Learning plan

As part of releasing this V1, I wanted to ensure we captured the impact of this solution on our customers' experience, as it aligns with the strategic initiative of improving our customers' billing experience. To achieve this, I put together a learning plan to monitor:

  • Are customers using it? (quantitative: adoption and engagement)
  • Is it solving the problem? How useful is it? (qualitative: satisfaction survey)
  • Do they have feedback? (qualitative: continuous feedback survey)

The engineering team developed a monitoring dashboard based on this learning plan.

Learning plan to monitor impact


Offering a usage overview dashboard transforms the billing experience, providing users with actionable insights, real-time monitoring capabilities, and greater control over their usage and costs. However, as it was only released in March 2024, further monitoring is needed.

Next steps

  • Monitor and capture feedback to understand if the release of V1 is a success (WIP).
  • Capture qualitative feedback to identify next steps. I have high confidence that the detailed breakdown is a strong candidate for a V2 based on our initial conversations with customers.

Learnings & reflections

  • Scope down to ship sooner: Initially, we planned to build the 'full' solution starting with a single metric (a deeper approach). We then shifted to a more horizontal approach: a minimum version covering all metrics instead of a detailed breakdown for a single one. In hindsight, identifying earlier the risk of not having the data on time would have let us focus on designing the horizontal V1 from the start and ship quicker.
  • Opportunity to improve our QA process: While we aimed to keep QA as lean as possible, we found plenty of room to improve the process, especially in terms of efficiency and communication. I've taken on the challenge of enhancing our QA process (work in progress).