Contributors

Kelsea Mann

Tiffany Nolan

Amber Asaro


Product analytics

Author: Kevin Hatchoua | Last edit: May 01, 2024

What is product analytics?

Product analytics is a method used to collect, measure, and analyze user data within a digital product or service. It involves utilizing various tools and technologies to gather quantitative and qualitative information about user interactions, behaviors, and preferences.

This method helps teams understand how users engage with a product, identify areas for improvement, and make data-driven decisions to enhance the overall user experience.


Why use product analytics?

By analyzing data on user paths, feature usage, and performance metrics, it becomes possible to identify pain points, usability issues, and areas for enhancement. This method helps teams align their design strategies with user expectations, leading to a more user-centered and delightful experience overall.


When to use product analytics

When possible, product analytics should be implemented and used throughout the product development lifecycle. It is especially beneficial during the early stages of design to validate assumptions and identify user needs. Furthermore, it proves invaluable post-launch for monitoring product performance, evaluating feature adoption, and iterating based on user feedback. It should ideally be a continuous process, providing ongoing insights to support iterative design and development.

How to use product analytics

  1. Define goals and metrics: Begin by establishing clear objectives for data collection. Determine key performance indicators (KPIs) aligned with user experience goals, such as conversion rates, user engagement metrics, or task completion rates.
     
  2. Select tools and set up analytics: Choose appropriate analytics tools based on the defined goals. Implement tracking codes or integrations within the product to collect relevant data (see the sketch after this list). Popular tools include Google Analytics, Mixpanel, or Adobe Analytics.
     
  3. Collect and analyze data: Continuously collect user data and perform regular analyses. Utilize data visualization techniques to comprehend trends, patterns, and user behaviors. Identify actionable insights that inform design decisions.
     
  4. Iterate and improve: Implement changes based on insights gained from analytics. Test design iterations to validate improvements. Monitor the impact of changes through ongoing data analysis.
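
For step 2, here is a minimal sketch of what in-product instrumentation might look like. The AnalyticsClient interface, event names, and properties are illustrative placeholders rather than any specific tool's API; the point is that event names are defined up front so they map directly to the KPIs chosen in step 1.

```ts
// Minimal instrumentation sketch (TypeScript). Names are illustrative placeholders.
interface AnalyticsClient {
  track(event: string, properties?: Record<string, unknown>): void;
}

// Event names defined once so they map 1:1 to the KPIs chosen in step 1.
const EVENTS = {
  SIGNUP_COMPLETED: 'Signup Completed',
  TASK_COMPLETED: 'Task Completed',
} as const;

// Thin wrapper so every call site sends consistent, KPI-aligned events.
function trackEvent(
  client: AnalyticsClient,
  event: string,
  properties: Record<string, unknown> = {},
): void {
  client.track(event, { ...properties, timestamp: new Date().toISOString() });
}

// Example: record a completed task so a task completion rate can be computed later.
// trackEvent(analyticsClient, EVENTS.TASK_COMPLETED, { taskId: 'create-project' });
```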

Examples

Leveraging Product Analytics for ROSA (Red Hat OpenShift Service on AWS)

Background:

As the product team supporting ROSA, we focus on optimizing the user experience. To that end, we established four critical outcome goals aimed at improving that experience and easing adoption.

  • Outcome #1: Discover ROSA's value. To ensure users easily discover the value of ROSA, we track how users navigate the ROSA homepage, view feature descriptions, and access introductory materials.

  • Outcome #2: Streamlined onboarding. To facilitate an easy start, we track the onboarding flow, monitor how users engage with setup wizards, and gather feedback on initial user interactions.

  • Outcome #3: Simplified cluster creation. To reduce the effort required to create a ROSA cluster, we track the steps in the cluster creation process and identify pain points or bottlenecks.

  • Outcome #4: Seamless workload deployment. For this outcome, we monitor user interactions within the workload deployment interface, tracking the actions taken to launch workloads onto ROSA clusters.

Our process:

Alignment of UX Goals with Business KPIs: Collaborated with various teams to ensure that our UX outcomes directly align with the business KPIs established for ROSA.

Analytics Requirement Documentation: Partnered with Engineering and other stakeholders to identify key UI elements to track events on, leading to the creation of a comprehensive analytics requirement document.
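
To make this concrete, one entry in such a document might pair an outcome with the events and UI elements to instrument. The shape and names below are hypothetical, for illustration only, and do not reproduce the actual ROSA requirement document.

```ts
// Hypothetical shape of a tracking-plan entry; not the actual ROSA document.
interface TrackingPlanEntry {
  outcome: string;      // which outcome goal the event supports
  event: string;        // event name engineering will emit
  uiElement: string;    // the UI element the event is attached to
  properties: string[]; // properties captured with the event
}

const trackingPlan: TrackingPlanEntry[] = [
  {
    outcome: 'Simplified cluster creation',
    event: 'Cluster Creation Step Completed',
    uiElement: 'Cluster creation wizard "Next" button',
    properties: ['stepName', 'timeOnStepSeconds'],
  },
  {
    outcome: 'Seamless workload deployment',
    event: 'Workload Deployed',
    uiElement: 'Deploy workload button',
    properties: ['clusterId', 'deploymentMethod'],
  },
];
```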

Tool Selection: After careful consideration, we chose to combine Amplitude, Segment, and Intercom to cover the diverse analytics needs of ROSA. 

  • Amplitude for in-depth product analytics.
  • Segment for data integration and routing.
  • Intercom for gathering qualitative user feedback.
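
As a rough sketch of how these pieces fit together, assuming Segment's browser SDK (@segment/analytics-next): the application sends events to Segment, and the Amplitude and Intercom destinations are enabled in the Segment workspace rather than in code. The write key and event name below are placeholders.

```ts
// Sketch only: load Segment's browser SDK and send one event. Segment forwards the
// event to whichever destinations are configured in the workspace (e.g. Amplitude).
import { AnalyticsBrowser } from '@segment/analytics-next';

const analytics = AnalyticsBrowser.load({ writeKey: 'YOUR_SEGMENT_WRITE_KEY' });

// Hypothetical event tied to Outcome #3 (simplified cluster creation).
analytics.track('Cluster Creation Completed', {
  stepCount: 5,
  durationSeconds: 420,
});
```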

 

Regular Progress Review:
Conducted regular meetings to review the progress of our outcomes and assess the impact of implemented designs. Monitored metrics related to Sign-ups, Activation Rate, Retention Rate, and vCPUs Growth, drawing insights to inform our next decisions.
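
For reference, two of these metrics are commonly computed as simple ratios. The exact activation milestone and retention window used for ROSA are not specified here, so the sketch below uses the generic definitions.

```ts
// Generic metric definitions, sketched for illustration only.

// Activation rate: share of new sign-ups that reached the activation milestone
// (for example, creating a first cluster) within the chosen time window.
function activationRate(activatedUsers: number, signups: number): number {
  return signups === 0 ? 0 : activatedUsers / signups;
}

// Retention rate: share of a sign-up cohort still active at the end of a period.
function retentionRate(activeAtPeriodEnd: number, cohortSize: number): number {
  return cohortSize === 0 ? 0 : activeAtPeriodEnd / cohortSize;
}

// Example: 120 of 400 sign-ups activated -> 30%; 90 of that cohort active in week 4 -> 22.5%.
```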

Impact and Progress:
Our analytics-driven approach using Amplitude, Segment, and Intercom revealed a notable spike in clickthrough rates, indicating increased interest from top-of-funnel users that translated into more in-product interactions.

The data challenged our assumptions and prompted us to pivot toward solutions that better meet user needs. The ROSA Hands-On Experience, which offers risk-free OpenShift exploration, emerged from this shift.

Identifying data gaps was also pivotal. Our tracking revealed where data collection was incomplete, and closing those gaps ensured comprehensive analytics for an improved user experience.

Next Steps:
Continuously iterate and improve the user experience by leveraging insights gathered from our analytics tools. Focus on addressing remaining pain points in the workload deployment process to further enhance user satisfaction, retention, and expansion rates.


