
Analyzing User-Feature Interactions

I created this research process to fill a crucial gap in the product cycle at Wave: we had no system for measuring the impact of a new feature on its target users. Without space in the product cycle to measure impact, product teams had limited understanding of the success or failure of each release.

Background and Goals

Before the creation of this post-release process, our product teams had no easy way of knowing how and where to improve a feature, which meant that features either remained as they were at the time of release or were improved based on minimal data.

Strategic Objectives:

  • Standardize the post-release process across the company and across features in order to measure impact reliably and consistently

  • Give designers, product managers and engineers tangible and quantifiable insights and recommendations on next steps for feature iteration

Method

The major stakeholders in this project were product managers, designers and engineers. While these teams work together to create each product, what they need from a post-release feature analysis varies. To understand each group, I conducted 1:1 interviews with each stakeholder, then synthesized my notes into a shortlist of crucial questions from each group:

Designers:

  • Is the interface intuitive?

  • Are users getting stuck anywhere when navigating the feature?

  • What would help them navigate the feature more easily?

Engineers:

  • Are there any bugs in the feature?

  • Are users interacting with this feature in the way we intended?

  • Did we build this feature in a way that users would expect?

Product Managers:

  • Are users interacting with the feature?

  • How can we improve on this feature in the future?

  • What does success look like?

From there, I narrowed these down to three key questions to answer each time a post-release analysis is conducted:

The big 3 questions:

  1. Is this feature useful to our target users?

  2. Is this feature frustrating to users in any way?

  3. Are there any ways to improve upon this feature for our target users?

These three questions capture what matters most to product managers, designers and engineers, and form the foundation of the post-release process.


A mix of customer feedback and feature-interaction data is used to answer the big three questions in each post-release analysis.

Crucial Insights

Qualitative + quantitative = ❤️

I knew after conducting interviews with all stakeholders that the post-release process required intentional integration of both quantitative and qualitative research, where both were equally important to our understanding of the release.

Qualitative:

To understand how users feel and think about a new feature, I comb through customer service chats to find the ones pertaining to the new release, and highlight important themes and insights derived from what users said to our customer service agents.

If needed, I reach out to a subset of users who interacted with the feature to better understand the needs and motivations behind their interactions with it.

Quantitative:

To size opportunities and problems with a new feature, I categorize customer service chats into broad buckets to understand the types of conversations users are having with customer service, and how frequently each type occurs. This helps flag potential bugs or major pain points in users' experience of the feature.
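The bucketing step above can be sketched as a simple frequency count. This is a minimal illustration, assuming chats have already been tagged with a category during review; the bucket names and data are hypothetical, not Wave's actual taxonomy:

```python
from collections import Counter

# Hypothetical customer service chats, each tagged with a broad
# "bucket" during manual review of the transcripts.
chats = [
    {"id": 1, "bucket": "bug report"},
    {"id": 2, "bucket": "how-to question"},
    {"id": 3, "bucket": "bug report"},
    {"id": 4, "bucket": "feature request"},
    {"id": 5, "bucket": "how-to question"},
    {"id": 6, "bucket": "bug report"},
]

# Count how often each type of conversation occurs, most common first.
bucket_counts = Counter(chat["bucket"] for chat in chats)
for bucket, count in bucket_counts.most_common():
    share = count / len(chats)
    print(f"{bucket}: {count} chats ({share:.0%})")
```

Ranking buckets by share like this makes it easy to see at a glance whether, say, bug reports dominate the conversation volume for a given release.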


I then analyze user interactions with the feature to understand how often users are interacting with the feature, what pathways they are taking, and whether our in-app nudges are working to get users to engage with the feature.
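As a rough illustration of the nudge analysis, engagement can be compared between users who saw an in-app nudge and those who did not. All names and numbers here are hypothetical, not real product data:

```python
# Hypothetical event counts from product analytics: how many users in
# each group reached the feature's entry point, and how many of them
# went on to engage with the feature.
nudged = {"reached_entry": 400, "engaged": 120}
not_nudged = {"reached_entry": 600, "engaged": 90}

def engagement_rate(group):
    """Share of users in the group who went on to use the feature."""
    return group["engaged"] / group["reached_entry"]

lift = engagement_rate(nudged) - engagement_rate(not_nudged)
print(f"Nudged: {engagement_rate(nudged):.0%}")          # 30%
print(f"Not nudged: {engagement_rate(not_nudged):.0%}")  # 15%
print(f"Absolute lift from nudge: {lift:+.0%}")          # +15%
```

A positive lift suggests the nudge is working to change behaviour, while a flat or negative lift flags it for redesign.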

Research Impact

Product Impact:

  • Post-release analysis has been useful in evaluating the effectiveness of in-app nudges. We discovered that adding a "hot spot" at the location on the website where a user performs an undesirable action redirects users towards the new feature and successfully changes user behaviour.

  • The post-release analysis of our invoicing "add a discount" feature revealed that while many users liked the feature as-is, a large minority wanted more flexibility in how discounts can be applied to an invoice, which led to further investigation on ways to create more customization.

  • The post-release analysis of our automated duplicate merging engine revealed that the engine is not merging enough duplicates to be seen as useful to our users, which led to discovery work around the creation of a "suggested merge" modal.

Strategic Impact:

  • Having a consistent process across features and product teams allows us to build a repository of feature impact, and gauge which feature improvements will have the most impact for our users.

  • Conducting stakeholder interviews led to a better mental model of what designers, engineers and product managers want from their collaboration with UXR, and how their needs differ from one another.

My Learnings

  • Rigour, not rigidity: it is important to have a skeleton framework, but to allow for flexibility given the wide variety of features Wave offers

  • Explaining the reasoning behind the process upfront and early on creates buy-in with less friction
