In the previous post, we discussed product instrumentation, i.e. mapping your product to effectively measure product adoption. When you are collecting a lot of data about the usage of your application, it becomes tricky to represent all of it in a readable format. In this post, I will explain how we at VWO approached this problem, using one of our products, VWO Testing (an A/B testing product), as the running example.
Challenges
- Once we had mapped the entire product, we had more than 1,000 events being tracked across our products.
- We had data coming from different sources, such as the product's MySQL database, which held both product and business data.
- Coming up with a framework to visualize all the data.
- Selecting the right tool, one that could help us visualize the data and give us maximum flexibility to represent it the way we wanted.
Solution
Framework
The first step in solving these challenges was to come up with a framework that could answer all our questions and scale to fit new features. To build one, we took a CRUD-based approach. In most applications, a user takes a series of steps to create something; in our case, let us say it was “Creating an A/B Test“. Similarly, reporting and management are the other two broad areas where a user interacts with your application.
Here is how we built up our framework (a small sketch of how some of these metrics could be computed follows the list):
- The overall health of the product: a high-level view of the product's overall health, with metrics such as
  - Total number of A/B tests created over time
  - Average number of A/B tests created over time per account, etc.
- Feature details: a deep dive into individual features, grouped under three categories
  - Creation: for the creation part, we had metrics such as
    - Average number of goals added for every A/B test per account
    - % of accounts using the visual editor vs. the code editor, etc.
  - Reporting: for the reporting part, we had metrics such as
    - % of accounts that have viewed different graphs in the reports
  - Management: for the management part, we had metrics such as
    - % of accounts that have applied certain settings in their A/B test campaigns (traffic percentage, deleted a campaign, cloned a campaign, etc.)
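To make the metrics above concrete, here is a minimal sketch (in Python with pandas) of how a couple of them could be computed from a flat events export. The column names (`account_id`, `event_name`, `timestamp`) and the event name `ab_test_created` are illustrative assumptions, not our actual schema.

```python
import pandas as pd

# Assumed flat export of tracked events; column names are illustrative.
# Columns: account_id, user_id, event_name, timestamp
events = pd.read_csv("events.csv", parse_dates=["timestamp"])

created = events[events["event_name"] == "ab_test_created"].copy()
created["month"] = created["timestamp"].dt.to_period("M")

# Total number of A/B tests created over time (per month)
tests_per_month = created.groupby("month").size()

# Average number of A/B tests created per account, per month
tests_per_account = (
    created.groupby(["month", "account_id"]).size()  # tests per account per month
    .groupby("month").mean()                         # averaged across accounts
)

print(tests_per_month)
print(tests_per_account)
```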
This framework gave us a basic start, and we could now begin envisioning what our reports could look like. We later came up with a more sophisticated addition to our framework, inspired by a post I read years ago.
- Adoption/Frequency scatter plot
  - X-axis: Feature adoption, i.e. the total number of accounts that have used a particular feature at least once. E.g. out of 5,000 total accounts, 3,000 accounts have used feature A at least once.
  - Y-axis: Frequency, i.e. the number of times a feature was used. E.g. the total number of times feature A was used across those 3,000 accounts.
This scatter plot would give us a good understanding of the usage of our core features (see the sketch after these points):
- A feature in the top right is one that is most widely used by our customers.
- A feature in the bottom left is the least adopted.
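Here is a rough sketch of how this plot could be produced from the same kind of events table as above. Again, the column names and the use of matplotlib are assumptions for illustration, not our actual implementation.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Same assumed events table as before: account_id, event_name, timestamp
events = pd.read_csv("events.csv", parse_dates=["timestamp"])

# X-axis: adoption = distinct accounts that used each feature at least once
adoption = events.groupby("event_name")["account_id"].nunique()

# Y-axis: frequency = total number of times each feature was used
frequency = events.groupby("event_name").size()

fig, ax = plt.subplots()
ax.scatter(adoption, frequency)
for feature in adoption.index:
    ax.annotate(feature, (adoption[feature], frequency[feature]))

ax.set_xlabel("Adoption (accounts using the feature at least once)")
ax.set_ylabel("Frequency (total feature usage)")
ax.set_title("Feature adoption vs. frequency")
plt.show()
```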
More on this in another detailed post.
Finding the right tool
The second step was to find the right tool to put our framework into action. When we started building analytics for our product, we did not have a dedicated analytics function or the expertise to do it properly. We were just thinking from first principles and building it up as we went.
We had two options:
- An analytics solution such as Mixpanel or Amplitude: We first tried Mixpanel, which is a wonderful tool, but it had certain limitations (or at least we couldn't figure them out back then).
  - As a SaaS business, an 'account' matters more to us than an individual 'user' of that account. For example, it is more important to know whether an account (say, Domino's) has created an A/B test than to know which user within Domino's created it. User-level information helped us identify power users, but to understand adoption we looked at data at the account level (see the sketch after this list).
  - We wanted a solution that gave us as much room to customize as possible. Some of our calculated metrics required computations across events, and we were also thinking ahead: we might want the flexibility to build more sophisticated frameworks later, and it's not easy to switch tools.
  - Cost was a big consideration, since this was an experimental project and we did not want to invest heavily in a product that might not work for us in the future.
  - Lastly, we knew we would be pulling in data from multiple sources, now and in the future, so we wanted a tool that could ingest data from our product database and other sources seamlessly.
- A BI tool like Power BI: Our second option was to look for a BI tool that could give us all the flexibility we needed and connect to various data sources easily. Power BI stood out because it could do everything we required, and at $20/month per license it was affordable for our experiment and future-proof as well.
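To illustrate the account-versus-user point above, here is a minimal sketch of rolling user-level events up to the account level. The column names and the `ab_test_created` event are assumptions for illustration only.

```python
import pandas as pd

# Assumed user-level events: account_id, user_id, event_name, timestamp
events = pd.read_csv("events.csv", parse_dates=["timestamp"])

# Account-level adoption: how many accounts created at least one A/B test?
created = events[events["event_name"] == "ab_test_created"]
accounts_with_tests = created["account_id"].nunique()
total_accounts = events["account_id"].nunique()
print(f"{accounts_with_tests / total_accounts:.1%} of accounts created an A/B test")

# User-level view, still useful for spotting power users within an account
power_users = (
    created.groupby(["account_id", "user_id"]).size()
    .sort_values(ascending=False)
    .head(10)
)
print(power_users)
```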
Learnings
My biggest learning is that one should never chase a perfect solution. Keep iterating and learning. We now have a dedicated analytics function, and they know far more than we did five years ago, but the principles and frameworks we used still hold up for us.
Bonus: once you have properly collected all the product usage data, you can put it to heavy use in your go-to-market strategy. More on this in another post. Keep watching this space 🙂