“You ask a lot of questions.”

That’s what we heard recently from a caller trying to sell us something. The call ended with them hanging up on us (Ha!) and we thought, “Hmm, who wouldn’t want to work with someone who asks questions and is curious?” Not us.

One of the great things about working with our team is that they are always inquisitive, and always asking questions. We regularly host “Lunch ‘N Learns” to share information and new techniques. Just yesterday, one of our developers gave the team an update on user analytics and some of his recent learnings with a client’s product, and we thought we’d share. We’re nice like that.

What is user analytics?

User analytics tells the story of how a user interacts with your product and informs decision-makers by surfacing deeper insights and revealing the full picture of a user’s behavior. What it doesn’t do is report on the number of bugs or errors the app has or focus on stats common in business analytics. Think of this in terms of how the user navigates through the app.

By capturing and analyzing the user data, product teams are able to:

  • Understand user behaviors – where do they come from, what features do they use, why do they come back and, more importantly, why do they drop off?
  • Track the impact of new releases and experiments
  • Help teams build products users love
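To make “capturing and analyzing user data” a little more concrete, here’s a minimal Python sketch of an in-memory event store. Everything here is hypothetical (the `EventStore` class, the `track` method, and the sample events are made up for illustration, not any particular vendor’s SDK), but it shows the basic shape: record events per user, then ask questions of the data, such as who dropped off.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AnalyticsEvent:
    user_id: str
    name: str                 # e.g. "signup", "purchase"
    properties: dict = field(default_factory=dict)
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class EventStore:
    """Collects events so the team can ask questions of the data later."""
    def __init__(self):
        self.events: list[AnalyticsEvent] = []

    def track(self, user_id: str, name: str, **properties):
        self.events.append(AnalyticsEvent(user_id, name, properties))

    def users_who_did(self, name: str) -> set[str]:
        return {e.user_id for e in self.events if e.name == name}

store = EventStore()
store.track("u1", "signup", source="organic")
store.track("u1", "purchase", amount=9.99)
store.track("u2", "signup", source="ad")

# Which signed-up users dropped off before purchasing?
dropped = store.users_who_did("signup") - store.users_who_did("purchase")
```

In a real product, the `track` calls would go to an analytics service rather than a local list, but the questions you ask of the data are the same.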

With the data and insights collected, design, development and marketing teams are able to ask increasingly important and in-depth questions and learn more about the user’s journey, ultimately leading to a better product and user experience. (And don’t we all want that!)

Until recently, little was done with this data beyond addressing user complaints or making incremental improvements to the user experience. Data should be used to inform decisions and support customer growth efforts, creating engaging, personalized experiences that keep users in the app for as long as possible. The more friction a customer experiences on the website or when placing an order, the more likely they are to go to a competitor’s site. Applying user analytics increases the chances of success and keeps customers from straying.

For example, a high bounce rate could signal that users are frustrated by the initial screens they see. Being able to pinpoint exactly what those issues are (and when they happen) goes a long way toward retaining users. User analytics reveals vital information ranging from session time and bounce rates to geo-location and the specific actions users took while in the app.
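To put a number on the bounce-rate idea, here’s a quick back-of-the-envelope calculation in Python. The session data is invented, and we use one common definition of a bounce: a session where the user saw only a single screen before leaving.

```python
# Screen views per session (invented sample data); a "bounce" is a
# session where the user saw only one screen before leaving.
sessions = [1, 5, 1, 3, 1, 2]

bounces = sum(1 for views in sessions if views == 1)
bounce_rate = bounces / len(sessions)
print(f"Bounce rate: {bounce_rate:.0%}")  # 3 of 6 sessions bounced -> 50%
```

A rate like this only becomes actionable when you can also see *where* the bounces happen, which screens the single-view sessions landed on.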

While we agree that analytics are helpful and necessary, when do you introduce them? We believe they should be implemented early in the MVP process, even if it’s only a few metrics. The earlier, the better: it helps define the roadmap and the end goals, not only for the designers and developers but for the client as well. If these goals are not outlined in advance, make it a priority to work with the client to nail them down. It just makes sense.

One thing we learned early on in our discussions with clients is that they would love to add as many data points as possible into the dashboard. “Capture it all” was a common mantra – the more the better. In reality, while it’s always good to capture metrics, sometimes “less is more” and just because you can capture it, doesn’t necessarily mean you should, especially all at once. Too many metrics in the beginning can become confusing and can distract from and water down the end goal. Educating the client on what metrics make the most sense is always a tricky line to walk, but a worthwhile one nonetheless.

There are many different user analytics systems out there – we’ve researched most of them and worked with many. Sometimes the choice comes down to developer preference, or the project’s specific goals narrow the options to a few. Other times the client expects a specific system because it’s what they are familiar with. While we are flexible in our approach, we do have our favorites, of course, but we are not tied to one specific brand. The decision is driven by the end goal, the available resources, the budget, and any time constraints.

With that in mind, there are two key elements of analytics that should always be defined and tracked: the “Critical Event” and the “Average Usage Interval”. According to Amplitude (a best-in-class analytics platform), these are defined as:

  • Critical Event: An action that users take within your product that aligns closely with your core value proposition. This may be an in-app purchase, membership subscription, or other key action.*
  • Average Usage Interval: The frequency (daily, weekly, monthly, etc.) with which you expect people to use your product. This can be estimated at first, but proper analytics implementations will measure and surface this number over time.

The Critical Event and Average Usage Interval form the cornerstone of any user analytics implementation. Other measurements, like user retention, Critical Event funnel progression, and user lifecycles, rely on these two primary metrics. Such secondary metrics can be extremely helpful in decision making, but a house is only as solid as its foundation. Whoa, wise words from our development team.
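Here’s one way the Average Usage Interval might be estimated from raw timestamps, a sketch with invented data: for each user, take the gaps between consecutive days on which they fired the Critical Event, then summarize across users. We use the median because it is robust to a few extreme gaps.

```python
from datetime import date
from statistics import median

# Days on which each user fired the Critical Event (invented sample data)
usage = {
    "u1": [date(2023, 1, 1), date(2023, 1, 8), date(2023, 1, 15)],  # weekly
    "u2": [date(2023, 1, 2), date(2023, 1, 3), date(2023, 1, 4)],   # daily
}

def usage_intervals(days):
    """Gaps, in days, between a user's consecutive Critical Events."""
    days = sorted(days)
    return [(b - a).days for a, b in zip(days, days[1:])]

all_intervals = [iv for days in usage.values() for iv in usage_intervals(days)]
avg_interval = median(all_intervals)  # median of [7, 7, 1, 1] -> 4.0 days
print(f"Average usage interval: {avg_interval} days")
```

As Amplitude’s definition notes, you can estimate this number up front, but a proper implementation will measure and surface it from real event data over time.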

And lastly, stop measuring Daily Active Users (DAU)**. It’s a vanity metric and, while it may look great when a million people land on your product, it cannot inform you of trends when those users stop coming back.
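To see why retention tells you what DAU can’t, here’s a small sketch of day-N retention: the share of users active on a given day who come back exactly N days later. The data and the function name are illustrative.

```python
from datetime import date, timedelta

# active_days[user] = days the user was active (invented sample data)
active_days = {
    "u1": {date(2023, 1, 1), date(2023, 1, 8)},
    "u2": {date(2023, 1, 1)},                    # never came back
    "u3": {date(2023, 1, 1), date(2023, 1, 8)},
}

def day_n_retention(active_days, cohort_day, n):
    """Share of users active on cohort_day who returned exactly n days later."""
    cohort = {u for u, days in active_days.items() if cohort_day in days}
    if not cohort:
        return 0.0
    returned = {u for u in cohort
                if cohort_day + timedelta(days=n) in active_days[u]}
    return len(returned) / len(cohort)

rate = day_n_retention(active_days, date(2023, 1, 1), 7)
print(f"Day-7 retention: {rate:.0%}")  # 2 of 3 users returned -> 67%
```

A DAU chart would count all three users as a win on day one; the retention number is what tells you one of them never came back.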

In summary, incorporating user analytics is good practice and provides a useful barometer for gauging the success of the app. Adding metrics early is key and keeping the number of metrics to a realistic number makes the data more digestible and focused. In today’s competitive environment, understanding the user’s journey is essential. A good experience keeps the customers coming back.

We’d love to chat with you about your experiences with user analytics. Contact us today.

Credit where credit is due. Thanks to Amplitude for their great blog posts:
*User Retention Depends on Your App’s Critical Event
**You’re Measuring Daily Active Users Wrong