How Experience Analytics improve UX Design (Re: Lesley Harrison’s article on CMSWire)

Just a few weeks ago, Lesley Harrison wrote a great article on CMSWire about how UX Design Customer Metrics can help you improve customer experiences.

User experience (UX) can have a big impact on customer satisfaction. Learn which UX design customer metrics you should keep an eye on.

Lesley Harrison
Technical Writer and Open Source Enthusiast

That’s great but …

How do you actually measure and understand some of the things that Lesley described? How can you do it at scale? And how can you enable everyone on the team to use these metrics?

Allow me to build on the five areas of study that Lesley urged in her article. I am reusing her exact five headings here and adding my two cents to her well-taken advice.

1. Start With Real User Monitoring

What Lesley means by Real User Monitoring is information about “how customers use your products. By tracking real user activity, you’ll gain insights into installs, sessions, heatmaps, user paths, etc.”

Perfect start. What should we add though?

Where I sit in the world of metrics, Real User Monitoring (RUM) typically refers to a type of speed analysis, i.e. web performance monitoring.

  • Does the speed and responsiveness of your digital experience impact your customer satisfaction and success? 200% yes it does!
  • So then, how can you monitor speed? That’s the realm of Speed Analysis.  
  • Some of the most famous speed analysis metrics are now called the Core Web Vitals because beyond just measuring experiences, they now also impact our SEO rankings with Google. 
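To make this concrete, here's a minimal sketch (in Python, with function and variable names of my own choosing) that buckets Core Web Vitals readings against Google's published thresholds: LCP at 2.5s / 4.0s, INP at 200ms / 500ms, CLS at 0.1 / 0.25.

```python
# "Good" / "needs improvement" cutoffs as published in Google's
# Core Web Vitals guidance; anything beyond the second cutoff is "poor".
THRESHOLDS = {
    "LCP": (2.5, 4.0),   # Largest Contentful Paint, seconds
    "INP": (200, 500),   # Interaction to Next Paint, milliseconds
    "CLS": (0.1, 0.25),  # Cumulative Layout Shift, unitless score
}

def rate_vital(metric: str, value: float) -> str:
    """Return 'good', 'needs improvement', or 'poor' for one reading."""
    good, ok = THRESHOLDS[metric]
    if value <= good:
        return "good"
    return "needs improvement" if value <= ok else "poor"
```

In practice you'd feed this with field data from real sessions rather than a single reading, but the bucketing logic is the same.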

Say hello and goodbye to errors. D'oh!

  • Yup, of course, you gotta scan your experiences for bugs and errors. There are always many of those on any site or app, but which ones are the blockers that actually stop your users in their tracks? 
  • That’s what Error Analysis will tell you when ranked by the impact on journey completions, e.g. sales conversions. 
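To illustrate what "ranked by impact" might look like under the hood, here's a toy sketch. It assumes simplified session records (a list of error IDs plus a converted flag, a shape I made up, not any vendor's schema) and estimates how many conversions each error may have cost versus the baseline rate.

```python
from collections import defaultdict

def rank_errors_by_impact(sessions):
    """Rank errors by how much they depress journey completion.
    `sessions` is a list of dicts: {"errors": [...], "converted": bool}.
    Returns (error, estimated_lost_conversions) pairs, worst first.
    """
    baseline = sum(s["converted"] for s in sessions) / len(sessions)
    hit, conv = defaultdict(int), defaultdict(int)
    for s in sessions:
        for err in set(s["errors"]):
            hit[err] += 1
            conv[err] += s["converted"]
    impact = {
        # (baseline rate - rate when the error occurred) x sessions hit
        err: (baseline - conv[err] / hit[err]) * hit[err]
        for err in hit
    }
    return sorted(impact.items(), key=lambda kv: kv[1], reverse=True)
```

Errors that merely co-occur with conversions sink to the bottom; true blockers bubble to the top.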

Extinguish UX frustrations

As a UX Designer, do you feel like errors and slow speeds are somebody else’s job? Even if that may be true in your case, a little-known fact is that most frustrations and abandonment on your site and app are often due to points of friction that have nothing to do with errors and slow speeds.

  • These are the pesky UX issues where everything works as designed, and most users complete their journeys. But some of your users are still confused, e.g. about why the submit button on your form remains grayed out even though they believe they’ve completed all the form fields, or about how they are meant to navigate and complete a journey.
  • These pockets of users that run into frustrations and give up are not something any business can afford these days. (OK, maybe Amazon can.) 
  • So, frustration scoring cannot be missing from UX Design metrics.
  • How do you do it? Thankfully, these days, frustration scoring can even be automated. See, for example, how frustration detection works with Contentsquare and with Hotjar.
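To give a flavor of how automated frustration scoring can work, here's a deliberately simple rule-based sketch. This is not how Contentsquare or Hotjar actually score frustration; it just counts rage-click bursts and failed form submits in a hypothetical event stream.

```python
def frustration_score(events, rage_window_ms=700, rage_clicks=3):
    """Toy frustration score over a time-ordered event stream.
    `events` is a list of dicts: {"t": ms, "type": ..., "target": ...}.
    +1 for each rage-click burst (>= 3 clicks on the same element
    within 700 ms), +1 for each failed form submit.
    """
    score = 0
    clicks = [e for e in events if e["type"] == "click"]
    for i in range(len(clicks)):
        burst = [c for c in clicks[i:]
                 if c["target"] == clicks[i]["target"]
                 and c["t"] - clicks[i]["t"] <= rage_window_ms]
        if len(burst) >= rage_clicks:
            score += 1
    score += sum(1 for e in events if e["type"] == "submit_error")
    return score
```

Real products weigh many more signals (multi-click loops, back-and-forth navigation, dead clicks), but the principle is the same: convert raw events into a comparable score.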

Journeys, goal completions

  • Paths are great, but you don’t really care about paths. What you really care about is whether your users are able to complete their goals (ahem … and yours) easily, i.e. their desired journeys vs. their actual journeys.
  • Perfect opportunity for comparing journeys side by side, especially those that complete successfully vs. those that go in loops and eventually get abandoned.
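A side-by-side comparison like that could be sketched as follows. The page-type sequences and the "checkout" goal are hypothetical, and the loop test is deliberately crude (revisiting a page type you already left):

```python
def journey_stats(paths, goal="checkout"):
    """Compare completed vs. abandoned journeys.
    `paths` is a list of page-type sequences, e.g. ["home", "plp", "pdp"].
    A journey 'loops' if it returns to a page type it already left.
    """
    def loops(path):
        seen, prev = set(), None
        for page in path:
            if page in seen and page != prev:
                return True
            seen.add(page)
            prev = page
        return False

    done = [p for p in paths if goal in p]
    lost = [p for p in paths if goal not in p]

    def loop_pct(group):
        return round(100 * sum(loops(p) for p in group) / len(group)) if group else 0

    return {"completed": len(done), "abandoned": len(lost),
            "loop_rate_completed": loop_pct(done),
            "loop_rate_abandoned": loop_pct(lost)}
```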

Business Impact

  • UX metrics are great but since we don’t have an infinite amount of time to invest in UX improvements, our users will benefit the fastest if we can address the issues first that are giving our users the biggest headaches and heartaches. 
  • So that’s the role of Impact Quantification in UX Design metrics.  
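In its simplest form, Impact Quantification is just arithmetic: sessions affected, times the conversion-rate gap, times average order value. A sketch (all inputs hypothetical):

```python
def estimate_lost_revenue(affected_sessions, affected_cr, baseline_cr, avg_order_value):
    """Rough impact quantification for one UX issue: how much revenue
    the conversion-rate gap represents for the affected sessions.
    """
    lost_conversions = affected_sessions * max(baseline_cr - affected_cr, 0)
    return lost_conversions * avg_order_value

# e.g. 10,000 sessions hit an issue and convert at 1% instead of 3%,
# with a $50 average order: roughly $10,000 at stake per period.
```

Numbers like these are what turn a UX backlog into a prioritized roadmap.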

2. Track Engagement Over Time

As Lesley wrote: “Acquiring customers or users via a marketing campaign is one thing, but keeping the attention of those users is a different story.” She recommends: “Take a broad view of how your customers interact with your brand. Do they like and comment on your social media? Do they share your content? If you have a loyalty program, are they actively participating in it?”

That’s great, but it leaves so much more for us to add based on Experience Analytics, right?

Why are they bouncing?

  • Once acquired traffic arrives on your landing page, what behaviors within that landing page distinguish those who will bounce right away, vs. those that enter your “digital store” to consider your offerings?  
  • A “pageview” metric alone can’t answer that why. This is where we need those heatmaps that Lesley mentioned earlier: they quickly pinpoint the golden paths vs. what leads to disengagement.
  • For example, when we compare bounces vs. deeper journeys on a scrolling heatmap side by side (left vs. right in the image here), we see that most users who bounced didn’t even scroll down on the landing page. The non-bounced users on the right show a much higher Exposure Rate for content elements below the fold.
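An Exposure Rate comparison like that boils down to simple scroll math. Here's a sketch assuming we only know each session's maximum scroll depth in pixels and a fixed viewport height (both simplifications of what real tools track):

```python
def exposure_rate(sessions, element_y, viewport=800):
    """Share of sessions in which an element at pixel depth `element_y`
    entered the viewport, split by bounced vs. engaged sessions.
    `sessions` is a list of {"max_scroll": px, "bounced": bool}.
    """
    def exposed(s):
        # The element was visible if scroll depth + viewport reached it.
        return s["max_scroll"] + viewport >= element_y

    def rate(group):
        return round(100 * sum(exposed(s) for s in group) / len(group)) if group else 0

    bounced = [s for s in sessions if s["bounced"]]
    engaged = [s for s in sessions if not s["bounced"]]
    return {"bounced": rate(bounced), "engaged": rate(engaged)}
```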

But what is it about scrolling that makes a difference on whether you are able to keep the user engaged?

What’s engaging them vs. not?

  • That’s where UX Designers can draw on metrics such as the Attractiveness Rate. This very useful engagement metric shows which content elements attract users’ attention enough to tap on them — IF — they scroll far enough to see them.
  • For example, in the image below we see some content elements highlighted in red color that are very attractive for clicks. 
  • But we know from the scrolling heatmap above that most users aren’t scrolling far enough to see them.  
  • So, we need to help more users find these engaging elements.
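Computed naively, the Attractiveness Rate is click rate among the exposed, not among all visitors. A sketch with a made-up session shape:

```python
def attractiveness_rate(element, sessions):
    """Of the sessions that actually SAW an element, what share clicked it?
    `sessions` is a list of {"seen": set_of_elements, "clicked": set_of_elements}.
    """
    exposed = [s for s in sessions if element in s["seen"]]
    if not exposed:
        return 0.0
    return sum(element in s["clicked"] for s in exposed) / len(exposed)
```

Dividing by exposed sessions rather than all sessions is the whole point: an element buried below the fold can be highly attractive even if few visitors ever reach it.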

What’s persuading them?

  • But … ultimately you don’t care about clicks. You care about journey completions. Plus, a user may only make one click on a page to go to the next page, but they will visually consume many content elements. So which content elements should get the credit for persuading the user to make that next click instead of abandoning the site or app?
  • That’s where the “money metrics” come in. For example, one of these is “Conversion Rate per Hover”: IF users hovered over a content element, how likely is their journey to eventually continue to a conversion, even if they never click on that element?
  • In the example below, we see that two banners below the fold are extremely well correlated with conversions if users scrolled down and hovered over these elements. Users are not as likely to click on them, but these content elements are very persuasive.
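Here's a sketch of the idea behind "Conversion Rate per Hover" (the data shape is mine, not any vendor's): compare conversion among sessions that hovered an element vs. those that never did.

```python
def conversion_rate_per_hover(element, sessions):
    """Conversion rate split by whether the session hovered `element`.
    `sessions` is a list of {"hovered": set_of_elements, "converted": bool}.
    A large gap suggests the element is persuasive even without clicks.
    """
    def rate(group):
        return sum(s["converted"] for s in group) / len(group) if group else 0.0

    hov = [s for s in sessions if element in s["hovered"]]
    non = [s for s in sessions if element not in s["hovered"]]
    return {"hovered": rate(hov), "not_hovered": rate(non)}
```

Correlation isn't causation, of course, so a gap like this is a hypothesis to A/B test, not proof by itself.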

Now, we’re in the realm of persuasion, which these days is the real trick: how to persuade our users to spend their very limited disposable time and money on our site or app. The million-dollar question, literally.

3. Gain Measurable Data With CSAT Scores

As Lesley writes, “The higher the percentage of satisfied customers, the better.” Great, but how do we find out why some customers aren’t satisfied? And how do we know how much that impacts our overall experience and revenue?

Lesley didn’t really say. But experience analytics can tell us.

Session replay, integrated with Voice of Customer feedback

By integrating session replay with your customer feedback tool, you can see exactly what steps led up to issues and complaints. See, for example, how session replay works with Contentsquare and with Hotjar.

Compare journeys with 5 vs. 1 stars

What journeys and areas of your site and app are driving the most feedback? Journey Analysis will tell you at a glance when integrated with feedback scores.

Impact Quantification is the key for moving the needle

There is always a lot of feedback, but you can’t possibly act on everything. So, which pieces of feedback are just outliers vs. the issues that really matter? We’re back to Impact Quantification here, perhaps the most important UX Design Customer metric of all, so you can prioritize.

Example:  Contentsquare Session Replay integrated directly in the Qualtrics Customer Feedback Management solution.

4. Monitor Usability Metrics for Greater Insight

Lesley wisely says:

By measuring how easy it is for users to access specific features or complete key tasks using your app or website, you’ll better understand how it performs in the real world. If completion rates are low or it takes longer than expected for people to perform certain tasks, this could explain why people abandon the app.

Lesley Harrison
Technical Writer and Open Source Enthusiast

Great but how do we do this?

Of course, Experience Analytics!

Besides scanning for errors and frustrations as we’ve already covered, the gift that keeps on giving is to compare successful vs. abandoned customer journeys side by side.

In the example above we see successful journeys on the left and incomplete journeys on the right. The Sunburst visual is easy to read:

  • Different colors represent different page types
  • Black represents site/app exits
  • In this case, the innermost ring shows the starting pages for journeys.
  • Every additional ring is the next step in the interaction.

In this example, it jumps out easily that abandoned sessions are more likely to go in loops between various page types. Session Replay then answers why it’s happening and what the underlying root cause is.

See for example the following real-life case study on how this helped detect and seize opportunities for enhancing UX Design at the financial institution Leeds Building Society.

5. Compare Customer Adoption and Retention

As Lesley outlines: “Poor retention rates could be a sign of a problem, either in terms of how the product performs or how it’s marketed.” And: “Poor adoption could mean you’re not doing a good job of turning existing customers into brand advocates or that there’s room to expand your existing marketing campaigns.”

True but how do you get to the why?

Experience Analytics, of course!

Use everything we’ve already touched on, all the way down to granular session replays that show anecdotal examples of what caused individual users to run into a wall, and what the underlying root causes were, e.g. API or JavaScript errors, site errors, etc.

Bonus Tip: Customer Needs and Responses Are Always Evolving

Lesley’s article goes on to talk about how all of the above can vary across devices, OS, bandwidth, etc.

All the more reason for Experience Analytics. Because you can’t exhaustively usability test every combination to detect every leak. But experience analytics does collect data exhaustively so you can detect and quantify every pocket of customers that are less than perfectly served by your UX Design.

And then you can break this down to surface the technical profiles that are most impacted by an issue, so you can pinpoint what needs to get fixed for whom.
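That breakdown is essentially a group-by over technical profiles. A minimal sketch, with a made-up session schema:

```python
from collections import Counter

def most_impacted_profiles(sessions, issue):
    """Break an issue down by technical profile (device, browser) to see
    which segments are hit hardest. `sessions` is a list of
    {"device": ..., "browser": ..., "issues": set_of_issue_ids}.
    Returns ((device, browser), hit_count) pairs, most impacted first.
    """
    hits = Counter((s["device"], s["browser"])
                   for s in sessions if issue in s["issues"])
    return hits.most_common()
```

Real segmentations would add OS, bandwidth, screen size, and so on, but the shape of the query stays the same.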


The moral of the story

Lesley’s article shows the direction. Customer Experience Analytics enables you to move in that direction. Experience Analytics cannot be missing from your UX tool bag of metrics.
