

[Image: measuring spoons]

A recurring issue (even among our customers) is that most of us simply suck at measuring (in tangible terms) the business improvements achieved as a result of UX testing.

When I ask people (customers and non-customers alike) whether a round of UX testing yielded positive results, I get vagaries like:

👍 Yeah! Everyone was really impressed by the issues we found!

😃 Testing was so quick and the quality of the feedback was surprisingly good.

🤔 What do you mean? Like… did traffic increase?

Being a hard-nosed, business-minded marketer, I’m never satisfied with such answers—and you shouldn’t be either. Cultural change is good. Insightful user feedback is good. Traffic is good.

[Image: busy traffic on a city street]

…mostly

But all those things are the means—they are not the end. Executives care about the bottom line.

When I say “results”, I mean money or a direct proxy for money (e.g. software signups). I know the pure UX designers among you might turn your noses up at that.

😻 It’s like… about the experience, dude. We do it for the love, not the money, man.

That’s all well and good (and true). But I’ve seen first-hand that if you don’t show commercial returns to the business, it’s only a matter of time before investment in research and testing gets cut.

I’ve decided to share my process for running UX tests on our blog and getting tangible business results out of the insights gleaned. Tangible business results like:

[Chart: Goal Completions, Feb-Aug]

Once changes based on UX testing had been made, the numbers above fluctuated month-to-month—sometimes going a little higher and others a little lower.

But the overall trends remain the same, with free trial conversions moving into double figures (from lower, single digits) and all goal completions increasing dramatically.

[Chart: Free Trial Completions, Feb-Aug]

I chose to benchmark against August because up until that point, my focus had been on creating high-quality content, which attracts relevant people (and keeps them coming back).

We’re a niche within a niche (remote UX testing, within user experience as a topic). Building a reputation and *relevant* audience (without paid acquisition) is no joke.

Again, without resorting to bad UX or spammy tactics, here’s the growth in traffic we managed between March 2016 (3 months after I’d joined WhatUsersDo) and September 2016:

[Chart: Traffic Growth, Mar-Sept]

Once I’d created enough momentum traffic-wise, I began focussing more on making the most of that traffic, through the power of great UX.

Using my process as a template, I’m going to share:

All improvements shown here are as a result of changes based on UX testing alone. We spend exactly £0 on promotion of our content. We serve 0 pop-ups on the blog and don’t try to spam people into doing anything.

Setting tangible, measurable goals for UX testing

Simply “improving the user experience” is too vague. Every experience comprises multiple interactions and the key is to pick a specific interaction you’d like to improve.

Don’t overthink this part. For example, the interaction you choose could be as simple as increasing click-throughs to a page with a high conversion rate.

[Image: arrow signs on a road]

Just think of the closest link between the thing you’re testing and revenue, then aim to improve the performance of that link by some percentage points.
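To make that concrete, here's a minimal sketch of the arithmetic, using entirely hypothetical traffic and conversion figures:

```python
def conversion_rate(conversions: int, visitors: int) -> float:
    """Conversions as a percentage of visitors."""
    return 100 * conversions / visitors

# Hypothetical figures: 2,000 blog readers, 3% click through to the
# free-trial page, and 10% of those sign up.
readers = 2000
clicks = round(readers * 0.03)          # 60 click-throughs
signups = round(clicks * 0.10)          # 6 signups

# Lifting click-through by two percentage points (3% -> 5%), at the
# same signup rate, yields proportionally more signups.
clicks_after = round(readers * 0.05)    # 100 click-throughs
signups_after = round(clicks_after * 0.10)  # 10 signups

print(conversion_rate(signups, readers))        # 0.3 (% of readers)
print(conversion_rate(signups_after, readers))  # 0.5 (% of readers)
```

The point of the sketch: you don't need to move the final conversion rate directly; improving one link in the chain (click-through) moves the end number with it.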

For example, it’s damn difficult to get you to go directly from reading this blog to paying for the WhatUsersDo platform. A handful of people have done it in my time here, but it’s very unlikely.

However, people do go from reading the blog to:

Signing up for a free trial

Signing up for the newsletter

And some of the people who do those things become paying customers. So, these were the links I identified between the blog and the business: conversions.

Here are some parameters to help you select effective goals for UX testing:

Now we know the metrics I wanted the insights from UX testing to help me improve—free trial signups, asset downloads and newsletter signups.

How did I design my UX tests to achieve those goals?

Writing UX testing tasks that will reveal obstacles to achieving your goals

So, if you haven’t read our 8 tips on writing incredible UX testing tasks, go do that right now. It will give you a good foundation for this discussion.

[Image: UX test tasks for the blog]

Above is a screenshot of the tasks I set users in the first UX test I ran. My approach was simple:

As you can see from the screenshot of the UX testing tasks, users needed to:

➡️ Go from the homepage, to the blog, to a free trial signup

➡️  Go from the homepage, to the blog, to the newsletter signup

➡️  Go from the homepage, to the blog, to an educational content asset (downloadable or not)

My hypotheses were that of the users who enjoyed the content they found on our blog:

Upon running two rounds of UX testing, these hypotheses were proven accurate (along with some others that I hadn’t thought of).

The results of testing also revealed issues that users falling into group A, B or C might face while browsing the blog.

[Image: user wants to learn more before clicking free trial]

For example, based on this user’s behaviour, I changed the menu items in the top nav.

The gentleman twice hovers over the option to view the free trial page but chooses not to do so—saying at one point, “I’m not gonna do that.”

So, I quelled my marketing thirst for leads and decided to replace “Success Stories” (salesy) with “What Is UX Testing” (educational), in the blog’s top nav.

[Image: blog homescreen]

The result? A significant reduction in the number of people clicking the free trial link—but a significant increase in the number of people signing up for free trials.

I know. It’s counterintuitive. I still don’t fully understand why these two effects occurred together, but I have my hypotheses. Perhaps more people felt informed enough to decide (definitively) whether or not UX testing is for them.

Another user validated my hypothesis that the sidebar on the blog was being wasted. It was occupied by stock widgets and redundant information that neither educated users nor supported our business.

[Image: user goes to sidebar to find educational content]

This user was not the only one to behave this way, so I knew I had to find a better way of solving her problem.

[Image: old blog header]

I added a combination of educational content with decent images, and downloadable assets to the sidebar. I also removed or compressed extraneous information.

[Image: new blog sidebar]

As you can see, the tasks I set drove users down certain paths which made it necessary for them to reveal obstacles in the way of their goals (and mine).

For example, if your goal is to find out whether users can successfully complete “section X”, set a scenario you know will involve them passing through “section X”. Just don’t tell them what you know and don’t tell them what to do when they do arrive at “section X”.

Measuring the impact of changes based on UX testing (using Google Analytics)

In terms of analytics, there is no better expert I know of than Jill Quick—the grand sage of Google Analytics.

Before you read my super-condensed version of how to measure goals, read as much as you can on Jill’s website. Then pay for her consulting services. Honestly, you’ll thank me.

In terms of the UX tests I ran on our blog, there are 3 main ways I measured performance:

Google Analytics goals are parameters you can define to let Google know how to measure commercially important activities on your site (e.g. a sale or free trial signup).

Google Analytics will then be able to tell you how many times that activity has occurred, who completed the goals, where they came from etc.

You can find the area for setting up goals in Google Analytics by going to: Admin -> View -> Goals.
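As an illustration of what a destination-type goal measures under the hood, here's a minimal sketch in Python; the thank-you URL and session data are hypothetical, not WhatUsersDo's actual setup:

```python
# A destination goal fires when a session reaches a specific URL,
# e.g. the thank-you page after a free trial signup.
GOAL_DESTINATION = "/free-trial/thank-you"  # hypothetical URL

def completed_goal(pageviews: list) -> bool:
    """True if any pageview in the session hit the goal destination."""
    return any(path == GOAL_DESTINATION for path in pageviews)

# Two hypothetical sessions, each a list of pageview paths.
sessions = [
    ["/blog/ux-tips", "/free-trial", "/free-trial/thank-you"],
    ["/blog/ux-tips", "/about"],
]
completions = sum(completed_goal(s) for s in sessions)
print(completions)  # 1
```

Google Analytics does this counting for you once the goal is defined; the sketch just shows what "goal completion" means at the session level.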

[Image: Google Analytics goals]

This is good for an overall picture. But I can’t see, off the bat, only people who completed said goals via the blog.

The goals reporting tells me, “This many people did X, coming from Y, using Z device…” etc. But it won’t automatically categorise people who did “X” for me.

What I really want to know is, “How many people did X, via the blog?”

To answer that, I need segments—think of segments as stencils you lay over your raw data, to reveal only people who fit within certain conditions.

The shape of my “stencil” was designed to show only people who completed goals via the blog, without also visiting pages for our case studies. This was to prevent duplicate reporting… remember, we’re focussing on the blog here but I also create other types of content for WhatUsersDo.
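The "stencil" logic can be sketched in a few lines of Python; the paths and session data below are hypothetical stand-ins for the raw analytics rows:

```python
# Each session is (pageview paths, goal_completed). The segment keeps
# only sessions that completed a goal via the blog and never touched a
# case-study page -- mirroring the stencil described above.
def in_segment(paths: list, goal_completed: bool) -> bool:
    visited_blog = any(p.startswith("/blog") for p in paths)
    visited_case_study = any(p.startswith("/case-studies") for p in paths)
    return goal_completed and visited_blog and not visited_case_study

sessions = [
    (["/blog/post", "/free-trial/thank-you"], True),   # kept
    (["/case-studies/acme", "/free-trial/thank-you"], True),  # excluded
    (["/blog/post"], False),                            # excluded
]
segmented = [s for s in sessions if in_segment(*s)]
print(len(segmented))  # 1
```

The exclusion condition is what prevents the duplicate reporting mentioned above: a conversion that touched a case-study page is credited elsewhere, not to the blog.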

Google has written a short guide on how to create segments. Here’s what mine looks like (with arrows indicating customisation options and areas of interest):

[Image: Google Analytics segment setup]

We’ve covered segments, which help with slicing up my data… but how do I see all the information relating (specifically) to my blog UX testing project, in one place?

To do that, I need custom reports.

Think of a custom report as a filing cabinet that lets you group together data you care about, regarding a specific project. You can customise the item about which you’re seeing data, as well as the kinds of data you see about that item.

For example, you can choose to see only the “page views”, “time on page” and “bounce rates” for the “Resources” pages on your site, in one custom report.
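As a rough sketch of what such a report does with the underlying data (the pages and figures below are hypothetical):

```python
# Hypothetical exported rows:
# (page, pageviews, avg_time_on_page_seconds, bounces, entrances)
rows = [
    ("/resources/guide", 120, 95.0, 30, 60),
    ("/resources/template", 80, 60.0, 20, 40),
    ("/blog/post", 300, 120.0, 50, 100),
]

# A custom report scoped to Resources pages: filter to the item we
# care about, then keep only the chosen metrics (pageviews, time on
# page, bounce rate).
report = []
for page, views, avg_time, bounces, entrances in rows:
    if page.startswith("/resources"):
        bounce_rate = 100 * bounces / entrances  # bounces as % of entrances
        report.append((page, views, avg_time, bounce_rate))

for line in report:
    print(line)
```

In Google Analytics itself you do this by picking dimensions, metrics, and filters in the custom report builder rather than writing code; the sketch just shows the filter-then-select shape of the result.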

Here’s what my custom report, which shows the performance of our blog, looks like:

[Image: February custom report]

And here’s how I’ve set up the customisation in the back end (although, I’ve also added several filters which you can’t see in this image):

[Image: custom report setup]

And there you have it! Now you know how I:

How do you measure the improvements you get from UX testing within your organisation? Let us know in the comments, or start a conversation on Twitter.


Timi is a London-based copywriter and full-time marketing sceptic – there are now more unvalidated opinions out there than ever.

He became a UX testing enthusiast after seeing its power while working at TUI – the world’s largest travel, leisure and tourism company. He then joined WhatUsersDo to sharpen his UX knowledge and work side-by-side with the field’s best and brightest.
