Why You Should Get Comfortable with Iterative A/B Testing

Could iteration be the solution to your A/B testing woes?

October 4, 2018


Andrew Capland

Marketing


It wasn’t all that long ago that A/B testing was a task reserved for the most technical and analytical of marketers. Within the past five years, however, A/B testing has practically exploded in the marketing world. But don’t just take my word for it — when looking at A/B testing-related searches using Google Trends, you can see the dramatic and steady search volume increase starting back in 2013:

As an analytical marketing nerd myself (not to mention one who’s done a ton of A/B testing), it makes me really excited to see the industry get on board with this trend. Marketers want to know whether or not the campaigns they’re running are successful and if their work has a serious impact on key company goals — and they use A/B testing to help answer those questions.

While it appears that marketers are investing in A/B testing more than ever before, there is one aspect they fail to focus on time and time again, and that’s iteration. When it comes to setting up an effective A/B test, iteration is the crucial last step that often gets skipped.

In this post, I’ll walk you through how the A/B testing landscape is changing, how you can put a greater focus on the iteration component of your process, and ultimately, how we’ve been able to see success with an iterative mindset here at Wistia. Let’s jump in!

A/B testing is fundamentally changing

In order to get a better understanding of where A/B testing stands today, let’s look back at how marketers have been conducting these tests historically.

Conversion-minded marketers were some of the first to run A/B tests on landing page designs. From using different language on the page and changing up the CTA copy, to experimenting with form location, the goal was simple: to improve conversion rates.

Here at Wistia, for example, we ran a test to see if the design of a landing page could impact sign-ups. After testing some of our landing pages, we found that the design did affect sign-up rates. So, we spent a lot of time thinking about how we could apply similar tests to pages with even higher impact. We then began testing different designs on our website on some really important pages. Here’s an example of two product page tests we ran back in 2016:

Today, however, we’re finding that making small tweaks and comparing results only gets you so far. If you really want to move the needle on a goal, you need to take bigger risks with the tests you run. In 2018, marketers (including us) are no longer making minor tweaks to one variable on a page — instead, we’re experimenting with the overall user experience by running multivariate tests. Why, you ask? Well, like I said earlier, if you want big results, you have to think bigger with the tests you want to run. And that’s where the iterative part of this puzzle comes into play.
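To make that concrete, here’s a rough sketch of what a multivariate setup can look like under the hood. This isn’t our actual implementation (the page elements, option names, and the assign_variant function are made up for illustration), but it shows the core idea: every visitor gets deterministically bucketed into one combination of options, so you’re comparing whole experiences rather than a single tweak.

```python
import hashlib

# Hypothetical 2 x 2 multivariate test: two hero layouts times two CTA lines
# gives four full-page experiences to compare, not just one small tweak.
OPTIONS = {
    "hero": ["video_hero", "static_hero"],
    "cta": ["Start free", "See it in action"],
}

def assign_variant(visitor_id: str) -> dict:
    """Deterministically bucket a visitor into one combination of options."""
    combo = {}
    for element, choices in OPTIONS.items():
        # Salt the hash with the element name so assignments for different
        # elements are effectively independent of each other.
        digest = hashlib.sha256(f"{element}:{visitor_id}".encode()).hexdigest()
        combo[element] = choices[int(digest, 16) % len(choices)]
    return combo

print(assign_variant("visitor-1234"))  # the same visitor always sees the same combo
```

Hashing the visitor ID (instead of picking at random on every page load) keeps the experience consistent for returning visitors, which matters when the thing you’re testing is the whole experience.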

Iteration becomes the new focus

Putting process behind your testing efforts can lead to serious business impact — and you don’t want to leave iteration out of that process. The image below outlines all of the steps required to run a successful A/B test. (Plus, it includes my Bitmoji, which is essential.)

Keep in mind that when you run a big multivariate experiment, it’s pretty rare that you’re going to get everything perfectly right. So, plan from the beginning to collect some learnings and then iterate. Even if it’s not perfect, this process is effective. I believe that most marketers aren’t iterating enough. Why? Because iterating isn’t as fun as testing.

“Keep in mind that when you run a big multivariate experiment, it’s pretty rare that you’re going to get everything perfectly right.”

Some of our most valuable A/B tests here at Wistia provided learnings that led to future iterations, which ultimately made our projects that much more successful. Let’s take a look at how iteration helped us run one of our most successful A/B tests to date.

Running a successful, iterative A/B test

Last summer, our team was hyper-focused on increasing our new user activation rates. We were running tons of A/B tests on the website, in the product, and through email — all with the same goal of getting more new users activated.

During a brainstorm, a member of our team wondered if we could try getting new users to start using our product before they ever created an account. Essentially, they pitched a “try it before you buy it” experience.

We were all intrigued, and after some input and debate, we settled on creating a “lite” version of the Wistia product on the website.

Testing a totally different approach

For this A/B test, we would allow users to experience our most popular features — like changing player colors, adding custom forms, and password protecting their videos — before we prompted them to create their own accounts.

The control version of the page looked like this:

And the new experience looked like this:

Though it’s hard to tell from this screenshot, we created an entirely interactive experience for our website visitors by letting them play with the core functionality of our product right there on the page — no credit card required. This was not a typical A/B test.

Measuring and iterating on the test

After launching the new experience, we kept an eye on the activation rates of new users that created accounts from this page. A week into the experiment, our numbers hadn’t changed.
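As a quick aside, “hadn’t changed” is something you can sanity-check with a little math rather than just eyeballing a dashboard. Here’s a minimal sketch (the numbers and the function name are hypothetical, not our actual tooling) of a two-proportion z-test that asks whether the gap between the control’s and the variant’s activation rates is big enough to trust, or plausibly just noise:

```python
from math import erf, sqrt

def activation_rate_check(control_users, control_activated,
                          variant_users, variant_activated):
    """Two-proportion z-test: is the variant's activation rate meaningfully
    different from the control's, or is the gap plausibly just noise?"""
    p1 = control_activated / control_users
    p2 = variant_activated / variant_users
    pooled = (control_activated + variant_activated) / (control_users + variant_users)
    se = sqrt(pooled * (1 - pooled) * (1 / control_users + 1 / variant_users))
    z = (p2 - p1) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return p1, p2, z, p_value

# Hypothetical numbers: after a week, the two rates barely differ.
print(activation_rate_check(control_users=1200, control_activated=180,
                            variant_users=1180, variant_activated=186))
```

With numbers like these, the p-value lands well above any reasonable significance threshold, which is exactly the “nothing’s changed” situation we found ourselves in.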

We wanted to know why that was the case, so we decided to get some user input to help paint a better picture. I’ve mentioned this before, but understanding how users interact with your tests is crucial to your success. Our whole team crammed into a room during lunch and watched 30 users interact with the new page experience.

What we found was that many of the users were not even active participants in the experience. Since we removed all of the other content on the page to minimize distractions, it was clear our users didn’t understand our intent, and likely thought the page was just an image.

This was maddening and exciting at the same time.

Based on what we learned, we decided to iterate on the design and actually highlight how users could interact with the page. We made certain text larger and added subtle highlights to improve the UX and ensure users saw the message. Almost immediately, our activation rates started to increase.

Moving forward with our results

We finally felt like we were making some progress, so we decided to go ahead and do more user testing with the hopes of iterating even further. We asked participants, “What would stop you from creating an account right now?” and learned that many of them wanted additional context about the product. Yes, they liked interacting with the product, but they still wanted to read about the additional functionality.

“Yes, they liked interacting with the product, but they still wanted to read about the additional functionality.”

When we made the page entirely interactive, we thought we were removing friction and eliminating distractions. But, we learned that we were actually preventing our users from gleaning valuable information that was necessary in order for them to make a decision. So, we added our product information below the interactive player and watched our activation rates soar. After one final round of user testing, we decided to add a link to our pricing page.

In total, we made five iterations of our original experiment. Yep, five.

If we hadn’t iterated, we could’ve mistakenly moved on to the next project — calling this one a dud. Instead, iterating allowed us to help our users better understand our product and increased activation rates even more than we had originally anticipated.

Adopting an iterative A/B testing mentality

Here at Wistia, we’ve baked iteration into our core A/B testing process. We’ve adjusted our testing to make feedback and iteration part of the process right from the start, and this is especially helpful when we’re testing video.

Video, by nature, has a ton of variables, which means there’s really no such thing as running a “pure” A/B test on your video. That’s why you want to ensure that you leave room to collect feedback — and then use that feedback to react and improve the experience — with the goal of increasing your chances of helping your users along their journeys.

While it might not seem glamorous, putting process behind your testing efforts (and leaving room for iteration) can lead to some serious business impact. As marketers, we’ve grown tired of making big marketing decisions based on a simple hunch — and baking the iterative process into your A/B testing efforts will help you make more informed decisions about what bets you should be taking. Brainstorm, test, and iterate today and start improving those conversion rates!

