Why You Should Test Your Next Marketing Video
February 4, 2016
Testing your marketing videos empowers you to make better decisions and unlock conversion potential. Allow me to explain.
Picture this scenario: you recently published a new product video on your website. You spent three weeks designing, producing, and editing it. Your creative team crushed it. It holds your attention, it clearly explains your feature differentiation, and it’s fun.
Awesome! Some marketing collateral that shows off your brand’s personality. But suddenly, your signup rate plunges. Shoot. Is this a coincidence, or was it the video?
Video is a powerful conversion tool. It humanizes your brand, and it compels your viewers to take action. But publishing new videos on your site without understanding their impact is not the way to go. You might accidentally shrink the numbers you meant to grow. Even worse, you might not even know it.
“Publishing new videos on your site without understanding their impact is not the way to go.”
You wouldn’t make a major update to a page on your website without testing its impact on conversion, would you? I hope you’re shaking your head no.
Similarly, you shouldn’t publish an important video on your website without testing it. You need to A/B test your videos, especially those on high-traffic pages.
You will be wrong (and you should expect it)
As long as you’re testing…
Here’s a real-life example of something that happened a few weeks ago at Wistia:
We set a goal to increase the number of new users who upload a video during our onboarding flow. Video is a big part of our user onboarding strategy, so naturally, we designed a new video to help accomplish this.
After putting a lot of thought into the video design, we wanted to know whether it made a positive impact on our upload rate. We decided to test the new video against the previous version.
Boy, were we wrong. Although some of our assumptions were correct, the test outcome was not what we expected. Every single metric we were tracking went down. Our video experiment failed. By a lot.
But hey, at least we knew.
If we had implemented the new video without testing it first, that decision would have cost the business thousands of dollars in lost revenue in the coming months. Not a good move for any internet marketer interested in career growth. You’ll be glad that you tested your assumptions instead of blindly making decisions that hurt the business (and potentially your career).
If you’re someone who makes lots of gut decisions that are never wrong, we’d love to have you join the marketing team at Wistia! Kidding. Kidding. If you’re never wrong, it probably means you’re not being aggressive enough and should take more chances.
Develop a system to find the right answers
That system is testing.
If you have data from your videos, you’d be crazy not to be testing. Imagine all the opportunities you might miss, and all the wrong decisions you could avoid.
“If you have data from your videos, you’d be crazy not to be testing.”
To start, identify your main success metric, then write down a few ideas that could help you improve it.
1. Choose a success metric. In the onboarding video test I mentioned above, our main success metric was how many new users uploaded a video during onboarding.
2. Brainstorm a bunch of good ideas. We started our testing process by brainstorming ways to use video to increase that number.
3. Rank your ideas. Our team uses an ICE Score, made popular by Sean Ellis of growthhackers.com, to prioritize our ideas, but you can prioritize in other ways if you prefer.
Here’s how Sean describes the ICE Score:
"Score every new idea based on its potential Impact, your Confidence that it will be successful, and how Easy it is to implement. In other words, an ICE score. Once the ideas are scored, it’s easy to sort them and find the best ideas in the areas to prioritize first."
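If it helps to see the ranking step concretely, here’s a minimal sketch of ICE scoring in Python. The ideas and scores are made up for illustration, and teams vary on whether they multiply or average the three numbers; what matters is scoring consistently.

```python
# Rank hypothetical video-test ideas by ICE score (Impact x Confidence x Ease).
# All ideas and scores below are invented for illustration.
ideas = [
    {"idea": "Shorten the onboarding video", "impact": 8, "confidence": 7, "ease": 6},
    {"idea": "Change the video thumbnail", "impact": 6, "confidence": 8, "ease": 9},
    {"idea": "Add a call-to-action overlay", "impact": 7, "confidence": 5, "ease": 4},
]

for idea in ideas:
    idea["ice"] = idea["impact"] * idea["confidence"] * idea["ease"]

# Highest ICE score first: these are the tests to run first.
for idea in sorted(ideas, key=lambda i: i["ice"], reverse=True):
    print(f"{idea['ice']:>4}  {idea['idea']}")
```

Sorting by the score gives you a defensible queue of experiments instead of a debate about whose idea sounds best.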
Now, you’re ready to cherry-pick the best tests to improve your main metric.
Test and measure (rigorously)
Make sure you’re making the right decisions based on data, not your gut.
There’s nothing worse than spending a lot of time and effort on a new video that hurts your important metrics… except maybe doing so over and over again, without knowing. That’s why you need to be sure to test and analyze your data.
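"Rigorously" means checking whether a difference between two videos is likely to be real rather than noise. Here’s a minimal sketch of a two-proportion z-test using only Python’s standard library; the viewer and upload counts are hypothetical, not Wistia’s actual data.

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: 100 uploads from 1,000 viewers of the control video,
# 150 uploads from 1,000 viewers of the variant.
z = two_proportion_z(100, 1000, 150, 1000)
# |z| >= 1.96 corresponds to roughly 95% confidence the difference is real.
print(f"z = {z:.2f}, significant = {abs(z) >= 1.96}")
```

If the z-score falls short of the threshold, keep the test running or call it inconclusive; shipping the "winner" of a noisy test is just a gut decision with extra steps.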
Use a basic experiment framework to guide you
In the paragraphs above, we talked about a previous video test that failed. It’s a bummer, but it happens all the time. Here’s the TL;DR: the first video we tried was a longer, more straightforward overview of Wistia’s feature set. The second video we tested against it was shorter and punchier. The shorter video failed the test.
We still thought we could increase our upload rate, so we decided to go back to the drawing board to try a different approach.
Develop a new hypothesis from previous test data
We hypothesized that we could increase our new user upload rate by editing the original, longer demo video (our control). We believed making two simple changes would make a big impact:
- Changing the video thumbnail would increase our play rate.
- Shortening the video would increase the video’s engagement rate.
Our new users are excited to use Wistia. It turns out that they’re even more likely to use it when we show them its core value right before they sign in. We wanted to highlight that, and decrease the amount of time it took to deliver Wistia’s value propositions.
We were confident that these two changes would produce a positive lift in our upload rate.
In case you’re curious, our original onboarding video (the control) is on the left, and the variant is on the right:
Win (with test data)
With the right system, rigorous testing, and helpful data, you will find wins (and avoid catastrophes). You will be wrong more than you’ll be right (and you’re expecting that). But by gathering data, you’ll be making the best decisions in one of your most impactful channels — video.
“You will be wrong more than you’ll be right (and you’re expecting that).”
Here are the actual results from the test we just ran.
The data shows that our upload rate increased. Go team! Though it wasn’t a dramatic increase, it was enough for us to implement the new video. The best part was that we could be confident in our decision, because the data backed it up.
Video testing for everyone
If you’re not testing your videos, you’re wasting one of your most influential communication tools and potentially making bad decisions that could hurt your business. If you’re new to testing, don’t be afraid. Start by identifying your main metric. Then, brainstorm video ideas that can increase that metric. Organize the ideas to optimize your time. Implement, test, and measure.
Now go high five the person sitting next to you. Success is a few tests away.