Successful onboarding creates more active users, and more active users means more sales and revenue for the business. So recently, we’ve been hyper-focused on increasing our new user activation rate by making Wistia’s onboarding flow more effective. Video is a big part of that strategy.
In past blog posts, we’ve talked about why we like using video to onboard new customers. We like it for a few reasons. We like that it humanizes a very impersonal digital experience. We like that it accelerates learning and comprehension. And, most importantly, we like that it works.
But, like many of you, we don’t always know the best videos to show to our prospects at key moments of the user journey. That’s why we test lots of different ideas and let the data guide us towards the right answers. Sometimes, it takes a few tries to get things right, so we run a lot of tests.
“We test lots of different ideas and let the data guide us towards the right answers.”
Recently, we ran a series of video A/B tests in our onboarding flow, and we learned some useful lessons that we couldn’t wait to share with our audience.
Let’s start at the beginning of our favorite test so far...
After completing our signup form and creating a Wistia account, our new users are shown a welcome screen with two options to get started:
- Upload their video
- Borrow a loaner video from us
Either option brings new users into their account, so they can start exploring all that Wistia has to offer. Though, if we’re being honest, we’d much rather have them upload their own videos to get up and running right away.
But, we realized that some people might not have a video file handy. We wanted them to be able to see Wistia’s features in action, so we created a loaner video that they could use. Exploring Wistia’s features with our loaner video is like viewing an open house with staged furniture. New users can begin to visualize what the place could look like with their own stuff.
They can match the video player color to their company’s branding, add video marketing tools (Turnstile email collectors, calls to action, and annotation links), and enable captions and social sharing options.
We assumed the loaner video content didn’t really matter, since its job was to let folks explore how the product works. It was just a vehicle (or so we thought).
About 20% of our new signups borrow a video during their onboarding. Here’s what their Wistia media page used to look like after borrowing a video from us:
The video they borrowed was a cute clip of Lenny (our office dog and unofficial mascot) running around with a VCR tape in his mouth. Everyone loves cute. Everyone loves Lenny. So that’s the video we chose.
We’ve embedded it below so you can watch for yourself.
We figured adding a loaner video would allow more new users to explore our customization features, but we didn’t expect so many people to actually play and watch the video.
We use a tool called FullStory to watch users interact with our software. It’s a useful way we gather feedback. We get to see areas where they get stuck, weird bugs they find, confusing user flows, etc. Watching users and taking notes is a great way to get information that people typically don’t share in written surveys.
While watching some FullStory user sessions, we noticed our users were playing the loaner video. A lot, actually. We checked the video’s engagement data and discovered new users not only played the video, they watched almost the entire thing!
We had a lightbulb moment. There was a huge opportunity to increase our activation rate on the media page. It was sitting right in front of us.
“There was a huge opportunity to increase our activation rate on the media page.”
Our hypothesis was that using a product-related loaner video instead of a Lenny video would increase the percentage of users that took action on the media page, but we weren’t sure what that video should be.
So, we grabbed our video team and brainstormed a bunch of different video concepts we could use for this test.
The brainstorm ideas included:
- "How to" tutorial
- Product overview
- Personalized welcome message
- Account details walkthrough
- Motivational content
- Data-driven content
We picked the idea we thought would have the most impact—a "how to" video focused on our media page. Since our new users would be watching the video with the Customization panel open, we thought this video concept would increase engagement the most.
The video team created a script the next day. Two days later, they created a sweet new video for us to test. It was fun and informational, and it ran through the most common customization features in less than 90 seconds.
We were really excited to see how effective the new video would be. Our engineering team helped us divide our new signups into a 50/50 split. 50% of our new users that borrowed a video received the Lenny video, and the other 50% received the "how to" video above.
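For the curious, a 50/50 split like this is often implemented by hashing each user’s ID so the same user always lands in the same bucket. This is just a hypothetical sketch of that approach (Wistia hasn’t published their implementation, and the variant names here are made up):

```python
import hashlib

def assign_variant(user_id: str) -> str:
    """Deterministically bucket a user into one of two test variants.

    Hashing the user ID (rather than flipping a coin per page load)
    keeps the split roughly 50/50 while guaranteeing a given user
    always sees the same loaner video.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return "lenny" if int(digest, 16) % 2 == 0 else "how_to"

# The same user always gets the same variant:
assert assign_variant("user-123") == assign_variant("user-123")
```

The key design choice is determinism: a user who refreshes the page or signs in twice shouldn’t bounce between the two videos, or the engagement data gets muddy.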
Our primary success metric was the percentage of new users that used the customization features.
We wanted to make sure our results were statistically valid, so we launched the test, then kicked back and waited a few weeks for the data to roll in. We tried not to peek. Not peeking is hard! But checking early and stopping the moment the numbers look good inflates the odds of a false positive, so we held out.
After a month, we pulled the data. Our new video had increased our main metric by 30%. Holy smokes. We were thrilled. We ran the numbers through a statistical significance calculator and shared high-fives when we confirmed the result was statistically significant.
“Our new video had increased our main metric by 30%. Holy smokes.”
30%. Our team was pumped.
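A significance calculator for a test like this typically runs a two-proportion z-test under the hood. Here’s a small sketch of that math with purely illustrative numbers (not Wistia’s actual data), assuming 1,000 users per variant and a 20% vs. 26% conversion rate, which mirrors the 30% relative lift:

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test: is the difference between two conversion
    rates bigger than chance would explain?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pool the rates under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative only: 200/1000 (control) vs 260/1000 (new video)
z, p = two_proportion_z_test(conv_a=200, n_a=1000, conv_b=260, n_b=1000)
```

With these made-up numbers the p-value comes out well under the usual 0.05 threshold, which is the kind of check that earns the high-fives.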
Creating a winning test is great. But understanding why it was successful is even better. When you understand why, you can do more of what works. So we thought about why this test was so successful, and we realized a few things:
- There is no substitute for getting close to your users and understanding their pain points.
- Never assume what you have today is finished.
- Teaching users how to do seemingly straightforward things may seem boring, but it can be extremely effective.
- Sometimes context informs content.
We have written about the value of A/B testing with video before. What do you think? Are you testing with video? We would love to hear your stories in the comments below!