CRO Testing Methods Beyond A/B Testing

Reshaping the CRO Testing Conversation

If I say CRO testing, what is the first thing that pops into your head? A/B testing, right? That might even be the only thing that comes to mind, but there is so much more to conversion rate optimization than A/B testing.

Conversion rate optimization is the process of increasing the percentage of visitors to your site who complete a desired action. How you do that can vary. If A/B testing is the best path to achieving that, then great, but it isn’t the only way; anything you do to improve your conversion rate can be considered conversion rate optimization. One size does not fit all when it comes to CRO testing.

The A/B Testing Conundrum

When UpBuild has CRO conversations with the companies we work with, the conversation tends to focus on A/B testing, at least from the client’s perspective. It makes sense: A/B testing is the cornerstone of CRO in most people’s minds. It can also be a great method for ensuring your changes won’t cause harm to your conversion rate and, hopefully, improve it, but only if your situation makes A/B testing viable. Ultimately, though, clients really just want more conversions; A/B tests are just one path to that goal.

So, why might A/B testing not always be a viable or useful solution for companies trying to increase their conversions? The first, and most common, situation we run into is that many sites don’t get enough traffic to conduct a test in a reasonable amount of time. For example, according to Visual Website Optimizer, if we want to run an A/B test with two variations (control and one variant), and we have an average of 25,000 monthly visitors, a conversion rate of 5%, and an MDE (minimum detectable effect) of 5%, then we will need to run our A/B test for 34 weeks. THIRTY-FOUR WEEKS.
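To make the arithmetic concrete, here is a rough sketch of the textbook two-proportion sample-size calculation (two-tailed, 95% confidence, 80% power). Every tool bakes in slightly different assumptions, so the exact number of weeks won’t match VWO’s 34 — the order of magnitude is the point:

```python
from statistics import NormalDist

def ab_test_duration_weeks(monthly_visitors, baseline_cr, relative_mde,
                           alpha=0.05, power=0.80, variants=2):
    """Rough weeks needed for a two-variant A/B test (two-tailed z-test)."""
    p1 = baseline_cr
    p2 = baseline_cr * (1 + relative_mde)  # a 5% CR lifted by a 5% MDE -> 5.25%
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    # Visitors needed per variant for the desired power at the given MDE
    n_per_variant = ((z_alpha + z_beta) ** 2
                     * (p1 * (1 - p1) + p2 * (1 - p2))) / (p2 - p1) ** 2
    weekly_visitors = monthly_visitors * 12 / 52
    return (n_per_variant * variants) / weekly_visitors

weeks = ab_test_duration_weeks(25_000, 0.05, 0.05)
print(f"{weeks:.0f} weeks")  # ~42 weeks under this formula's assumptions
```

Shrink the MDE or the traffic and the duration balloons even further; that sensitivity is exactly why low-traffic sites struggle to run honest A/B tests.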

Setting up an A/B test in VWO.

That is nearly nine months for a single test. Clients aren’t usually thrilled about running a single test for nine months. They need results faster, which is understandable. But A/B testing needs to be a science, and if we can’t get a statistically significant result, should we be running the test at all? In some cases, yes, you can get useful results even when running tests for shorter periods than recommended, but in other cases, your data could be faulty. If you’re not following the guidelines for correct A/B testing, you’re basing your decision on whatever data you have, plus faith. And if you’re going with your gut anyway, what’s the point of running a test at all?

Many companies also want to run multiple A/B tests at once, which presents a problem: we end up with pages where one A/B test could be affecting another’s results. For example, if we’re running one test with a new headline and another with a button color change on the same page, what happens when a user sees the new headline alongside the new button color? How do we know whether the button color or the new headline affected their choice?

There is also a cost to testing beyond the tools you’re paying for: the cost of not making changes to your site. In this specific instance, you’re potentially losing money for nine months while you wait for your test to finish. And even after those nine months, there is a chance that the variant you were hoping would perform better doesn’t, and now you’re back at square one.

Finally, we see attempts to run A/B tests while other factors on the site are changing, like a redesign or rebranding. It’s very hard to get a conclusive result when you’re testing something while other elements on the page are constantly changing. How do we know what elements affected that test result?

While there can be a lot of problems with A/B testing, it can be the best way to ensure your changes don’t do more harm than good compared with the current version of your site. However, if you fall into one of the situations laid out above, here are some other things you can do to improve your conversion rate.

Common Sense Updates

There are often issues with a website’s conversion funnel that negatively impact conversions but don’t require lengthy testing to confirm. These are things your designer or UX consultant would hopefully point out when you first build or redesign a site. They’re best practices.

For example, if a given page’s main conversion point is buried on the page or is too hard to find, we don’t necessarily need to set up an A/B test to prove that users can’t find where they’re supposed to convert. We probably have a low conversion rate and a low number of goal completions to support our hunch — plus, we have the common sense and experience to conclude that we need to do a better job of highlighting our conversion point.

Let’s take another really common scenario that most people instantly run to A/B testing to solve: button color. Button color is the classic example of a change people might A/B test. It’s easy to test, and nearly every website has a button somewhere that can be tested. But do you always need to run a costly (in both time and resources) A/B test to know that a blue button on a blue page might not perform as well as a button in any other color? Probably not.

It should be noted that common sense updates come with a lot of risk. You could severely hurt your conversion rate if you make the wrong choice. Remember, we’re working with very little data specific to this change, so we’re going off of gut instinct and best practices, which could be wrong for your site. We wouldn’t recommend making common sense changes to major conversion points unless the change is relatively easy to roll back in case it isn’t effective.

So, if you don’t have the traffic to support a legitimate A/B test, then use your common sense, plus some research into conversion funnel best practices. You can’t use common sense for every CRO task, but there are plenty of updates you can make that don’t require a test.

Use Your Analytics Resources to Help You Make Decisions without Testing

Google Analytics has lots of reports that can give you insights into how your various conversion funnels are doing. Leveraging this data can help you make decisions that don’t require testing.

For example, let’s say we have a “Request a Demo” conversion point that is woefully underperforming; its conversion rate is 0.01%. Do we need to run a test comparing an updated conversion point to the original one? Sure, the new one could perform worse than the original, but with results that bad, we can probably update it without expecting worse. And if it does perform worse, it won’t take us 34 weeks to see that.

Look for sharp drops in performance when updating elements — there’s your test data.

The low conversion rate of this conversion point isn’t the only takeaway here. With a conversion performing that badly, there are probably problems with the entire conversion funnel. Maybe we set incorrect expectations on a previous page in the funnel, or we’re driving the wrong types of visitors to this page in the first place. That, too, is a conversion rate optimization opportunity that doesn’t necessarily require A/B testing. We should not only focus on the final conversion goal but look back through all the steps in the process. Remember, conversions don’t operate in a vacuum, for the most part; they can be affected by many other factors.

A/B tests tend to focus on the last step in the conversion funnel, but what about the multiple steps it takes to get there? You can use the Behavior Flow report in Google Analytics to see if users are falling out of the conversion funnel before even having a chance to convert.
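A quick way to see this outside any tool is to lay out the visitor counts at each funnel step and compute the continuation rate between them. The step names and counts below are invented for illustration, standing in for numbers you might export from a Google Analytics funnel or Behavior Flow report:

```python
# Hypothetical step counts, e.g. exported from an analytics funnel report
funnel = [
    ("Landing page", 10_000),
    ("Pricing page", 2_400),
    ("Demo form viewed", 300),
    ("Demo requested", 12),
]

# Continuation rate between each pair of adjacent steps
for (step, count), (next_step, next_count) in zip(funnel, funnel[1:]):
    rate = next_count / count
    print(f"{step} -> {next_step}: {rate:.1%} continue")
```

In this made-up funnel, the steepest drop is between the pricing page and the demo form, two steps before the final goal; an A/B test on the “Demo requested” step alone would never touch it.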

Imagine: you run an A/B test on your goal, which is the final step in the funnel, but even with testing, you don’t see an improvement because there is a flaw in an earlier step in the conversion funnel.

Pay attention to previous steps in your funnel. A/B testing the final step might not produce results if your previous steps are flawed.

This is when it’s time to take a step back from the idea that A/B testing is the only CRO tool you have and look at the entire picture. Maybe you’ll decide the previous steps in your funnel are the problem, rather than the final conversion. This could lead you to run a different kind of CRO test (like a URL Redirect test to see if a different page in the funnel provides better results), or to notice something glaring, like a broken link or uninspiring content that wouldn’t require testing to fix.

Leveraging UX Experts and Consultants

In some cases, it makes sense to take CRO testing off your plate and pass it onto an expert. A UX or CRO expert will be able to guide you through the process of optimizing your conversion funnel. They will be able to take your data, decide on the best path to reach your goals and then implement it. A/B testing could certainly be something they recommend, but it will probably be just one piece of the puzzle.

Get Direct Feedback From Users

Try asking your users what their pain points are with your site. Collecting direct feedback can either remove the need to test or give you a better idea of what to test.

An often-neglected method of conversion rate optimization is directly asking your users for feedback on their interactions. This can be done in a few ways, including user session recordings where you watch a user use your website and then can get some takeaways from how they interact with it. Another way is to use on-page surveys. You’ve probably all seen an on-page survey: they tend to pop up in the middle or a corner of the screen and ask you a question about your experience. This is another tool you have in your CRO toolbox to get some insight into how to improve your conversion rate. It takes some clever question structuring and timing of your survey, but this is a great way to get direct feedback from actual users. You might not even realize you have a problem until you ask.

As with common sense updates, there are pitfalls to asking users for direct feedback. Often, users might not know what they want or what would have changed their minds, so your feedback could be useless or even harmful. You’d need to collect quite a bit of feedback and see some very common threads in people’s suggestions before you could make any kind of change with confidence. So, again, don’t rely on this method for changes to critical conversions on the site.

Where to go from here

Let me restate: A/B testing can be incredibly useful in the right situation. It helps you reduce the chances that the changes you make to your site will harm your conversion rate, while giving you data to support positive changes. A/B testing should definitely be ONE of the tools you use when conducting CRO initiatives, but don’t forget to leverage your analytics tools, experience, knowledge, and common UX sense. Your CRO testing initiatives should be a mix of the tactics, tools, and strategies that are best for your particular goal.
