While A/B testing is an everyday occurrence for us at PurpleFire, we know that this isn’t the case for everyone. Website optimisation is still in its infancy and is only beginning to gain the recognition it deserves from the wider digital industry. We often speak to potential clients and ask “Have you ever done any testing?” to which they reply “Yes, we ran some usability testing last year”. User research is absolutely the first step for any successful optimisation effort, but for many, this is where optimisation begins and ends.
Maybe you’re fed up with not being able to measure the impact of the changes you’ve made as a result of your user research, or maybe you’ve heard some of the current buzz around CRO and you’re keen to reap the rewards. Whatever the reason, running your first A/B test can be daunting. Here are some pointers to help ensure your optimisation efforts are a success right from the start.
Conversion Optimisation Maturity
If you’ve attended a CRO conference or done any reading on the subject, chances are you’ll have heard about the concept of maturity when it comes to businesses’ CRO capabilities. These range from those who have good intentions but none of the necessary tools in place to make optimisation a success, to those with a dedicated optimisation team. Take time to assess your current position and the factors playing for and against your optimisation success by considering the following:
Tools & Technology
Simply having a Google Analytics account isn’t enough to say you make data-driven decisions in your business. Ensure you’re making the most of your analytics package by checking its configuration and setting up additional relevant features, such as events or goals. This way, when it comes to running on-site experiments, you can integrate your CRO testing tool with your analytics and run additional data analysis, learning more about how the test variation impacted user behaviour.
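As a rough illustration, recording a custom event with gtag.js boils down to pushing a command onto the page’s dataLayer array, which the analytics library drains asynchronously. The sketch below is a simplified model of that mechanism; the event and parameter names are our own illustrative choices, not an official GA schema:

```javascript
// Minimal sketch of recording an A/B test exposure as an analytics event.
// Assumes the standard gtag.js snippet is installed on the page; in real
// gtag.js the global gtag() function does this push for you.
function trackEvent(dataLayer, eventName, params) {
  // Commands are queued on the dataLayer array and picked up
  // asynchronously by the analytics library.
  dataLayer.push(['event', eventName, params]);
  return dataLayer;
}

// Usage: record that a visitor was bucketed into variation B.
const dataLayer = [];
trackEvent(dataLayer, 'ab_test_exposure', {
  experiment_id: 'homepage_cta', // hypothetical experiment name
  variant: 'B',
});
```

Once events like this are flowing, segmenting behaviour by variant in your analytics reports becomes straightforward.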
Skills & Resource
What skills and resource do you have to facilitate optimisation? If you’re going to tackle optimisation in-house, you’ll need both strategic and technical resource, as well as someone who owns the project as part of their day-to-day responsibilities. Ideally, your optimisation team should include a range of roles: data analysts, user researchers, designers and developers.
Process
When looking to start A/B testing, it’s not advisable to go full steam ahead without a plan. This includes how you’ll come up with test ideas, how you’ll decide what to test first, how you’ll measure success and how you’ll implement changes on a permanent basis. Here at PurpleFire, we have our 6 steps to ensure we stick to the plan for every test, keeping things clear and consistent for our clients.
Culture
When running A/B tests, it is essential to have the support of the wider business so you can see the biggest impact from testing. If your test ideas are pushed back due to internal politics, for example, your business is not making the most of the opportunity. To build a test-and-learn culture, you need an open-minded, innovative and knowledge-hungry team around you. Sharing test results and highlighting their wider business impact is a really powerful way of gaining support and buy-in business-wide.
Iterative vs. Innovative
Once you’ve determined that you are in a suitable position to begin running A/B tests, you need to start thinking about what to test. Chances are you’ll have some internal ideas kicking around, but before testing them, define why you’re testing them. One of the benefits of A/B testing is that it removes subjectivity, so don’t be bullied into changing the colour of your call-to-action buttons just because someone in your business thinks you should. Every test idea should have a clear rationale drawn from analytics (now that you’re getting the most out of your GA account!) or user research findings; ideally, it should be supported by multiple research sources.
A/B testing can be grouped into two types: iterative and innovative. Iterative tests tend to be ongoing, smaller changes and tweaks to your site, such as changes to copy or imagery. Innovative tests are larger-scale tests, which could be anything from the implementation of a new feature or piece of functionality to a full-page or even whole-site redesign. While innovative tests have more potential impact, they also carry more risk, which is why it’s advisable to start by running iterative tests. These will allow you to perfect your process and begin gathering learnings and insights about your visitors, which can then feed into future innovative tests.
Although it’s best to start on a small scale, you still want to see impressive results from your first few tests. This will help drum up excitement and support for optimisation in your business, so don’t shy away from running tests on key pages of your site; high traffic pages such as top landing pages or high value pages such as those within checkout. Focus initially on the areas with the most potential impact and don’t be scared to make bold changes if you have the rationale to back them up.
Once you’ve configured your first test within your testing tool, the final step is to conduct cross-browser and cross-device QA testing. The most effective way to conduct QA is to view the test in a live environment, but isolated to you, so no ‘real’ visitors see the variation before it’s ready. There are a number of ways to do this. Optimizely recommends setting a test cookie on your computer, which can be used in the targeting settings to prevent anyone seeing the experiment unless they have the cookie. You can then view the variation in a live environment and ensure it looks and works as it should. Another useful approach is to use an IP address filter within your experiment, so you can put the test live but only machines within your business will be included. When conducting QA, make sure you interact with any functionality that may be affected by the changes and test every possibility. For example, if the test includes a form, ensure you’ve considered error messaging, and remember to check that all of your goals are firing correctly.
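To make the cookie approach more concrete, here is a minimal sketch of what the targeting logic amounts to. The cookie name is illustrative, and in practice the audience condition is configured inside your testing tool rather than hand-coded:

```javascript
// Minimal sketch of cookie-based QA targeting (cookie name is illustrative).

// Build the cookie string to set on a QA machine, e.g. from the browser
// console: document.cookie = buildQaCookie('qa_preview', 7);
function buildQaCookie(name, days) {
  const expires = new Date(Date.now() + days * 864e5).toUTCString(); // 864e5 ms = 1 day
  return `${name}=1; expires=${expires}; path=/`;
}

// The audience condition: only visitors carrying the QA cookie
// (i.e. you and your team) are entered into the draft experiment.
function hasQaCookie(cookieString, name) {
  return cookieString
    .split(';')
    .some(part => part.trim().startsWith(`${name}=`));
}
```

In your testing tool, the equivalent check would be a “cookie exists” audience condition evaluated against `document.cookie`, so the experiment stays invisible to real visitors until QA is complete.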
- Ensure you have the right building blocks in place to start optimising through A/B testing. If you don’t think your business is ready, focus your efforts on getting it ready before you think about testing.
- Test hypotheses that are supported by multiple sources of data and research, rather than those supported by personal opinion or gut instinct.
- Rome wasn’t built in a day; start with smaller but impactful changes on key pages to get to grips with your process and learn about your visitors.
- Be rigorous with QA testing to give you confidence in the user experience of the variation and your results.
You can also assess the maturity of your conversion optimisation efforts through Paul Rouke’s piece for Econsultancy.