Do you have lots of ideas about what could be changed on your site, but no real way of knowing what will give the best results? Then A/B testing is a good place to start.
Doing an A/B test is not that difficult. What is a bit tricky, however, is deciding what you should test and why.
Things you can A/B test include headlines, CTA color, email subject lines, or a long form vs. a short form.
A common rule of thumb is that you need a couple of hundred conversions per month to run a meaningful A/B test; the sketch below shows roughly why.
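As a rough illustration of why volume matters, here is a minimal Python sketch of the standard two-proportion sample-size formula. The 3% baseline rate, 4% target rate, 95% confidence, and 80% power are hypothetical example values, not figures from any particular tool:

```python
from math import ceil, sqrt
from statistics import NormalDist

def visitors_per_variant(p1: float, p2: float,
                         alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate visitors needed per variant to detect a change
    in conversion rate from p1 to p2 with a two-sided z-test."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # 1.96 for 95% confidence
    z_beta = z.inv_cdf(power)           # 0.84 for 80% power
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p1 - p2) ** 2)
    return ceil(n)

# Detecting a lift from a 3% to a 4% conversion rate needs roughly
# 5,300 visitors per variant, i.e. a couple of hundred conversions each.
print(visitors_per_variant(0.03, 0.04))
```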
Statistical significance is about understanding the impact of chance on your result. For example, if you have chosen a 95% confidence level and you have a winner in your experiment, you can be fairly confident that the result was not caused by chance: if there were actually no difference between the two variants, a result this extreme would show up less than 5% of the time. If you have too few conversions, you will not be able to get a statistically significant result.
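If you want to check significance yourself, here is a minimal sketch of a two-proportion z-test in Python; the conversion counts in the example are made up:

```python
from math import sqrt
from statistics import NormalDist

def ab_test_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under "no difference"
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# 400 vs. 470 conversions out of 10,000 visitors each:
print(f"p-value: {ab_test_p_value(400, 10_000, 470, 10_000):.3f}")  # ~0.015
```

A p-value below 0.05 is what "significant at a 95% confidence level" means in practice.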
If you test several things at once, the risk is high that you won't know which of your changes affected the result. That said, you can still make several changes at once. Confusing? 🙃 Let me explain.
If your changes are expected to affect the user's behavior in the same way, you can test them all in the same test. For example, you can test changing the position of several images on the same page, but you should not change the headline AND add a CTA on the same page.
Testing one change at a time can give you better insight into the effect of each change.
A hypothesis should be based on the problem you have identified and what you expect the test to lead to. For a form, it could be formulated like this:
"We have seen a high drop-off rate (70%) from our form, and therefore we want to do an A/B test that involves removing two fields from the form. This will give us a lower drop-off rate, which will lead to more signups."