Thinking of making changes to your email template? Are you starting a new automated campaign and want to put your best foot forward? A/B split testing lets you test hypotheses and then choose the most compelling variation.
Split testing in the Bronto Marketing Platform can be automated, removing some of the tedious steps and giving you access to valuable testing data. I’ve put together a few tips and techniques that will soon have you testing like a pro.
Take baby steps. If you are trying your first split test, don’t over-complicate your testing plan. Start with something you can quickly test, such as the subject line, hero image or call-to-action placement. Once you’ve seen some results, you can build a more advanced test.
Don’t try to boil the ocean. It’s easy to get carried away, but don’t try to test too many things at once. The versions you are comparing should have minimal differences between them so that you know why one message outperformed the other. Change only one thing at a time and run several splits over a few deliveries.
Don’t throw out the baby with the bathwater. Do you have a standard template that you think might need a pick-me-up? Try a champion/challenger test. The champion is your standard template. The challenger is a template you’ve tweaked (remember, there should be minimal differences between the two). When you’re ready to send, send the champion to 90% of the contacts you’ve designated for this delivery. The other 10% should get the challenger. Compare the results to see which version performed best. (For this test, the percentages must add up to 100% of your delivery group.)
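To make the 90/10 arithmetic concrete, here’s a minimal sketch in Python. The function name and contact list are my own illustration, not part of any Bronto API; it simply shows how a delivery group is randomly divided so the two groups cover 100% of the contacts.

```python
import random

def champion_challenger_split(contacts, challenger_share=0.10, seed=42):
    """Randomly assign contacts to champion and challenger groups.

    Every contact lands in exactly one group, so the two shares
    always cover 100% of the delivery group.
    """
    shuffled = list(contacts)
    random.Random(seed).shuffle(shuffled)  # fixed seed for a repeatable split
    cutoff = round(len(shuffled) * challenger_share)
    challenger = shuffled[:cutoff]   # 10% get the tweaked template
    champion = shuffled[cutoff:]     # 90% get the standard template
    return champion, challenger

# Hypothetical delivery group of 1,000 contacts:
contacts = [f"contact{i}@example.com" for i in range(1000)]
champion, challenger = champion_challenger_split(contacts)
print(len(champion), len(challenger))  # 900 100
```

The random shuffle matters: if you split alphabetically or by signup date, the two groups may differ in ways that skew your comparison.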
Winner, winner, chicken dinner. Have you ever had a subject line that seemed amazing, but you didn’t see the opens you expected? Or maybe you and your colleagues disagree on the color of the call-to-action button. You can use an A/B winner split test to find the most effective choice and then automatically send it to the rest of your list.
You can create a maximum of 20 split groups, but in this example, let’s say you only want to test two versions of the subject line, so you create two groups. For A/B winner split tests, try to keep the total for your test groups under 50% of your list. This ensures that your winner group will be the largest group (e.g., if A and B are each 20% of your list, the winning email will go to 80% of your list). You determine the send time for the test groups, the winning criteria (e.g., highest open rate, highest conversion rate or highest revenue per contact) and the time the winning email will go to the rest of the list.
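The sizing rule above can be sketched in a few lines of Python (the function is a hypothetical illustration, not a platform feature). It shows why keeping the test groups under 50% guarantees the remainder, which receives the winning email, is the largest group.

```python
def winner_split_sizes(list_size, group_shares):
    """Compute test group sizes and the remainder that gets the winner.

    group_shares: fraction of the list sent to each test variation.
    Keeping the combined shares under 50% guarantees the remainder
    is larger than any single test group.
    """
    assert sum(group_shares) < 0.5, "keep test groups under 50% of the list"
    test_sizes = [round(list_size * s) for s in group_shares]
    remainder = list_size - sum(test_sizes)  # receives the winning email later
    return test_sizes, remainder

# Two subject-line variations, each sent to 20% of a 10,000-contact list:
tests, remainder = winner_split_sizes(10000, [0.20, 0.20])
print(tests, remainder)  # [2000, 2000] 6000
```

Note the 80% figure in the text counts the winning test group too: the 6,000-contact remainder plus the 2,000 contacts who already received the winning variation means the winning version reaches 8,000 contacts in all.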
What’s in a name? More than you might think. Testing the from name or address is best done on a new list or mailing. It’s not a good idea to run these tests on an established list, as the change can confuse your contacts and even cost you inbox whitelisting with some ISPs.
Don’t give up if you’re not seeing dramatic results; there’s always something to learn from your tests. For instance, if you’ve tested the placement of your call-to-action button several times but don’t see a dramatic shift in click-through rate, you’ve probably identified that placement doesn’t affect the click-through rate for your subscribers. Now you can try something different, like changing the color of the button or the button copy.
You may be surprised at how simple design tweaks, such as the order and positioning of links and images, affect the number of clicks your message receives. For step-by-step instructions on how to set up and run an A/B split test, visit Bronto Help > Send An A/B Split Test.