11 January 2011 by Web Bureau
When timelines get short or budgets are tight, testing your email campaigns is often the first thing to go out the window: trying out a proven subject line to maximise your open rate, or settling on a solid call to action.
Test #1: Picking an Effective Subject Line
Subject lines are by far the most popular element of an email design to test, for two reasons: a) designers are an indecisive lot, and b) it's so very easy to do. But perhaps more importantly, the subject line plays a big part in determining whether the email gets opened at all. And with more opens, you consequently get more clicks, too. By creating an A/B split campaign, you can test two subject lines by sending a different version of the email to each half of a small portion of your subscribers. Once the test is complete, the better-performing version is automatically sent to the remaining subscribers - optimising the open rate of your campaign!
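The mechanics of an A/B split can be sketched in a few lines of code. This is an illustrative sketch only - the function names, the 20% test fraction and the open-rate comparison are our assumptions, not any particular email platform's API:

```python
import random

def ab_split(subscribers, test_fraction=0.2):
    """Shuffle the list, carve off a small test group split evenly
    between variants A and B, and hold back the rest for the winner."""
    pool = subscribers[:]
    random.shuffle(pool)
    test_size = int(len(pool) * test_fraction)
    group_a = pool[:test_size // 2]
    group_b = pool[test_size // 2:test_size]
    remainder = pool[test_size:]
    return group_a, group_b, remainder

def pick_winner(opens_a, sent_a, opens_b, sent_b):
    """After the test send, compare open rates and choose the version
    to send to the remaining subscribers."""
    rate_a = opens_a / sent_a
    rate_b = opens_b / sent_b
    return "A" if rate_a >= rate_b else "B"
```

With 1,000 subscribers and the default 20% test fraction, 100 people see each subject line and the winning version goes out to the other 800 automatically.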
Test #2: An Effective Call to Action
Although it takes a little more legwork than running an A/B split on a subject line, testing email content such as a call to action or button design is a great way to generate more clicks - and maybe even learn a bit about which elements work best in your email newsletter.
Test #3: Which Image Wins?
Another recommendation is to test product images: these can produce a big lift and are easy to swap out. Like the call-to-action test above, this requires two versions of your email content, so it takes a little time to set up!
We've Only Just Begun...
So, we've featured some fairly simple examples here, but A/B testing doesn't have to be about small changes: you can also use A/B split tests to determine which email layouts work best, how the tone of your copy affects response, and more. By testing two very different email designs, you can, for instance, determine whether a redesigned email newsletter is on the right track as a whole. Of course, that's quite a different question from which single call to action works best!
Also, it's a good idea not to look at your test results in a silo: set controls, look for trends and take the bigger picture into account. For example, variables such as the time of day or time of year can affect campaign results - especially during the holiday period - so learnings from June's campaigns may not be as applicable when sending in December.
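One simple way to tell a real trend from random noise is a two-proportion z-test on the open rates of your two variants. This is a hedged sketch, not part of any email tool - the function name and the usual ~1.96 significance threshold are our assumptions:

```python
from math import sqrt

def open_rate_z_score(opens_a, sent_a, opens_b, sent_b):
    """Return how many standard errors apart the two open rates are.
    An absolute value above roughly 1.96 suggests the difference is
    unlikely to be down to chance at the ~95% level."""
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    # Pooled open rate under the assumption both variants perform alike
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    return (p_a - p_b) / se
```

For example, 250 opens out of 1,000 versus 200 out of 1,000 gives a z-score of about 2.7 - a difference worth acting on - whereas 105 versus 100 out of 1,000 would not be.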
Finally, as we've seen above, testing doesn't have to be a time-consuming process, and ultimately it should be seen as an ongoing, measurable feature of your campaigns. Even if you're simply making small changes to your subject lines, at least you're making improvements... and learning a little something more each time you send.