Hi there, marketing mavens. Today’s edition of Email Marketing 101 is all about A/B testing. Here we go …
You’ve spent days – if not weeks – crafting your message, finding the perfect images to complement it, and developing a pithy subject line. And, finally, off it goes – on its way to tens of thousands of inboxes.
Well, not so fast. Chances are very good that most of those inboxes your message is heading to are as crowded as yours. So as soon as your gem lands, it’s desperately competing with perhaps hundreds of other messages for attention.
Research says that if your message hasn’t grabbed your recipient’s attention within three or four seconds, it won’t be opened. And so all that tremendous potential vanishes, either with one click of the “delete” button or simply by being ignored.
When the stakes are this high, you want to do everything you can to make sure your message gets the attention it deserves. And one of the keys to doing that is A/B testing (also called split testing), which can help you identify your most effective emails and get them in front of the largest portion of your audience.
A/B testing is a simple way to send and compare two versions of an email against each other to decide which one performs better.
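As a quick sketch of the mechanics (the function name and the 50/50 split here are illustrative, not any particular platform's API), randomly assigning recipients to the two versions might look like this:

```python
import random

def split_audience(recipients, seed=42):
    """Randomly assign recipients to version A or B (50/50 split).

    Random assignment keeps the two groups comparable, so a
    difference in opens or clicks can be attributed to the email
    itself rather than to who happened to receive each version.
    """
    shuffled = list(recipients)
    random.Random(seed).shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

group_a, group_b = split_audience(
    ["a@example.com", "b@example.com", "c@example.com", "d@example.com"]
)
```

The fixed seed just makes the example reproducible; in practice each campaign would shuffle fresh.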
So what’s the best way to go about it? Some of our email experts here have crafted a few best practices for effective A/B testing:
- Choose an appropriate number of variations based on list size and open rate and develop a hypothesis for each variation.
- Test for impactful outcomes by varying the elements that directly drive performance.
- Don’t muddy the waters: the more variables you test simultaneously, the less confident you can be about what caused the lift.
- Be flexible and allow time for meaningful results to develop. Significant results could take weeks … or 20 minutes.
- Allow for attribution back to the email if your sales cycle is long, or if the use case is not e-commerce – for example, if your message includes coupons for a brick-and-mortar location.
- Don’t forget about the halo effect of a module or hero – just because it’s not getting the clicks doesn’t mean it’s not influencing the email’s impact.
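On the point about waiting for meaningful results: "statistically significant" has a concrete test behind it. A common choice for comparing click rates is a two-proportion z-test; the sketch below (the numbers are made up for illustration) shows the standard calculation.

```python
import math

def z_test_two_proportions(clicks_a, sends_a, clicks_b, sends_b):
    """Two-proportion z-test: is version B's click rate different
    from version A's?  Returns the z statistic and a two-sided
    p-value computed from the standard normal CDF."""
    p_a = clicks_a / sends_a
    p_b = clicks_b / sends_b
    # Pooled rate under the null hypothesis that A and B perform alike
    p_pool = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical campaign: 120 clicks on 5,000 sends vs. 165 on 5,000
z, p = z_test_two_proportions(120, 5000, 165, 5000)
```

A p-value below 0.05 is the conventional threshold for calling the lift real rather than chance; with small lists or low open rates, reaching it can take a while, which is why the "be flexible" advice above matters.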
Once you have your variations set and your test parameters in place, you are ready to go. And, fortunately, as an email marketer, you don’t have to retreat to a laboratory for testing. Your laboratory is the real world, which includes, of course, getting real results from real customers.
If you are concerned that the most successful versions of your messages are going to miss customers, you can put that worry to rest. Movable Ink’s creative optimization takes A/B testing to an entirely different level.
With it, you create two variations, Option A and Option B, and divide your list in half. Send Option A to one half of the list and Option B to the other half. If Option B starts outperforming Option A at a statistically significant level, all Option A emails automatically switch to Option B. Even if customers have opened the email already, they’ll see Option B if they open it again.
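Movable Ink handles this switch automatically; purely to illustrate the decision rule (the function names and thresholds below are ours, not the product's API), the logic amounts to something like:

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def b_beats_a(clicks_a, sends_a, clicks_b, sends_b, alpha=0.05):
    """One-sided two-proportion z-test: does version B's click rate
    beat version A's at significance level alpha?"""
    p_a, p_b = clicks_a / sends_a, clicks_b / sends_b
    p_pool = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_b - p_a) / se
    return (1 - normal_cdf(z)) < alpha

def version_to_render(clicks_a, sends_a, clicks_b, sends_b, assigned):
    """Serve the recipient's assigned version until B is a proven
    winner, then switch everyone (including re-opens) to B."""
    if b_beats_a(clicks_a, sends_a, clicks_b, sends_b):
        return "B"
    return assigned
```

Because the check runs at render time rather than send time, even a recipient originally assigned Option A sees the winning version when they reopen the email.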
Below are a few examples of creative optimization results.
Test: Mobile Optimization
Bass Pro tested a mobile-optimized layout versus the traditional desktop treatment for mobile emails.
Mobile-optimized layout: 28.5% increase in mobile CTR
Percentage copy: 82% lift in CTR
Test: Product Shots vs. Lifestyle Imagery
Finish Line tested the effectiveness of product shots vs. lifestyle imagery.
Lifestyle imagery: 20.37% lift in CTR