Achieve higher email open rates with these surefire A/B testing tips

A/B testing your emails is one way to find out what your audience wants instead of shooting in the dark. Testing will help if your emails are failing to deliver on your objectives: higher open, click-through, or reply rates. In simple terms, A/B testing is the process of sending two slightly different versions – Email A and Email B – to two groups of recipients. The version that secures more opens and clicks is the winner and is then shared with the wider audience.

When testing, send the two versions to only about 30% of your database: Email A to half of the recipients in this test group and Email B to the other half. Keep the test group as small as possible while ensuring the sample is still statistically significant. You can also use online sample-size calculators, such as those from Cardinal Path and Optimizely.
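The split described above is easy to sketch in code. Here is a minimal Python illustration (the 30% test fraction and the 50/50 A/B split follow the article's example; the list of addresses is made up):

```python
import random

def split_test_group(recipients, test_fraction=0.3, seed=42):
    """Split a recipient list into an A group, a B group, and the remainder."""
    pool = list(recipients)
    random.Random(seed).shuffle(pool)           # randomize to avoid ordering bias
    test_size = int(len(pool) * test_fraction)  # ~30% of the database
    half = test_size // 2
    group_a = pool[:half]                # receives Email A
    group_b = pool[half:test_size]       # receives Email B
    remainder = pool[test_size:]         # receives the winning version later
    return group_a, group_b, remainder

# Example: 1,000 subscribers -> 150 get Email A, 150 get Email B, 700 held back
subscribers = [f"user{i}@example.com" for i in range(1000)]
a, b, rest = split_test_group(subscribers)
```

In practice your email platform will handle this split for you, but randomizing before splitting is the important detail: slicing an alphabetized or signup-ordered list straight down the middle can bias the two groups.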

Most email marketing platforms offer A/B testing as a built-in feature, so tests can be run with little effort. It’s worth putting in the time, as email remains a powerful marketing channel: according to Campaign Monitor, email marketing is the king of the marketing kingdom with a 3800% ROI – $38 for every $1 spent.

So, what are the elements in the email that you can alter for testing purposes? You can test variations of the following:

  • Copy of the email
  • Subject of the email
  • Call-to-action (CTA) button text, color and position
  • Images and placement

Always limit the change to one variable – email copy, subject, CTA, etc. – so that you can easily tell what worked. If you change everything in one go, it will not be clear which variable contributed to the performance boost. The downside is that you may have to test each variable separately if the targeted results are not forthcoming. The more you test, the better.

How do you interpret and use the results?

Suppose Email B, with a green CTA button, gets 20% more clicks than Email A with a blue button; Email B is clearly more effective and should be sent to the entire database. Also keep an eye on Google Analytics to understand what visitors from your email campaigns are doing on your website – leaving immediately, visiting other pages, or making a purchase. If, for instance, you are using Campaign Monitor to manage your email campaigns, click here for instructions on how to link them with Google Analytics. Web analytics will help you understand whether your messages trigger the desired action and resonate with your target audience. If the results are poor, go back to the drawing board.
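Before declaring a winner, it is worth checking that the difference in click rates is not just noise. One standard approach (the article does not prescribe a specific test) is a two-proportion z-test; the click counts below are hypothetical:

```python
import math

def two_proportion_z_test(clicks_a, sent_a, clicks_b, sent_b):
    """Z-test comparing two click rates; returns the z-score and two-sided p-value."""
    p_a = clicks_a / sent_a
    p_b = clicks_b / sent_b
    p_pool = (clicks_a + clicks_b) / (sent_a + sent_b)  # pooled click rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Email A (blue button): 40 clicks from 500 sends; Email B (green): 60 from 500
z, p = two_proportion_z_test(40, 500, 60, 500)
winner = "B" if p < 0.05 and z > 0 else "no clear winner yet"
```

If the p-value is below your threshold (0.05 is a common choice), the better-performing version can be sent to the rest of the database with more confidence; if not, a larger test group may be needed.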

Here is a list of different A/B tests that you can run:

Subject line

  • See what makes for a more effective subject line – a question or a statement
  • The character count of the subject can also make a difference. Test both long and short subject lines. Try to keep it below 70 characters.
  • Will capitalizing all the words make a difference? You may get better results by using sentence case in the subject line.
  • Test to see what happens when you mention the recipient’s name in the subject line.
  • Frame the subject line with words that evoke emotions such as curiosity, urgency, happiness, or a sense of calm, and develop another version using words with low emotional value.
  • Rearrange the order of the words in your subject line to see if it improves the open rate.

Body copy

  • Try variations with longer and shorter body copy
  • Try to personalize the body copy by including information that may be of specific interest to the recipient. Send another version without personalizing to see how it compares.
  • Use visuals as they tend to get processed faster than text
  • Keeping the tone positive and upbeat can help to increase your email conversion rate

Call to Action

  • Try placing the CTA towards the top of the email, the middle, or the bottom
  • Make the CTA bold, italicized, or underlined and see if there is a difference
  • Use a colored button
  • Test one version with clear, precise, action-oriented CTA copy and another with generic copy such as ‘click here’ or ‘read more’

Different types of content

Try variations of content to see what works best – videos, podcasts, blogs, ebooks, case studies, PDFs, etc. Use your email marketing platform to gauge attachment opens.

Hyperlinks

  • Change the character count, formatting (bold, italics, color) and position (part of a paragraph or on its own line) of hyperlinks in the body copy to see the impact
  • Try one version with a single hyperlink and another with multiple hyperlinks

Document learning for the future

As you run A/B tests over time, you will begin to see patterns that contribute to higher email conversion rates. These insights can inform your future campaigns, so make it a point to document the findings from each A/B test, including what worked and what didn’t.