This is the final article of a five-part series that covers email marketing essentials, which can be applied to optimize subscriber communication and increase return on investment.
Email marketing is an essential tool for communicating with your subscribers. It’s the most affordable way to drive traffic to your digital editions, remind readers of their subscription status, onboard new subscribers, and identify segments of your audience for specialized campaigns. When you compare ROI, email consistently outperforms social media. That’s not to say social media doesn’t have its place, but that’s an article for another day. Today, we’re going to look at email best practices.
In part four of this series, I covered which key performance indicators you should know and monitor. In this article, we’ll put that knowledge to use as we cover split testing, also known as A/B testing.
Part V: Split Testing
Split testing allows you to compare and contrast different elements of your emails to see how they impact the success of your campaign. Putting your emails to the test gives real statistics, allowing you to make data-driven decisions on what gets the most opens and clicks from your readers.
For example, you can send one subject line to one segment of your subscriber list and a different subject line to another segment. Once you have determined which version performed better, you send the winning subject line to the remaining subscribers.
Understand Who You’re Testing
Most of the time you’ll be testing against your entire subscriber list. This gives you a more accurate picture of how your test performs: the larger your test sample, the more reliable your results will be. Also be sure the split is done randomly, as hand-picking segments can skew your results.
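As a minimal sketch of the random split described above (the function name, segment sizes, and 20% test fraction are illustrative assumptions, not a prescribed standard), you might shuffle the full list and carve off two equal test groups, holding the remainder back for the winning version:

```python
import random

def split_for_test(subscribers, test_fraction=0.2, seed=None):
    """Randomly split a subscriber list into two equal test segments
    (A and B) plus the remainder, which later receives the winner."""
    rng = random.Random(seed)
    shuffled = subscribers[:]           # copy so the original list is untouched
    rng.shuffle(shuffled)               # random assignment avoids hand-picking bias
    test_size = int(len(shuffled) * test_fraction)
    half = test_size // 2
    group_a = shuffled[:half]
    group_b = shuffled[half:half * 2]
    remainder = shuffled[half * 2:]     # gets the better-performing version later
    return group_a, group_b, remainder

# Example: 1,000 subscribers → 100 in A, 100 in B, 800 held back
group_a, group_b, remainder = split_for_test(list(range(1000)), seed=42)
```

Because the shuffle is random, neither test group is biased toward any hand-picked segment of your audience.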
Before sending different versions of your email, you should first decide what elements you’ll be testing and what you consider success. If you’ve been using the same email campaign style for a long period, then you have a large pool of data to pull from.
For example, if you are testing your subject line then you’ll probably want to try shorter and longer versions or change the tone of the message. We cover how to optimize many of the elements you may want to test here.
Test Two Variants
The golden rule of split testing is to only test two variants at a time. For example, test two subject lines against each other to see which one performs better, then you can use the top performer to send to your full subscriber list.
Split testing is an ongoing process that requires commitment. Ending a test too soon can skew results, as there isn’t enough data to be statistically significant. On the other hand, running a test for too long can also skew the results, because it may introduce uncontrolled variables. Diligent monitoring will ensure you are aware of any anomalies that may occur.
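One common way to judge whether a test has run long enough is a two-proportion z-test on the open rates of the two variants. The sketch below is one illustrative approach, not the only valid method; the function name and the 1.96 threshold (roughly a 95% confidence level) are assumptions for the example:

```python
import math

def is_significant(opens_a, sends_a, opens_b, sends_b, z_threshold=1.96):
    """Two-proportion z-test: True when the open-rate difference
    between variants A and B is significant at roughly the 95% level."""
    p_a = opens_a / sends_a
    p_b = opens_b / sends_b
    # Pooled open rate under the null hypothesis (no real difference)
    p = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(p * (1 - p) * (1 / sends_a + 1 / sends_b))
    if se == 0:
        return False
    z = abs(p_a - p_b) / se
    return z >= z_threshold
```

With 1,000 sends per variant, a 30% vs. 20% open rate is clearly significant, while 21% vs. 20% is not — a signal that the second test needs more data before you declare a winner.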
What to Test
You can test almost any element in your email campaign, but that doesn’t mean you should. Focus instead on the elements that are most likely to have the greatest impact.
It’s recommended that you keep the “from” email address consistent, but changing the name that appears in the inbox can be tested to see if it impacts your metrics. For instance, is it better to use your name as the “from” name, or your company’s name?
The most commonly tested element of an email is the subject line. Some interesting things you can test in your subject lines include:
- Length – Test short subject lines vs. longer subject lines.
- Topic – Test two completely different topics as the subject line, to see what content is of most interest to subscribers.
- Personalization – Add personalization to identical subject lines to see if a first name greeting, for example, gets a better response.
- Promotions/Offers – See what kind of promotion works best by offering different promotions to different segments of your audience.
- Recognition – Including your company name in your subject line may increase engagement.
Preheader testing is a good opportunity to improve your metrics. Some interesting things you can test in your preheader include:
- Show/Hide Preheader – Test including a preheader and not including one to see if the version with the preheader has a higher open rate.
- Content – Test two different topics in your preheader and see which your subscribers respond best to.
- Personalization – If you have subscriber-specific data, try to incorporate it into the preheader. For example, “Only 3 months left on your subscription, Joe!” or “You only have 2 issues remaining, Jennifer!” may spur a renewal.
Create multiple versions of your email template that incorporate different design elements. Typography, color palette, button colors, and even layouts can make a difference. I cover design optimization here.
Calls to Action
Your call to action is the single most important piece of content in your email. Take time to consider its placement, wording, and design when composing your messages, and test each of those elements.
There are many things you can test within the content of your email. It can be overwhelming, so start with one or two elements and iterate through your list. For example: are recipients more likely to click a linked image or linked text? Do recipients prefer a template that contains a GIF or one with static images? I cover content optimization in detail here.
Timing and Cadence
Testing the day or time you send your email is an amazing opportunity to figure out what works best for your subscribers. Does the time of day a campaign is sent affect the click rate? What day of the week gets better open rates? Studies have found that increasing or decreasing the amount of email you send can have a significant impact on click-through rates and unsubscribe rates.
Coordinated Channel Testing
You can also test multiple content channels in conjunction with each other. For example, you might want to test newsletter A with landing page A, and newsletter B with landing page B. And then, later, you may want to test newsletter A with landing page B, and vice versa. This can give you a more concrete result if you’re getting mixed results, or if your results are very close.
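The pairing of newsletters and landing pages described above is essentially a small factorial test. As a sketch (the variant names are hypothetical placeholders), you can enumerate every combination so each pairing gets its own randomly assigned segment:

```python
from itertools import product

newsletters = ["newsletter_A", "newsletter_B"]        # hypothetical variants
landing_pages = ["landing_page_A", "landing_page_B"]  # hypothetical variants

# Every newsletter/landing-page pairing; each combination would be
# sent to its own randomly assigned subscriber segment.
combinations = list(product(newsletters, landing_pages))
# 4 pairings: (A, A), (A, B), (B, A), (B, B)
```

Comparing all four pairings helps untangle whether the newsletter, the landing page, or the combination of the two is driving your results.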
Optimizing your split testing strategy provides a powerful way for you to measure your email performance. Focusing on the strategies and tips discussed in this guide will help you as you continuously iterate and improve your email program.