Regardless of the details of your business, your email marketing program is undoubtedly a vital component of your go-to-market strategy. In today’s world of over-sending, increasingly intelligent spam traps, and fierce competition, it’s harder than ever to break through the clutter. You’re not the only one trying to figure out how to gain subscribers, how to get them to engage, and how to make sure email moves your bottom line. As with most strategies, testing is key to improvement and optimization.
Our clients have testing wish lists that are upwards of 100 rows long – but manpower, cadence and cost always pose limitations. That’s why I’ve hand-picked 3 email tests that we’ve found successful in increasing performance – to help you address your biggest challenge(s) with the least effort and in the shortest amount of time. Of course, there’s more where these came from – you just have to ask.
3 Email Tests You Have to Try
1) Subject Line: Direct vs. Ambiguous
Email is a perfect venue for creativity. You hope that a catchy, out-of-the-box subject line will help your email stand out among the rest and create interest. At the same time, you may want to spark interest by speaking directly to the great content readers will find inside, without beating around the bush. Every audience is different, and you want to find the right mix for yours.
While many subject line tests are easy to complete, this one requires some extra forethought and should be conducted multiple times. Develop one creative with consistent messaging, images, offers and calls-to-action, limiting the variable to your subject line alone. This is easily done in your ESP: use its interface to split your test segments into equal groups with a randomized mix of attributes.
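Your ESP handles this split for you, but the underlying idea is simple enough to sketch. Here’s a minimal, hypothetical Python sketch (the function name and sizes are illustrative, not any ESP’s actual API) of shuffling a list so each group gets a randomized mix of attributes:

```python
import random

def split_test_groups(subscribers, n_groups, seed=42):
    """Shuffle the list, then deal subscribers into n_groups equal cells.

    Shuffling before splitting randomizes the mix of attributes
    (tenure, engagement, etc.) across the cells.
    """
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)
    # Round-robin assignment keeps group sizes within one of each other.
    return [pool[i::n_groups] for i in range(n_groups)]

groups = split_test_groups(range(1000), 2)
print(len(groups[0]), len(groups[1]))  # 500 500
```

A fixed seed makes the split reproducible, which helps when you re-run the same test across multiple sends.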
The key to a subject line test is to ensure that you’re isolating the variable within your subject line – tricky considering your character limitations. For instance, you might test a direct subject line like “20% Off Your Spring Wardrobe This Weekend” against a teaser like “A Weekend Treat for Your Spring Wardrobe.”
Notice that the words used are as similar as possible, while still creating a clear differentiation between a direct call-out and a teaser. You may find, like we have, that the performance of these subject lines varies with your audience, as well as the subject at hand, and you can start to identify patterns.
To evaluate success, look at both open rate and click-through rate. An ambiguous subject line may get someone to open but create a disconnect with the actual topic, leading to lower click-throughs – something to take into careful consideration.
2) Send Schedule: Day of Week (and Time of Day)
This test is tricky but has the potential to yield a huge reward. Deciphering the open behavior of your customers and where your brand fits into their week and day is a powerful step in optimizing your program. The key here is to control every variable except send time, which is why testing time of day and day of week at the same time is a great approach – as long as you have the audience to sustain numerous test groups while maintaining statistical significance. Follow these steps to find the optimal send time for your audience:
- Determine Your Maximum Number of Test Groups. Based on capacity and list size, how many test groups can you support? This will determine how granular you can be in your distribution of dates and times tested. For example, with a small list size, you may only be able to test 2 different times each day (leading to a total of 14 groups across all 7 days).
- Determine Dates & Times to Test. The purest test would evaluate a send at every hour of every day, but this is rarely practical given operational limitations. Instead, many of our clients have chosen not to test days they know to be low performers, or to pick intervals, such as every 4, 6 or 8 hours, to keep the number of groups down.
- Create Your Testing Schedule & Deploy. Establish a testing grid with all of the dates and times you’ll be testing and the test groups in each. Ensure the test groups are equal and varied in attributes and previous engagement rates. It is crucial that any throttling or other external factors are minimized. All creative and subject lines should be identical across the groups.
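If you’re scripting the setup rather than clicking through an ESP interface, the steps above can be sketched in a few lines of Python (the function name, day/hour choices and group sizes here are illustrative assumptions):

```python
import itertools
import random

def build_schedule(subscribers, days, hours, seed=11):
    """Assign an equal, randomized test group to every (day, hour) slot."""
    slots = list(itertools.product(days, hours))
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)  # randomize attribute mix across groups
    size = len(pool) // len(slots)
    return {slot: pool[i * size:(i + 1) * size] for i, slot in enumerate(slots)}

days = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
schedule = build_schedule(range(1400), days, [8, 20])  # 14 groups of 100
print(len(schedule), len(schedule[("Mon", 8)]))  # 14 100
```

This mirrors the small-list example above: two send times per day across seven days yields 14 equal test groups.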
After your test is executed, creating heat maps of open rates and click rates across the groups will lead you to your optimal send. Often, we find that a few blocks across consecutive days at the same time (e.g., 8pm on Saturday and Sunday) perform best, enabling flexibility to vary sends in the future based on business objectives.
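A sketch of that aggregation, assuming per-group results exported as (day, hour, sends, opens) tuples – the data shape and numbers are illustrative; your ESP export will differ:

```python
from collections import defaultdict

def open_rate_grid(results):
    """Aggregate per-group results into a {day: {hour: open_rate}} grid."""
    grid = defaultdict(dict)
    for day, hour, sends, opens in results:
        grid[day][hour] = opens / sends
    return grid

results = [
    ("Sat", 8,  5000, 1250),   # 25.0% open rate
    ("Sat", 20, 5000, 1400),   # 28.0%
    ("Sun", 8,  5000, 1150),   # 23.0%
    ("Sun", 20, 5000, 1375),   # 27.5%
]
grid = open_rate_grid(results)
# Find the best-performing (day, hour) cell in the grid.
best = max(((d, h) for d in grid for h in grid[d]),
           key=lambda dh: grid[dh[0]][dh[1]])
print(best)  # ('Sat', 20)
```

Feeding the same grid into a plotting tool gives you the heat map; scanning it by eye often reveals the consecutive-day blocks described above.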
It’s important to look at multiple metrics to ensure the optimal send across the board is selected – open rates, click rates, website engagement and conversions. It should be noted that a truly optimized email schedule would be able to target each individual user on the day and at the time that maximizes their opening potential. But that is still an operational challenge for many.
3) Creative: Image Type
This is a fairly simple test that has shown quite an uplift for our clients in terms of click-to-open rate (our preferred metric when looking solely at creative performance). If you’re just beginning to test your emails, we recommend starting with your overall email layouts, but programs across the board can benefit from this test. After all, visuals are often claimed to be processed 60,000 times faster than text, making the images you choose a key component of your success.
This test requires a bit of forethought and flexibility. Here’s what you need:
- A Diverse Topic: You need to choose an item within your content strategy for the email focus that can be shown using lifestyle images, product images, images with people, images with backgrounds, etc. Basically, whatever variable you think would make the most impact. For one of our clients, this was including people (lifestyle) vs. objects (product).
- Valid Test Images: You also need to ensure that you can find images that are similar in terms of colors, shapes, etc. You don’t want to reach the end of the test only to realize that creative A may have succeeded simply because its image hue contrasted sharply with the CTA, creating an eye-catching standout.
As with all other tests, ensure all other aspects are identical prior to deploying (e.g., subject line, header, CTA, copy). Performance on this test should be judged primarily on click-to-open rate, since the isolated variable affects the number of click-throughs achieved.
Don’t limit yourself to these tests, but they should give you a good start! You can apply the same testing approaches to address other items on your testing wish list. When deciding which test to move forward with, consider:
- Will this test really move the needle? Use your judgment to think through which tests will have a greater impact than others. No test is a waste of time, but your money and staff resources are valuable, so prioritize your testing plan accordingly. You also want to ensure that you’re setting yourself up for success in terms of test design (length of time, number of sends, test group size) to enable statistically significant results.
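When checking whether a difference in rates (say, open rates between two cells) is statistically significant, a standard two-proportion z-test is one common choice. A self-contained sketch with illustrative numbers:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test: is the difference in rates significant?

    Returns the z statistic and the two-sided p-value.
    """
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided
    return z, p_value

# Illustrative: variant A opened 2,200 of 10,000 sends; variant B 2,000 of 10,000.
z, p = two_proportion_z(2200, 10000, 2000, 10000)
print(p < 0.05)  # True
```

A p-value below your chosen threshold (commonly 0.05) suggests the difference is unlikely to be chance; with small test groups, even large-looking rate gaps can fail this check.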
- Will the results be actionable? Ensure before you start that you can operationalize your learnings. Are you hoping to establish a directional learning for your full list or an optimization on a user-level? And, if the latter, can you support that with your current email program and technology?
- Does the objective of the test go beyond the current email send? Will you be able to apply the learnings to multiple emails or are you limited in the application? If you have doubts, you may want to consider a one-time A/B test, sending to a small sub-set of the group and then sending whichever email mix performs best to the group at large. That enables you to ensure high performance of one important email as opposed to gathering learnings to apply to your full program.
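The one-time A/B mechanics can be sketched as a simple holdout split – the function name and 20% holdout fraction are illustrative assumptions:

```python
import random

def run_ab_send(subscribers, holdout_fraction=0.2, seed=7):
    """Hold out a random subset for the A/B test; the remainder gets the winner.

    Returns (cell_a, cell_b, remainder): two equal test cells plus the
    bulk of the list, which is sent whichever variant performs best.
    """
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)
    n_test = int(len(pool) * holdout_fraction)
    cell_a, cell_b = pool[:n_test // 2], pool[n_test // 2:n_test]
    remainder = pool[n_test:]
    return cell_a, cell_b, remainder

a, b, rest = run_ab_send(range(10000))
print(len(a), len(b), len(rest))  # 1000 1000 8000
```

The trade-off described above shows up directly here: only the two small cells generate learnings, while the remainder simply benefits from the winning variant.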
Let me know what other tests you’ve found helpful and post any questions you have below. Happy sending!