As I often say, there is no golden rule that works in every circumstance: we should always check our hypotheses against reality and allocate a fraction of our budget to testing them.
Email marketing platforms offer convenient tools to test different versions of our campaigns: in Mailchimp, both the Essentials and Standard plans allow us to run A/B tests on a single variable, such as the subject line, from name, content, or send time.
With Premium plans, we can also test multiple variables at a time.
To get statistically significant results, you have to test a large enough sample, and the kind of test you should run depends on your goals.
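To make "large enough sample" concrete, here is a minimal Python sketch of a standard two-proportion z-test for comparing the open rates of two subject-line variants. The recipient counts, open rates, and the 5% significance threshold are my own illustrative assumptions, not figures from any platform:

```python
from math import sqrt, erf

def open_rate_p_value(opens_a, sent_a, opens_b, sent_b):
    """Two-sided p-value for the difference between two open rates,
    using a pooled two-proportion z-test."""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    # Convert |z| to a two-sided p-value via the normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# The same 22% vs 18% open rates, at two different list sizes:
print(open_rate_p_value(220, 1000, 180, 1000))  # large sample: significant at 5%
print(open_rate_p_value(22, 100, 18, 100))      # small sample: inconclusive
```

With 1,000 recipients per variant, the four-point gap clears the 5% threshold; with 100 recipients per variant, the identical gap does not — which is why very small lists rarely produce a trustworthy winner.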
I use A/B testing in different ways depending on audience size.
I asked my fellow Mailchimp partners for their tips and good practices on split testing; here is their advice.
Split testing requires volume, and many people we work with don’t have the volume (yet) to make it worthwhile. For those that do, we try and make sure we understand where the problem is that they’re solving: testing is fine if you know why you’re testing.
So if no one is opening, subject lines and the sender name are a good test.
If they're opening but not engaging, then content and design.
Remember: testing something that is already working against something you know isn't is probably not a good use of time and resources!
Robin Adams
Owner and Founder, ChimpAnswers
With lists that justify it, I try to A/B split test emails regularly. Very small lists are difficult to test with accuracy, so if the list is under 5,000, I use it sparingly. But for those with an audience that offers a good sampling opportunity, I like to test three critical areas: Subject Lines, Visuals and overall length of the campaign creative.
Subject lines are critical to test because small changes can result in big differences in open rate: testing personalization in the subject line, different buzzwords related to the content or industry, and even a strong call to action versus a mysterious lure can give great insight into what your audience really enjoys.
Image differences can lead to a big swing in click rates, as can the length of the campaign creative, so both offer an excellent opportunity to make a big impact with just a subtle change.
MaryAnn Pfeiffer
Digital Marketing Strategist, 108 Degrees Digital Marketing
Subject lines are a good place to start. Small tests are the key to split testing, as you want to find the unique elements that are making the impact and truly see what’s going on in the inbox.
Run the same test three or four times so you have a clear result, then move on to another small test.
Get into the habit of running small incremental tests with every email you send out.
Doug Dennison
CEO & Co-founder, MailNinja
We use A/B and multivariate testing for our client campaigns.
We mostly test subject lines, as we are constantly looking to improve engagement, and we also test offers, layouts, and CTAs. For subject line testing we use opens as the metric to determine a winner; for offers, layouts, and CTAs we use conversions.
Adam Q. Holden-Bache
Director of Email Marketing, Enventys Partners
I split test subject lines when I'm not sure about my audience's preferences. I write a clickbait subject line and a reusable subject line, for example: "How to keep mice out of your RV" (clickbait) vs. "News of the week from RVLifestyle" (reusable). Some audiences prefer the more generic subject line because it's recognizable each week.
Amy Hall
Email Marketing Strategist, Certified Mailchimp Partner and Consultant, Amy Hall
A/B testing or split testing is a no-brainer. I believe that every email you send should be set up as an A/B test. Why not try to understand your subscribers better with each send?
Testing send times, from names, subject lines, and content can also be pretty fascinating. What you think is working may prove to be wrong.
I especially love testing content. For example, take two very similar email designs, one with a photo and one with an illustration: you can test which performs better in terms of clicks. You might be surprised by the results.
We A/B test for nearly every client, especially newer clients, so that we can determine a great send time, the "from name" that works best (for example, is "Emily Ryan" or "Emily from Westfield Creative" better?), and what types of subject lines their subscribers prefer.
We're big, big fans of split testing.
Emily Ryan
Co-Founder and Email Strategist, Westfield Creative
A/B Testing. Always. Be. Testing.
We test the subject line of every email we send. Subject lines are the first thing your readers or customers see, and they are the one thing that gets your email opened. I'll repeat that: the one thing that gets your email opened. So, you can A/B test two different subject lines to see which email will get more opens.
Or you could test different content, offers, deals, or a call-to-action button to see which one will get more click-throughs.
Platforms like Mailchimp make this super easy, and it should be done with every email.
Top tip: write three to five subject lines for each email; pick the ones you think your customers will like best; set up the A/B test to check your theory; then send the winning subject line plus a new one to the people who didn't open your email the first time.
Always. Be. Testing. That is the only way to maximize the response from each email, and so maximize the lifetime value of your customers.
Glenn Edley
Director & Email Strategist, Spike
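That top tip amounts to a small decision procedure: pick the subject line with the best open rate, then target the people who didn't open. A minimal Python sketch of the idea — all the campaign data here is hypothetical, and in practice the platform handles this for you:

```python
def winning_subject(results):
    """Pick the subject line with the highest open rate.

    results: {subject_line: (opens, sent)} -- hypothetical campaign stats.
    """
    return max(results, key=lambda s: results[s][0] / results[s][1])

def follow_up_audience(recipients, openers):
    """Non-openers: they get the winning subject line (plus a new one) next send."""
    opened = set(openers)
    return [r for r in recipients if r not in opened]

stats = {
    "How to keep mice out of your RV": (180, 500),    # 36% open rate
    "News of the week from RVLifestyle": (140, 500),  # 28% open rate
}
print(winning_subject(stats))  # -> "How to keep mice out of your RV"
```

The same comparison logic applies when the metric is clicks or conversions instead of opens; only the numbers fed into `results` change.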
The key to improvement is to never stop asking questions: yesterday's answers may not be the right ones today (or tomorrow), and the only way to know is to keep testing. So, let's go test together!