Email A/B Testing Is a Marketing and Sales Superpower: Here’s How to Use It

Email is the bread and butter of marketing. It’s how we engage with our existing customers and leads, and often how we get cold leads to become aware of our offerings and turn them into opportunities. But how do we know our emails are effective? We A/B test.

In marketing, A/B testing is a science experiment. It takes two variants of some marketing material (whether that be landing pages, ads, email, or something else) and measures which one performs better.

To get these results, users are shown the different variations (A or B) at random, then conversion rates of the desired outcome are measured against each other. This could be open rate, click rate, form fills, SEO success, or any success metric that you’re aiming for.
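To make the mechanics concrete, here's a minimal Python sketch (all addresses and outcomes below are hypothetical) of how a tool might randomly assign recipients to variant A or B and then compare conversion rates:

```python
import random

# Hypothetical recipient list
recipients = ["ana@example.com", "ben@example.com", "cara@example.com", "dev@example.com"]

# Randomly assign each recipient to variant A or B
assignments = {email: random.choice(["A", "B"]) for email in recipients}

# After the campaign runs, record who converted (hypothetical outcomes)
converted = {
    "ana@example.com": True,
    "ben@example.com": False,
    "cara@example.com": True,
    "dev@example.com": False,
}

# Compare conversion rates between the two variants
for variant in ("A", "B"):
    group = [email for email, v in assignments.items() if v == variant]
    rate = sum(converted[email] for email in group) / len(group) if group else 0.0
    print(f"Variant {variant}: {rate:.0%} conversion across {len(group)} recipients")
```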

A/B testing is incredibly important for sales and marketing teams. Without it, they could be running mediocre campaigns for years on end and have no idea. Money could be getting left on the table! That’s why it’s important to Always Be Testing (ABT)...Well, maybe you don’t need another acronym to deal with, but you get the idea.

Why Is A/B Testing Important for Emails? A Marketer’s Best Friend

Marketers (at least good ones) should love to run experiments and record meaningful results. It’s really the purpose of marketing…to find what works and do more of it. For email campaigns, A/B testing is how we achieve that goal.

For your email marketing strategy, A/B testing is a measurable, scientific method to improve the following in your campaigns:

  • Open rate: The percentage of emails opened out of emails sent
  • Click-through rate: The percentage of emails sent in which the recipient clicked a link
  • Conversion rate: The percentage of emails that result in a successful conversion (e.g., getting a response or scheduling a meeting)

While there are many variables to test (more on that below), these are the three main success metrics you will use in your email A/B testing. Through your carefully crafted tests, you will be able to find what works and what doesn’t for these three important factors.
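As a quick worked example (the counts are invented purely for illustration), here's how those three rates break down for a single batch of sends:

```python
sent = 1000        # emails delivered for one variant (hypothetical)
opened = 320       # recipients who opened the email
clicked = 90       # recipients who clicked a link in the email
converted = 25     # recipients who replied or booked a meeting

print(f"Open rate:          {opened / sent:.1%}")     # 32.0%
print(f"Click-through rate: {clicked / sent:.1%}")    # 9.0%
print(f"Conversion rate:    {converted / sent:.1%}")  # 2.5%
```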

The end result will be higher open rates, higher click-through rates, and better conversions. With proper A/B testing, the ROI on your email campaigns will go through the roof (and your sales team will like you a whole lot more).

A/B Email Testing Is an Outbound Salesperson’s Friend, Too

Outside of marketing emails, outbound sales teams can also experiment with A/B testing in cold emails using the same three metrics. The variables may be a bit different than marketing emails, but the benefits are similar.

For example, you may want to experiment with different cold email templates such as asking for an introduction, referring to recent events, or addressing competitor pain points that your company solves. There is so much to play with here, but testing these templates against each other can lead to huge wins in what can be a tough game.

In the world of sales, successful email outreach hinges on effective email deliverability. Explore the steps to achieve this through proper technical email setup.

And if you are seeking inspiration for your B2B cold email strategies, check out our article about 12 B2B Cold Email Templates to discover how a well-structured B2B cold email template can make a difference.

How to Choose the Best A/B Testing Elements

For effective A/B email testing, you need to know what, exactly, to test. When choosing a variable, consider your overall goal. For example, if you want to increase open rates, then you might test send time or subject lines. If your goal is to increase conversions, then CTAs, images, or layouts are a good place to start.

For most companies, the following variables are the easiest and most impactful to test and measure results for.

Subject Lines

Try a variety of subject line strategies, such as making your subject a question or a statement, using emojis in the email subject line, creating a sense of urgency, using statistics, being mysterious vs. being direct, etc. You may be surprised by what works and what doesn't. If you're struggling to come up with variations, try our email subject line generator; it's powered by ChatGPT and trained on our most effective email subject lines.

Pro tip: Use sendcheckit.com to test your subject lines before sending them. You'll get a grade on how well you can expect each one to perform, and you'll be warned if you're using any words that are likely to land you in the spam folder.

Personalized Names

You can play around with using a sender name in the email signature or making the email "from your company" rather than an individual. You can also experiment with using the person’s name in the email copy. I usually go with "Hi (name)."

Depending on the context and audience, personalized names will either help or hurt. If it's obvious the email isn't coming from an individual, using the subscriber's name may feel out of place; if it reads like a one-to-one message, try getting more personal.

Images, Design, and Layout

An interesting test is to see if a plain text email performs better than an intricately designed visual email. The simple text email seems like it's from a real person, but the well-designed one is more visually stimulating and may get more engagement.

Within email design, you can also play with different images, designs, and layouts, creating an "A, B, and C" version and seeing which one performs best. Again, some emails come off better as a well-designed masterpiece, others come off better as a simple "me-to-you" text that you would send a coworker or friend. See what works for you.

Call-to-action (CTA) Variations

More often than not, you're going to measure success by how often your CTA button gets clicked. But how do you make sure the CTA invites clicks rather than scaring people away? Here, you can play with elements like arrows, buttons vs. hyperlinked text, different fonts and colors for the buttons (ALL CAPS PERHAPS?), and more.

Preview Text Variations

Often overlooked by marketers, preview text (also called "preheader") appears next to the subject line in the inbox. It's the second thing most readers will see, and it can make or break your chances of getting that click. When ignored, preview text just defaults to the beginning of your email, but you can include whatever you want.

Try different variations to entice the reader to click. Think of it as an extension of your subject line. Maybe include some stats, pose a question, or hit on the hardest pain point, then see what works.

See how the preview text appears after the subject line in my "promotions" Gmail tab, and note that each of these companies took the time to include preview text.

Email Copy Variations

Of course, you’ll probably want to test different versions of your email. There are infinite ways to say the same thing with different words, so test different emotions, pain points, and benefits within your emails to see what works best for your audience. Do they respond to hard emotional copy or get turned off? Do hard stats perform the best?

Try a couple of different variations of the same email, but don’t make them too different because then you won’t know which change is making the impact. If you’re writing very different emails, consider using them as separate emails in the same campaign instead.

Send Time

Another simple A/B test is to vary what time and day of the week you send your emails. You may find that email open rates are better at 9 am versus 2 pm, or that people are most likely to unsubscribe on a Friday for some reason. It’s a simple test you can run on the exact same email to see what works best.

→ Need some fresh email copy ideas? Check out our amazing, AI-powered cold email generator and get your creative juices flowing.

One Variable at a Time!

Don’t go too crazy when testing, as you may get lost in the weeds. When you test many variables, you don’t know which ones are making the difference. Instead, focus on one variable at a time. Try the same email with the designs and images vs. text-based, change just the subject line but nothing else, try one with preview text and one without, try the CTA as a button vs. a link, etc.

Don’t do all these things simultaneously because you won’t know what variable makes the difference.

Build on Data and Insights from Previous A/B Tests

Because you're testing one variable at a time, you'll want to save all the data from each test and use it to create hypotheses for new tests. Essentially, you're building on the learnings from each test to create a rich tapestry of results that informs your future campaigns.

This helps you get around the "only testing one variable at a time" problem and allows you to see how all variables have worked together in the past. Use your rich library (or spreadsheet) of results to help your writers create more effective email copy and pitch better ideas for future A/B tests.

As we delve into the specifics of setting up A/B tests for emails, it's important to have the right tools at your disposal. Our AI Email Writer tool can provide you with a solid starting point, offering customizable email templates that are optimized for engagement and conversion. Leveraging AI, the tool helps you generate emails that resonate with your audience, which can then be further refined through A/B testing.

How to Set up an A/B Test for Emails

Now that you’ve figured out some variables to test (although one at a time of course), you’ll want to set up an A/B testing email marketing campaign in your customer relationship management (CRM) system. To execute this, you can follow the simple steps below:

Define Clear Goals and Objectives for Your A/B Test

You should have a solid understanding of what you’re hoping to achieve with this A/B test and a hypothesis for how it will turn out. Your goal could be one of many things, for example, increasing the open rate, increasing CTA clicks, scheduling more demos, or starting more conversations (i.e., response rate).

Write out a clear hypothesis that includes what you hope to achieve, why you think you will achieve it, and a benchmark for measuring success. If you’re measuring the open rate, a common success rate is 35 percent. For response rates, 10 percent is considered solid, and anything less than 5 percent usually means you need to do some work.
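If it helps to keep your hypotheses consistent from test to test, you can record each one in a simple structured format. This is just an illustrative sketch; the field names and benchmark are placeholders to adapt to your own campaigns:

```python
# A hypothetical, structured way to write down one test's hypothesis
hypothesis = {
    "test_name": "Subject line: plain vs. stat-based",
    "goal": "Increase open rate",
    "reasoning": "A concrete statistic should create curiosity and earn more opens",
    "metric": "open_rate",
    "benchmark": 0.35,  # 35 percent open rate as the success threshold
}
```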

Create Two or More Variants for the Selected Variable

Once you’ve chosen a variable to test (e.g., preview text, subject lines, or CTA variations), you’ll need to make a version A and B. You aren’t limited to just two versions, however. As long as you’re testing the same variable, you can test as many versions as you’d like.

Maybe you're testing whether compelling statistics in the subject line make a difference, and you have several stats to choose from. You'll want one version without stats and one or more with them. Here are some examples you could choose from (note: these stats are invented for illustration, not real data):

Version A: Learn how email A/B tests help marketers get better results
Version B: Learn how A/B email testing helps marketers increase open rates by 47 percent
Version C: Learn how A/B email testing helps marketers increase response rates by 34 percent
Split testing these three closely related but different subject lines will give you a clear understanding of whether or not using a stat makes a difference, and if it does, which of the two stats is more compelling. You can use these results for future emails to other segments or campaigns.
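If your email tool doesn't handle the split for you, here's a rough Python sketch (the contact list is hypothetical, and the subject lines are the made-up examples above) of one way to divide a list evenly across the three versions:

```python
import random

# The three subject line versions from the example above
subject_lines = {
    "A": "Learn how email A/B tests help marketers get better results",
    "B": "Learn how A/B email testing helps marketers increase open rates by 47 percent",
    "C": "Learn how A/B email testing helps marketers increase response rates by 34 percent",
}

# Hypothetical contact list
recipients = [f"contact{i}@example.com" for i in range(300)]
random.shuffle(recipients)  # randomize order before splitting

# Deal recipients out round-robin so each version gets an equal-sized group
groups = {
    version: recipients[i::len(subject_lines)]
    for i, version in enumerate(subject_lines)
}

for version, group in groups.items():
    print(f"Version {version}: {len(group)} recipients -> {subject_lines[version]}")
```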

Segment Your Email List the Smart Way

When breaking up your email list into segments, you want to make sure that the different segments within your list are as equal to each other as possible. Otherwise, your results could end up skewed towards one demographic and not accurate as a whole.

Use all the contact data you have in your CRM to distribute contacts as evenly as possible. You should have data on your contacts' company names, industries, seniority levels, job titles, locations, and more. Create lists that evenly distribute your contacts based on these factors.

For example, make sure each segment has a similar number of VPs and CEOs. If you're working in the US, try to distribute evenly across geographic areas. Don't have an "East Coast" list and a "West Coast" list, as your results would end up being culturally skewed.

You might not get this perfect, which is fine, but try to make sure your list isn't obviously biased or skewed in one direction, so that any difference you measure comes from the variant you're testing rather than from the audience itself.
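One simple way to approximate an even split is to group contacts by the attributes you care about (say, seniority and region), shuffle within each group, and then alternate contacts between the two segments. Here's a rough sketch with hypothetical contact data:

```python
import random
from collections import defaultdict

# Hypothetical contact records pulled from your CRM
contacts = [
    {"email": "a@example.com", "seniority": "VP", "region": "West"},
    {"email": "b@example.com", "seniority": "VP", "region": "East"},
    {"email": "c@example.com", "seniority": "Manager", "region": "West"},
    {"email": "d@example.com", "seniority": "Manager", "region": "East"},
    # ...the rest of your list
]

# Group contacts by the attributes you want balanced across both segments
strata = defaultdict(list)
for contact in contacts:
    strata[(contact["seniority"], contact["region"])].append(contact)

segment_a, segment_b = [], []
for group in strata.values():
    random.shuffle(group)
    # Alternate within each group so both segments get a similar mix
    segment_a.extend(group[0::2])
    segment_b.extend(group[1::2])

print(len(segment_a), "contacts in segment A,", len(segment_b), "in segment B")
```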

Track Your Results

Once your goals, variables, segments, and split tests are set up, it’s time to deploy your A/B test and measure the results. If you’re using the Close CRM email communication tools, you’ll be able to set up email automation and easily track response rates, open rates, and other key metrics for all of your email sequences.

Do the results match your hypothesis or end up surprising you in some way? It’s funny sometimes how things work out in the real world vs. in our minds, and A/B email testing is sure to blow up some of your assumptions.

Lastly, use these results to inform future email A/B tests and email copy in general. For example, if you know that using stats works better in email subject lines, keep doing more of that. Using a spreadsheet or project management tool to track your results is ideal, as you can easily update and look back on them when you need insights and are setting up new A/B tests.
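If you're keeping that log in a plain spreadsheet, even a tiny script can keep the format consistent. This sketch (the file name and columns are just examples) appends one row per variant to a CSV file you can open in any spreadsheet tool:

```python
import csv
from pathlib import Path

LOG_FILE = Path("ab_test_results.csv")  # hypothetical results log
FIELDS = ["date", "variable_tested", "variant", "sent", "opened", "clicked", "replied", "notes"]

def log_result(row: dict) -> None:
    """Append one variant's results to the running spreadsheet."""
    is_new_file = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new_file:
            writer.writeheader()
        writer.writerow(row)

log_result({
    "date": "2024-03-01", "variable_tested": "subject line", "variant": "B",
    "sent": 500, "opened": 190, "clicked": 48, "replied": 22,
    "notes": "Stat-based subject beat the plain version on opens",
})
```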

Best Practices for Email A/B Testing That Ensure Impactful Results

Email testing done poorly is not only a waste of time; it can actively mislead you. If the test isn't accurate, the results aren't real or meaningful, and following them would lead to worse outcomes, not better ones.

To ensure that you always have impactful results from your email A/B tests, follow these simple best practices:

Conduct Tests on a Significant Sample Size

If you’re only testing emails on a handful of people, you won’t get statistically significant results. Make sure you’re testing on hundreds or thousands of people, rather than just a few dozen. At a larger scale, the results you get will be much more impactful and useful.
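If you want a quick sanity check on whether the difference between two variants is likely real or just noise, a standard two-proportion z-test will do it. Here's a minimal sketch using only Python's standard library (the counts are hypothetical):

```python
from math import erf, sqrt

def two_proportion_pvalue(successes_a, total_a, successes_b, total_b):
    """Two-sided p-value for the difference between two rates (z-test)."""
    rate_a, rate_b = successes_a / total_a, successes_b / total_b
    pooled = (successes_a + successes_b) / (total_a + total_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (rate_a - rate_b) / std_err
    # Convert the z-score to a two-sided p-value via the normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical results: variant A opened 180 of 1,000 sends, variant B opened 230 of 1,000
p_value = two_proportion_pvalue(180, 1000, 230, 1000)
print(f"p-value: {p_value:.3f}")  # roughly 0.006, so the gap is unlikely to be pure chance
```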

Run A/B Tests for a Long Enough Duration

To capture reliable data, you’ll need to let your email campaigns brew for a bit. Don’t just run them for one or two days and think you’re done. Let them go for weeks or months and watch the results roll in over time.

Correctly Interpreting Insights from the A/B Test Results

In A/B email testing, it's easy to fall into the common "correlation equals causation" fallacy. Don't immediately assume that because X happened alongside Y, X is what caused Y. Look more deeply at your results and be sure before drawing conclusions from them.

Avoid Bias

This relates to "segmenting your list the smart way" above. You need to be sure that you’re not using one segment that is biased in one way or another. That means paying close attention to your segmented lists and ensuring that they are as equal to one another as possible.

Use Phone Calls to Bolster Your Results

If you follow up email campaigns with a call, you can use that call as a chance to get more data on why they responded (or didn’t respond) to your emails. If they opened it, you could ask, “I’m sure you get hundreds of cold emails each day. I’m curious, why did my cold email reach you? What piqued your interest? Why did you decide to respond to it—what did you like about it?”

This will give you more qualitative results than quantitative, but you can use these anecdotes to fuel new ideas, insights, and strategies that pure numbers can’t provide. Add a "notes" column to your A/B testing results spreadsheet and jot some notes down from these conversations for future use.

Email A/B Testing Helps You Learn, Grow, and Get Unstoppable Results

Email A/B testing is like a never-ending science experiment that gradually sharpens your marketing or outbound sales efforts. With the incorporation of valuable outbound sales tools, this testing becomes even more impactful, enabling you to continuously enhance your approach.

It’s a true use case of "continuous improvement" that you can build on for greater optimization over time.

To get real, impactful data that can help you grow your business, start using email A/B testing as soon as you can.

→ Need some ideas for your email campaigns? Check out our free Sales Email Sequence Templates and get 47 pre-formatted email sequences you can plug right into your CRM today.

CHECK OUT OUR SALES EMAIL SEQUENCE TEMPLATES→