
The Best A/B Testing Tools on the Market, Ranked by Growth Marketing Experts

A/B testing is a core skill growth marketers bring to a marketing team — but doing it well takes the right strategy and tools. We asked experts why and how they test.

Let’s say you need to choose something major, like the pricing model for a new product, or something smaller, like the size of text on a mobile prompt.

How do you make that business decision? 

  A. Talk it through with the team.
  B. Look at what your competitors are up to.
  C. Test a few options and wait for results.

A lot of companies go with option A.

“Most companies make decisions based on the loudest or most charismatic person in the room,”  Delivering Value founder Andrew Capland told MarketerHire. “And they don't always know the right answer.”

So, who does know the right answer? 

Your customers. 

“A/B testing is a great way to use the data and let your customers make the decision,” Capland said. 

How digital marketers use A/B tests

A/B testing isn’t unique to digital marketing. In fact, agricultural researchers developed what we now think of as A/B testing in the 1920s, to experiment with fertilization methods and crop placement.

Since the 1990s, though, the term “A/B testing” has been associated with the digital realm, and especially with marketing. 

An A/B test, sometimes called a split test, is a randomized experiment. Basically, you have two versions of something — a website layout, a button, a call to action — that we’ll call versions A and B. One group of randomly selected users experiences A, while another randomly selected group experiences B.

Then you look at which group has a higher conversion rate. 
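
Under the hood, the “random” split is usually deterministic: a hash of the user ID picks the bucket, so a returning visitor always sees the same version. Here’s a minimal sketch in Python (the function name and the 50/50 split are illustrative, not any particular tool’s implementation):

```python
# Deterministic traffic splitting: hash the user ID so each visitor is
# assigned at random, but consistently, to one variant.
import hashlib

def assign_variant(user_id: str, experiment: str = "homepage-test") -> str:
    """Return "A" or "B" for this user, stable across repeat visits."""
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # pseudo-random bucket from 0 to 99
    return "A" if bucket < 50 else "B"  # 50/50 split

print(assign_variant("user-42"))  # the same user always gets the same answer
```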

“The purpose of A/B testing is to figure out exactly what’s going to work for your audience,” growth marketer Katelyn Glass told MarketerHire. 


It’s central to modern digital marketing. In fact, it’s what takes marketing out of “the Mad Men days” and into the present, Glass said. 

Back in the 1960s, “you would pay a million dollars to get your products in a magazine spread, but you would never really know the results,” she explained. 

It was hard to precisely measure offline conversions. But online, where more and more transactions take place, it’s a different story.

“Now, I can concretely go to a creative director and say, this ad campaign didn’t work, and I know that because [another campaign] worked a lot better,” Glass said. 

Nine times out of 10, in digital marketing, A/B or split testing means trying out different versions or templates on your creative, she noted. For instance, you might test a product page with one image against a product page with three images, and see which one leads to more sales.


But you can A/B test nearly every aspect of digital marketing efforts — Facebook ad timing, email sender names, or user experience on the website. 

Ultimately, you’re optimizing for user behavior, Capland said. “A growth marketer is interested in doing A/B testing to drive more installs or sign-ups, or some number of people doing something.” 


So how do you do that? With A/B testing software. It’s key to a digital marketer’s tech stack — and that’s especially true for growth marketers. To find out which tools are best, we reached out to the experts.

Meet the experts

  • Katelyn Glass, founder and managing partner of e-commerce and marketing agency Fifty Six, and former COO of Rowing Blazers
  • Andrew Capland, founder of Delivering Value and former head of growth at Postscript
  • 20+ growth marketers in MarketerHire’s network

When — and why — you should run A/B tests

At every stage of a company’s growth, A/B tests are valuable for validating business decisions that will impact multiple teams or workflows in the long term. 

In fact, our experts recommend making A/B testing a regular practice. 

“Consumer behaviors change, product mixes change, trends change — so your test results will, too,” said marketing consultant Rose Mayo. “Determine a good cadence for core tests and rerun them at least once a year.”

Here are a few key situations where you’ll know it’s time to A/B test:

  • Your conversion rates are low. If you’re getting a ton of traffic, but not a ton of installs, “that’s a great time to start doing some experimentation to figure out why,” Capland said.
  • You’re running the same creative on every social platform. “The creative you show on email [should be] different from your creative on paid social,” Glass said. But you have to test to find out which font size or photography style is right for each platform and target audience.
  • Your decision-making process stops at asking the founder what to do. You should stress test founders’ hot takes before putting budget behind them. 
  • You’re launching something new. “An A/B test is super valuable to validate things before you invest too deeply,” Capland said. Before you spend weeks in product development, test a minimum viable product to figure out if further investment is worth it.
  • You want to achieve fast growth. “A/B testing is one of the easiest ways to double, triple, or 10X your results in a very short period of time,” Mara Beaman from Optimize Digital told MarketerHire. “High growth teams...are always testing.” 

But here’s the catch — you can’t test everything.

How to know if an A/B test is worth running.

Especially at startups, you don’t want to overwhelm your team. So it’s not worth running a test to find out if customers prefer a lavender header or a periwinkle one. 

The user base is typically small at early-stage companies, so results could take 90 days — or longer! — to shake out. And the result could easily be that it doesn’t really matter.
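
To see why, here’s a back-of-the-envelope sketch with hypothetical numbers. Assume a power calculation (like the one sketched later in this piece) says each variant needs several thousand visitors:

```python
# Back-of-the-envelope test duration at early-stage traffic levels.
# All numbers are hypothetical.
needed_per_variant = 6_743  # visitors each variant needs for significance
daily_visitors = 150        # early-stage site traffic, split 50/50
days = needed_per_variant * 2 / daily_visitors
print(f"~{days:.0f} days")  # ~90 days before the test can be called
```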

A good rule of thumb: “At a small company, when you’re doing a million other things, A/B testing should be [around] 15-20% of [a growth marketer’s] time, and probably not more,” Capland said.

Before developing a test, Capland recommends thinking about the following two questions: 

  • Are you testing something potentially permanent, like a sign-up form, a home page, or a product offering? 
  • Are you hoping to learn something that could impact other projects, other teams, your pricing, or your messaging?

If the answer is no to both, you’re probably just looking for a tiebreaker. That could be a short, low-lift A/B test — or, honestly, a game of rock paper scissors. 

If the answer is yes to either question, though, you want to be pretty confident that “what you’re learning is what you think you’re learning,” Capland said. 

That means you shouldn’t run a test unless you have enough time and user volume to achieve some level of statistical significance.

Statistical significance in less than 200 words.

If you find yourself wondering, “Was it the bigger button that made more people convert when they saw Version A — or was it chance?” you’re really asking, “Was this result statistically significant?”

Statistical significance is basically a measure of certainty about a test result. It grows as...

  • The performance gap between Versions A and B increases, and/or...
  • Your sample size increases

You can never be sure a result wasn’t chance, but the probability that it was is ideally 5% or lower. (In other words: your test’s p-value is 0.05 or lower.) 
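
For the curious, the arithmetic behind that threshold is a standard two-proportion z-test. Here’s a minimal Python sketch with made-up traffic numbers; it’s the textbook formula, not any particular tool’s stats engine:

```python
# Two-proportion z-test: is the A/B conversion gap plausibly just chance?
from scipy.stats import norm

def ab_test_p_value(conv_a: int, visitors_a: int,
                    conv_b: int, visitors_b: int) -> float:
    """Two-sided p-value for the difference in conversion rates."""
    p_a, p_b = conv_a / visitors_a, conv_b / visitors_b
    # Pooled rate under the null hypothesis that A and B convert identically
    pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = (pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * norm.sf(abs(z))  # chance of a gap this large, either direction

# Made-up example: 200 of 5,000 visitors convert on A; 260 of 5,000 on B
print(f"p = {ab_test_p_value(200, 5_000, 260, 5_000):.4f}")  # ≈ 0.004, under 0.05
```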

You shouldn’t let tests with weak statistical significance sway major business decisions. If you do, you’re still making an emotional decision — it’s just data-flavored. 

The ideal A/B testing approach.

You should run A/B tests regularly — and strategically. They’re particularly useful when…

  • You’re looking for a tiebreaker on a low-stakes decision
  • You’re looking for guidance on a major business decision, and you have the time and user volume to achieve statistically significant results

They can be overused, though. 

“Sometimes marketers use statistical significance as a crutch to prevent them from just making important decisions and going with their gut,” Capland said. 


When you’re deciding between an option that’s aligned with your brand values and one that isn’t, you don’t need to test to know what to do.

Starting an A/B test is as easy as 1, 2, 3

“A/B testing is a process, not a platform,” as growth marketer Aaron Patrick Doherty poetically put it to MarketerHire. 

It’s not unlike the scientific method — so it takes some critical thinking and technical know-how. 

Here are the steps your team should take to run a successful A/B test. (Spoiler alert: There are things to do before you even open up your testing platform!) 

1. Gather qualitative data. 

Capland starts his A/B testing process by “talking to people, doing some user testing, maybe even running a survey.” 

Qualitative data from these types of efforts can help you decide what to test. 

For instance, before users can unsubscribe from its SaaS product, Ahrefs asks them why they’re leaving. Common themes in responses to questions like this offer insights into the customer experience.

If a lot of users think your product is confusing, test a new FAQs module. If a lot of users think your multipage intake form is too long, test a shorter one.

2. Brainstorm possible solutions and set priorities.

It’s time for a team meeting. Start with a generative question, Capland suggested. For example: How could you solve a problem that surfaced in your qualitative data?

Using what you know about your product and your users, think through what kind of solutions would make the biggest impact. 

In his brainstorms, Capland emphasizes quantity over quality. He sometimes encourages his team to come up with 30 possible solutions. 

3. Start test development. 

Once you’ve decided on viable options to test, you get to open up your software and start building a test. 

Most tools these days are set up as no-code visual editors, which means marketers can set up tests without a developer’s help. Through a series of prompts, you simply define:

  • What you want to test. You might test two possible placements for a mobile app download link, or two different styles of photography on your home page. With visual editors, you simply drag and drop these components into a new spot for testing. 
  • What success looks like. What kind of customer or site-visitor behavior are you aiming for? This might be clicks, downloads, form-fills — you name it. 
  • How long the test will run. This is usually defined by the number of people you want to sample or the amount of time you want to spend testing — and whether you want to reach statistical significance or make a quick decision. 

Then, “[you] flip the test on and let it do its thing,” Capland said. 
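
On the “how long” question: most tools estimate the required sample size for you, but the underlying math is a standard power calculation. A rough Python sketch, with a hypothetical baseline conversion rate:

```python
# Rough sample size per variant to detect a lift in conversion rate
# (95% confidence and 80% power, the conventional defaults).
from scipy.stats import norm

def sample_size_per_variant(baseline: float, lift: float,
                            alpha: float = 0.05, power: float = 0.80) -> int:
    """Visitors needed in EACH variant to reliably detect `lift`."""
    p1, p2 = baseline, baseline + lift
    z_alpha = norm.ppf(1 - alpha / 2)  # ≈ 1.96
    z_power = norm.ppf(power)          # ≈ 0.84
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return int((z_alpha + z_power) ** 2 * variance / lift ** 2) + 1

# Detecting a 4% -> 5% conversion improvement:
print(sample_size_per_variant(0.04, 0.01))  # ≈ 6,700 visitors per variant
```

Note the denominator: halve the lift you want to detect and the required traffic roughly quadruples, which is why the “be ambitious” rule of thumb below is more than a pep talk.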

Some general rules of thumb when A/B testing:

  • Be patient. “Don't expect instant gratification from any tool,” said John Paul Mains, chief scientist at Click Laboratory.
  • Be ambitious. “What you learn over time is that small tests equal small results,” Capland said. If you focus on testing subtle tweaks, you’re less likely to see statistically significant results. 
  • Be open-minded. “Visitor responses to your tests are going to surprise you,” Mains said. 
  • Be realistic. No tool will automatically revolutionize your conversion rate optimization (CRO). “You'll still need human resources and a lot of time and investment to make optimization work,” said Alex Birkett, co-founder of Omniscient Digital.  

The 4 best native A/B testing tools

There are a lot of A/B testing tools on the market. A survey of growth marketers in our network revealed 25 different options — from native tools to standalone ones. 

For instance, a lot of growth marketers use HubSpot — an all-in-one CRM — to run email marketing A/B tests, and Unbounce or Instapage — end-to-end landing page builders — to A/B test conversions. 

Let’s start by focusing on native tools. 

If your team isn’t quite ready to invest in a standalone A/B testing tool, “there may be native [tools] built into your current marketing technologies that you can leverage to get the job done,” Beaman said. 


In fact, in our survey, growth marketers named 18 email marketing, CRO, paid social, and other specialization-specific tools as their favorite A/B testing solutions. Some use native tools exclusively for website optimization or as email optimization platforms. They eschew standalone tools entirely. 

Here are a few native tools our growth marketer experts recommend: 

Unbounce for building landing pages.

Pricing: starts at $80 per month

Ideal use case: testing mobile optimization and targeted pop-ups while building out new web pages. Unbounce is popular among growth marketers — and recommended almost as much as the standalone tools below.

“I feel [it] gets overlooked,” said marketing consultant Christian Betancourt, who called Unbounce “extremely cost-effective for brands.”

Facebook Ads for paid social.

Pricing: depends on campaign spending limit

Ideal use case: testing landing pages for social ads and trying out new segmentation strategies. “Facebook Ads has some A/B testing for campaigns and ad sets, so you could in theory test two landing pages,” Beaman said. 

HubSpot for email marketing.

Pricing: included with the free CRM tool

Ideal use case: testing subject lines on emails. “You can easily A/B test emails and subject lines with no dev required,” said growth marketer Kasey Bayne.

Pro tip: If you want to go beyond the capabilities of HubSpot for A/B testing emails, MarketerHire’s lifecycle marketing manager Karina Vitale recommends Iterable. “Their testing tool has so many options,” she said — you can even use it to test email automations, unlike HubSpot.  

Yieldify for customer journey optimization.

Pricing: depends on traffic levels

Ideal use case: testing how overlays are performing on your website, and what their benefit is. Yieldify’s algorithm makes it easy to track whether the cost of a discount pop-up is worth the revenue, Glass said. 

The 3 best standalone A/B testing tools

The above tools are built into channel-specific platforms — but now let’s talk standalone tools, designed for A/B testing across all channels.

We polled growth marketers in our network to find out which A/B testing tools they recommend, and let them recommend as many (or few) as they wanted. We learned about growth marketers’ pet favorites, from AB Tasty to Adobe Target, Sitespect to Monetate, and Kameleoon to Omniconvert. But three tools stood out as the most frequently recommended. 

Here are the standalone tools the A/B testing pros recommended most often:

[Chart: the most-recommended standalone A/B testing tools in our survey. Source: MarketerHire]


Let’s dig into the pricing, key features, and ideal use cases for the three most-recommended tools: VWO, Google Optimize, and Optimizely.

Google Optimize.



Pricing: free. (“Free is hard to beat,” as Mains said.)

Ideal for: a small team.

Biggest selling points:

  • Simplicity — Google Optimize “doesn't require any custom integrations,” Birkett said. “Just plug and play with Google Ads and Google Analytics.” 
  • Easy upgrades — If your team moves beyond what Google Optimize offers and needs things like audience optimization, multivariate testing or more full stack experiment functionality, upgrading to the paid version, Optimize 360, is pretty simple. 

Biggest downside: Google Optimize is fully free, but that means you can only run so many tests at a time. “When you’re ready to scale your testing program, typically there’s limitations there,” Capland said.

VWO.


Pricing: free 30-day trial, then custom pricing, starting around $4,000 per year. 

Ideal for: brands that need more capabilities than Google Optimize. For instance, VWO makes it easy to change headlines and website copy directly from the software, growth marketer Kevin Shields at Market Your Customers told MarketerHire, while “Google Optimize is a bit more limited.” 

Biggest selling points: 

  • On-site and in-product testing — “Having the ability to run experiments in multiple areas of the funnel gives me more leverage,” Capland said. 
  • Streamlined scripts —  “Their platform is very fast,” said Mains, “and does not impact your overall site SEO with a lot of bloated scripts.” 
  • Qualitative data storage — VWO lets users store qualitative data as well as quantitative test results. As your team grows, having all that data in the same place is a time-saver. 

Biggest downside: “On VWO, things are organized weird,” Capland said. For example: A/B tests live in a different part of the tool than multivariate tests. Rummaging through a bunch of different folders for results from related but differently-structured tests is a time suck, not a time optimizer.

Optimizely.


Pricing: starts around $60,000 annually, though they don’t post pricing on the site and did not respond to our request for comment. 

(“Optimizely is the best,” Capland said, but “they’re priced so outrageously enterprise that I haven’t used the tool in multiple years.”)

Ideal for: enterprise clients and brands that need lots of custom server-side and client-side tests made by developers. 

Biggest selling points: 

  • Stats engine — Optimizely’s machine learning engine helps testers understand exactly when a test result becomes statistically significant. It can also prioritize the metrics you care about and track multiple metrics for each test, Capland said. Think monitoring both video plays and contact button clicks on the same landing page test, as Optimizely suggests.
  • Warp speed test integration — Optimizely says its frontend median load time is 50 milliseconds, “faster than the blink of an eye,” or basically real time. We couldn’t verify that, but Capland said tests load and run faster than on any other platform. So you probably won’t get customer complaints about flicker. (More on that below.)
  • Code base access — Like other tools, Optimizely has a no-code solution. But if you’re trying to run complex or customized tests, you or developers on your team can easily access the code base.

Biggest downside: Optimizely isn’t designed specifically for marketers. It’s an all-purpose solution for developers, marketers and beyond. 

“If you’re a marketer you can make it work, but at their price point, there are much better solutions tailored to a marketing use case,” said demand generation manager Mike K. Tatum. (His favorite is VWO.) 

How the standalone A/B testing tools stack up

A lot of A/B testing solutions (including those we’ve named above) do the same thing. Let’s look at the key differences between the three most-recommended ones. 

VWO vs. Optimizely. 

Winner: VWO

The key differentiator: price

When Capland chose VWO for his company, Optimizely was the only other true contender. The decision came down to price, and VWO won out by a mile. 

Between the two, the testing capabilities and usability are very similar, but VWO is cheaper, Capland said.

Another advantage that sets VWO apart: growth marketers in our network said that VWO feels like it's specifically built for marketers, with a qualitative data recording feature that’s extra useful for growing marketing teams. You don’t have to speak CSS to make full use of VWO.  

VWO also offers some extra features that are particularly useful to marketers, like heatmaps and visitor recordings within the platform. Optimizely does have a heatmapping integration, but it requires configuring a third-party tool, Crazy Egg.

Optimizely isn’t bad — but it’s “not 10X better than other products like you'd think” based on the price, Birkett said. 

Google Optimize vs. VWO.

Winner: Google Optimize

The key differentiator: price and ease of setup

If your company is just getting started with CRO and already uses Google Analytics or Google Tag Manager to measure site traffic and user behavior, Google Optimize is an intuitive way to get started with testing and increase conversions. 

Google’s tool offers simple A/B, multivariate and redirect test variations, as well as personalization tools — and most startups don’t need more than that, Swappie’s growth marketing manager Vlad Ihlinskyi said. 

“So many people trying to build their company go for expensive solutions,” said Michael Cahill from Cahill Consulting. “But if you don't know how to use them, you're just wasting money.”


Plus, if you’re just building your company and don’t have many users, you can’t run complex tests and get meaningful results.

Google Optimize is a great tool for standing up your testing process. VWO came up most frequently as the next, more advanced, step.

How to evaluate emerging A/B testing tools

Most of the existing tools operate similarly, but the ones our experts really recommend have a few non-negotiables.

If you go with a tool that’s not on our list, make sure it has the following required features (and the optional ones, if you need them!).

No-code capabilities.

Status: Required

All of the above tools can be used without advanced coding skills. That’s important, since coding isn’t a required skill for growth marketers.

A/B testing tools should all mention “visual editors” or “no-code capabilities” if they’re intended for marketers. 

Other testing solutions are built for developers and will take too much time and energy to configure. 

That said, when tests get super complex, the tool should allow some custom coding. For example, while it should be pretty low-lift to track installs on a thank-you page, building a test that tracks the LTV of customers who convert takes some customization — and that shouldn’t break your tool.  
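
As an illustration of why something like LTV tracking takes customization: you typically have to join the tool’s variant assignments with purchase data that lives elsewhere. Here’s a hypothetical sketch with pandas; the exports, column names, and numbers are all invented:

```python
# Hypothetical post-hoc analysis: average customer value by test variant.
import pandas as pd

assignments = pd.DataFrame({"user_id": [1, 2, 3, 4],
                            "variant": ["A", "B", "A", "B"]})
purchases = pd.DataFrame({"user_id": [1, 2, 2, 4],
                          "revenue": [30.0, 50.0, 45.0, 20.0]})

ltv_by_variant = (
    assignments.merge(purchases, on="user_id", how="left")
               .fillna({"revenue": 0.0})              # non-buyers count as $0
               .groupby(["variant", "user_id"])["revenue"]
               .sum()                                  # total spend per user
               .groupby(level="variant")
               .mean()                                 # average value per variant
)
print(ltv_by_variant)  # A: 15.0, B: 57.5; B's repeat purchaser shows up here
```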

A good fix for flicker. 

Status: Required

Flicker is, perhaps, a growth marketer’s biggest pet peeve. 

Let’s say you’re running a test on your product page. Sometimes, there’s a quick flicker. For just a moment, you see the A version of the page before the B version loads. 

“It happens quickly,” Capland said, “but not fast enough that people don’t see it, and it’s a huge frustration point.”


Flicker reveals just a little too much of how the sausage is made for most growth marketers’ taste. 

But no matter what tool you use, flicker could pop up, Capland said. In fact, both Optimizely and VWO accuse each other of causing more flicker in their marketing materials. (VWO blames Optimizely’s synchronous code, and Optimizely blames VWO’s asynchronous code.)

Whatever the root cause, there’s usually a quick HTML fix for flicker, Capland said. (Typically, a short snippet briefly hides the tested element until the variant has loaded.) But frequent complaints about flicker are something to watch for when choosing a tool.

Heatmapping, goal-tracking and more.

Status: Optional

Especially if you’re a one-person testing team, you’re probably on the lookout for user-friendly tools that will make your life simpler. As you’re shopping around, the experts we spoke to recommended a few bonus features that could simplify your tech stack. 

  • Heatmapping. Having heatmaps built into your A/B testing platform can eliminate the need for another tool, Capland said. 
  • High-level goal tracking. Some tools offer analytics as well as testing, and give you a sense of how your website converts “at the 10,000-foot view,” Capland said. 
  • User session recordings. This above-and-beyond qualitative data feature can help you understand how an individual user moved through your site. 

Test, test and test some more! 

There’s no one-size-fits-all A/B testing tool, but there is a one-size-fits-all tip for A/B testing: Try it! 

In competitive markets — or political races — it can give you the edge you need.

Just look at Barack Obama’s 2012 presidential campaign as an example. Obama for America, which has been credited with “revolutionizing” political marketing through its presidential campaigns, ran around 500 A/B tests before the election (using our friend Optimizely).

Those A/B tests — on decisions as small as whether or not to include a dollar sign on donation forms — drove a 49% increase in donations. 

And we all know how well that turned out for Obama in 2012’s close race.

If you need guidance on which tools to use — or how to use them — rigorously vetted growth marketers on MarketerHire’s platform can help. Talk to a marketer on our platform today.

Kelsey Donk
Kelsey Donk is a writer at MarketerHire. Before joining MarketerHire full-time, Kelsey was a freelance writer and loved working with small businesses to level up their content. When she isn't writing, Kelsey can be found gardening or walking her dogs all around Minneapolis.