The best tools for A/B testing

If you monitor job sites like Workana or Indeed and check the skill requirements for digital marketers, you'll see that experience in A/B testing is something most employers seek. The idea of A/B experiments is indeed tempting: put forward an optimization hypothesis, test it, and scale the best results. However, only about one in eight tests shows any sign of a real breakthrough. To find that needle in a haystack, both the preparatory stage and the choice of A/B testing tools need careful consideration.

Importance of A/B testing

Say you run email marketing campaigns for an online clothing store. You use well-performing templates for various occasions. For example, when new merchandise arrives, your eCommerce site needs new categories and matching visual elements; whether it's a mid-season sale or a holiday giveaway, the way products are presented has to change. Then, at some point and with sound arguments in mind, you hypothesize that the campaigns' CTR might increase if you change the placement of the "Sale" sticker.

Testing the impact of different visuals on click-through rate

“Testing the impact of different visuals on click-through rate”, source

The hypothesis may stem from previous experience, successful competitors' benchmarks, or general best practices. Either way, it remains subjective: no one can guarantee the outcome. Before scaling out the change, you should prove that it doesn't hurt the campaign's ROI.
The mathematically sound way to reduce the risk of failure is to examine changes in small groups and, if they succeed, extrapolate them to a broader audience divided into two segments: 1) the principal "A" segment and 2) the experimental "B" segment.

The B segment receives a modified version of something, in our example a newsletter with the changed label location, while the A segment continues to receive the original version.

Track CTR metrics for both segments, and once the experiment ends, compare the overall average results.
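The mechanics above can be sketched in a few lines of Python. This is a hypothetical illustration only: the recipient list and the click counts are made-up stand-ins for real campaign data.

```python
import random

def split_audience(recipients, seed=42):
    """Randomly split a recipient list into equal A and B segments."""
    shuffled = list(recipients)
    random.Random(seed).shuffle(shuffled)  # fixed seed for reproducibility
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]  # A gets the original, B the variant

def ctr(clicks, deliveries):
    """Click-through rate: clicks divided by delivered emails."""
    return clicks / deliveries if deliveries else 0.0

recipients = [f"user{i}@example.com" for i in range(1000)]
segment_a, segment_b = split_audience(recipients)

# After the campaign, compare the averages (click counts are illustrative):
ctr_a = ctr(clicks=52, deliveries=len(segment_a))   # original newsletter
ctr_b = ctr(clicks=68, deliveries=len(segment_b))   # modified "Sale" sticker
print(f"A: {ctr_a:.1%}  B: {ctr_b:.1%}")
```

Shuffling before splitting matters: assigning recipients in signup order, for example, would bias the segments toward older or newer users.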

Compare overall average results of A/B Tests

“The graph shows a distribution of individual values from “A” and “B” segments around the average”, source

To judge whether the result is reliable, that is, whether the difference in CTRs is genuinely meaningful, you need to measure the overlap between the two distributions at a 90%, 95%, or 99% statistical significance level.

Relevant results of A/B Tests

“The smaller the intersection area, the more reliable the results.”, source

Neglecting statistical significance can lead to misinterpretation of A/B test results. You should also take into account the absolute size of each segment, because data collected from 100 people and from 10,000 people may vary significantly.
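To make the significance check concrete, here is a minimal sketch of a two-proportion z-test in pure Python. The click and sample counts are invented for illustration; in practice a library such as SciPy or statsmodels would do this for you.

```python
import math

def two_proportion_z_test(clicks_a, n_a, clicks_b, n_b):
    """Return the z-score for the difference between two CTRs."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)          # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative numbers: 10,000 recipients per segment
z = two_proportion_z_test(clicks_a=300, n_a=10_000, clicks_b=360, n_b=10_000)

# Critical z-values for a two-sided test at the common confidence levels
for level, z_crit in [(0.90, 1.645), (0.95, 1.960), (0.99, 2.576)]:
    verdict = "significant" if abs(z) > z_crit else "not significant"
    print(f"{level:.0%}: {verdict}")
```

Note how the segment size enters the formula through `n_a` and `n_b`: the same 0.3-point CTR difference that is significant at 10,000 recipients per segment would not be at 100.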

A/B testing tools

We examine the outcome of tests with respect to three principal concerns:

  • impact on conversions
  • value of business metrics, such as the average size of a shopping cart
  • behavioral factors, such as average session duration, depth of site interaction, event-based clicks, etc.

Depending on the purpose, individual experiments change considerably. Hence, there’s no all-purpose formula on how to conduct A/B tests effectively. To increase your success chances, we’ve gathered a list of the most helpful tools that are indeed worth trying.

Google Optimize

Google Optimize

Google Optimize is a free tool for testing a website's UI/UX on both desktop and mobile. You can conduct all sorts of A/B tests on landing pages, in particular:

  • add or hide elements, such as pieces of content, images, buttons
  • establish and compare different layouts
  • probe various web designs

To launch a test, add the Google Analytics, Google Tag Manager, and Optimize tracking codes to your target website. We also highly recommend setting up goal and event tracking. After you create your account, choose among four standard experiment types:

  • A/B test - to display from 2 to “n” variants of the same web page
  • Multivariate test - to vary two or more elements within a particular web page
  • Redirect test - to compare entirely different page designs hosted at separate URLs
  • Personalization - for customized landing pages

You will have to develop alternative versions to compare against the original. For example, you can change colors, create distinct URLs, manipulate content, or edit HTML and external JavaScript.
To target the audience, use one of the ten targeting options: Audience, Behaviour, GEO, Technology, and others. A cool thing about Optimize is that you can assign traffic weights to different variants, which helps you manage traffic and interpret users' reactions better.
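Under the hood, weighted traffic splitting of this kind boils down to a weighted random choice. Here is a rough sketch; the variant names and weights are hypothetical, not anything Optimize prescribes.

```python
import random

def pick_variant(weights, rng=random):
    """Pick a variant name according to its traffic weight."""
    names = list(weights)
    return rng.choices(names, weights=[weights[n] for n in names], k=1)[0]

# Hypothetical setup: send 70% of visitors to the original page,
# 20% to variant B, and 10% to a riskier variant C.
traffic_weights = {"original": 70, "variant_b": 20, "variant_c": 10}

counts = {name: 0 for name in traffic_weights}
for _ in range(10_000):
    counts[pick_variant(traffic_weights)] += 1
print(counts)  # roughly proportional to the weights
```

Uneven weights let you limit exposure to a risky variant while still collecting enough data on it to draw conclusions.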

Pros: simple interface, free to use, easy to sync with Google's other web-analytics products.
Cons: not suitable for A/B testing email content.
Cost: free.

Snov.io

Snov.io

Email marketers will enjoy how easy it is to launch drip campaigns and experiment with different messages. Instead of planning email delivery and analyzing results manually, you can create personalized drip email sequences by setting the relevant triggers and goals.

Delivery trees and different email content effectiveness

“Build email delivery trees and test different email content effectiveness with the help of Snov.io”, source

Eliminating the influence of external factors leads to more reliable A/B test results. One such factor is time, so the reactions of the "A" and "B" segments must be measured simultaneously.
Branch recipients into several groups and schedule the delivery. Then create distinct drip logic for every group: develop different messages, designs, or visuals. To get more value from the A/B experiment, try enriching the scheme with triggers; for example, plan different scenarios for users who opened the email right away and for those who didn't open it at all.
You can track a campaign's performance all the way through, and once it's finished, check the analytics section for insights on improving performance.

Pros: email A/B testing, email finder and email verification features, automated email scheduling, integration via REST API.
Cons: can't be used for website A/B experiments.
Cost: subscription plans start at $33 per month; unlimited recipients and users plus 100,000 credits cost $482 per month.

Optimizely

Optimizely

The service is dedicated to A/B tests. It supports integration with Google Analytics and offers endless opportunities for multivariate experiments.
Optimizely is convenient if you want to run tests with images: it offers image hosting and a visual editor. The service supports segmentation and GEO targeting as well.
Analysis enthusiasts will be satisfied with the extensive capabilities the tool offers:

  1. customized reports
  2. statistics
  3. heatmap integration
  4. traffic distribution
  5. and more

Pros: conversion and interaction-time tracking, audience segmentation, targeting
Cons: doesn't allow tracking of traffic sources and keywords
Cost: both free and paid subscription plans are available

Adobe Target

Adobe Target

Adobe Target is a cloud solution for multi-channel A/B tests. Its creators point out that isolated experiments may not be as meaningful as those conducted across different touchpoints with target customers. Testing across multiple dimensions at once may seem nearly impossible, but with Adobe Experience Cloud, it's not that difficult.
First, initialize a new test and add the destination web page URL. An easy-to-use visual editor opens, letting you reposition sections or edit content. You can customize experiment terms extensively by adding personalized rules.

Add experiences to your tests or create these from scratch

“Add past experiences to your tests or create these from scratch”, source

The library lets you work with audiences from previous tests instead of creating them from scratch every time. The idea behind reusing past test data is that you can share audiences to and from an analytics account instead of transferring the information manually.
The dashboard displays completion statistics as charts and tables with numeric data, which can be shared with teammates or executives. If you take the time to define particular conversion elements, you will see their relative impact on the final outcome.

Pros: omnichannel tests, AI-enhanced automation, advanced test customization options
Cons: not suitable for email tests
Cost: two packages, Adobe Target Standard and Adobe Target Premium. The service offers flexible licensing, so for exact prices, contact the sales team.

VWO

VWO

The software lets you run A/B, multivariate, and split-URL tests. It has a handy interface, particularly for making various kinds of modifications to web pages. What's interesting about VWO is that you can run tests with pre-designed widget templates, which is especially relevant for small budgets or A/B newbies.
The program also helps you decompose hypotheses and extract the elements that had the highest impact. You can therefore refer back to previous results at any moment and always factor them into future tests.

Test component performance

“See how different components of the test have performed”, source

Another cool thing is that you can see what audiences see in real time, on all devices and web browsers. The real-time data helps you react to a possible problem with remarkable speed.

Pros: a full customer-journey building tool, SmartStats for interpreting results, heatmaps, and screen recordings.
Cons: for landing-page A/B tests only.
Cost: paid subscriptions start at $199 per month; a free trial is available as well.

Apptimize

Apptimize

Like Adobe Target, this is a cross-platform solution for A/B tests. It is equipped with the must-have features of any self-respecting service:

  1. segmenting and targeting filters
  2. variant editing tools
  3. cross-channel change synchronization

Apptimize promotes an integrated approach to implementing changes via its feature-flagging option. As a result, experimenters can be sure users see consistent information across all of the company's online resources.

Pros: feature flags, cross-platform change control
Cons: relatively few experiment-adjustment options; applicable to websites only
Cost: two paid subscriptions, the Advanced and Enterprise plans; users can start with the "Standard" plan for free

Kameleoon

Kameleoon

This service declares three quality hallmarks:

  1. built-in KPIs and Segment tools - for better teamwork
  2. prevention of the Flash of Original Content effect - through Kameleoon's unique technology
  3. a WYSIWYG editing tool for experiments within varying coding environments

A helpful option is automated traffic distribution: as soon as a "winning" version emerges, the system automatically forwards the remaining traffic to that option.
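Dynamic traffic distribution of this sort is usually built on a bandit-style algorithm. Below is a minimal epsilon-greedy sketch of the general idea; it is not Kameleoon's actual (proprietary) algorithm, and the conversion rates are invented.

```python
import random

def epsilon_greedy(stats, epsilon=0.1, rng=random):
    """Mostly route traffic to the best-converting variant,
    but keep exploring alternatives with probability epsilon."""
    if rng.random() < epsilon:
        return rng.choice(list(stats))  # explore a random variant
    # exploit: pick the variant with the best observed conversion rate
    return max(stats, key=lambda v: stats[v]["conversions"] / max(stats[v]["visits"], 1))

def simulate(true_rates, visitors=5000, rng=random):
    """Simulate routing visitors to variants with the given true rates."""
    stats = {v: {"visits": 0, "conversions": 0} for v in true_rates}
    for _ in range(visitors):
        v = epsilon_greedy(stats, rng=rng)
        stats[v]["visits"] += 1
        if rng.random() < true_rates[v]:  # simulated user reaction
            stats[v]["conversions"] += 1
    return stats

# Hypothetical variants: B truly converts better than A
stats = simulate({"A": 0.03, "B": 0.06})
print({v: s["visits"] for v, s in stats.items()})  # traffic typically shifts toward B
```

The trade-off versus a classic fixed split is that a bandit wastes less traffic on losing variants, at the cost of a less clean statistical comparison.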

Pros: dynamic traffic distribution, QA simulators, more than 25 segmentation criteria, 6 interface languages
Cons: for landing pages only
Cost: not provided

Conclusion

A/B testing is such a comprehensive process that the word 'test' may mislead some into considering it nothing more than the few simple steps of any basic experiment. After all, conversion rates are not easy to improve, and there is hardly a better method. Metrics like reduced bounce rate or time spent on site contribute to the success of any eCommerce business, and you can also use them to verify the quality of your customer engagement. Combined with consistently good content, A/B testing can reveal the best way for meaningful content to reach an appreciative audience.

A/B testing is strictly tailor-made, even though its experimental variables are simple and easy to change. Given its complexity, gathering the data is the easier part; beyond the analytical data, we need a trusty set of tools, and we hope the selection above benefits your ongoing pursuit of innovation.
This article touches on only some of the essentials of A/B test planning and execution. In the end, a piece of software is nothing but a dormant executor of routine tasks: a machine can do accurate calculations and provide vivid reports and tables, and with methods like no-code automation, even non-developers can run and manage a test environment.
However, whatever the nature of your test, its most valuable part is the deep analysis done by humans. Despite the undeniable power of AI, A/B testing is no exception to this rule.

Author

Stefani Stefanova | QA Engineer

Stefani loves bugs. She is a dedicated Quality Assurance Engineer determined not to let any flaw through her tight spider net. When she tests, there is no good or evil, nor the slightest chance of missing out. She discovers software flaws in places where you wouldn't expect them to be.