A/B testing, also known as split testing, has long been established in digital marketing to optimize web pages and elements therein.
Rather than relying on intuition or personal preference, A/B testing changed the playing field by allowing digital marketers to scientifically evaluate conversion data for maximum ROI.
However, while digital marketers have rapidly adopted A/B testing tools with relative success, many publishers, particularly those in the news business, seem averse to automated optimization technologies.
This article discusses the fundamentals of A/B testing, the benefits of A/B testing for publishers, and how to address some of the related challenges.
What is A/B Testing?
Essentially, A/B testing is the process of testing two (or more) variations of the same page to determine which elements of the page are more effective in driving conversions.
A/B testing usually operates during a limited, pre-defined period, where users are allocated in equal numbers to Version A and Version B. Once this period of time is over, a winner is declared, and 100% of the audience is then sent to the winning variation.
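As a sketch of how a winner is typically declared, the two conversion rates can be compared with a two-proportion z-test once the test period ends. The snippet below is illustrative only, with hypothetical visitor and conversion numbers, and uses nothing beyond the Python standard library.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variation B's conversion rate
    significantly different from variation A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)       # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results after an equal 50/50 split of 10,000 visitors
z, p = two_proportion_z(conv_a=110, n_a=5000, conv_b=150, n_b=5000)
winner = "B" if p < 0.05 else "no significant winner yet"
```

With these made-up numbers the difference is significant, so all traffic would be routed to variation B; with a smaller gap or fewer visitors, the test would simply run longer.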
What can Publishers do with A/B Testing Tools?
There are two primary ways that publishers can use A/B testing to benefit their sites.
The first is similar to how digital marketers use A/B testing: identifying ad content with a higher CTR than other variations.
The second way it can be used is to test which content is deemed newsworthy by its audience.
Ultimately, both approaches lead to the same end goal: increased Revenue Per Mille (RPM).
To Optimize Ad Performance
A/B testing and multi-armed bandit solutions can, and arguably should, be used by publishers to optimize the ads appearing on their websites.
Elements that can be tested include the ad's location, size, and color. Publishers can even test different types of ad creative (images, text, or video) to determine what is most appealing to users.
To Improve Content Quality
Publishers can do all the ad testing in the world, but without quality content and an exemplary user experience driving the number of visitors to the site, all this time and energy will be wasted. This is where content testing comes into play. Publishers can use testing tools to determine which content type, amount, and delivery methods work for various audiences.
When to A/B Test
The best time to A/B test is often January, when digital ad spend is down after the festive rush. Depending on the industry, some publishers may find January busier than other periods, such as the end of the financial year (EOFY).
Regardless, the best time to conduct A/B testing is the month or quarter when marketing spend is lowest for the year. While A/B testing, the website will inevitably lose some ad revenue while publishers analyze the data.
Therefore, it is essential to minimize this loss. While RPM may decline somewhat while testing is being conducted on a page, A/B testing, when executed correctly, can deliver a statistically significant increase in advertising ROI over the long term.
Challenges Facing Publishers in A/B Testing
Despite case studies showing that A/B testing is essential to improve user experience and increase RPM, many publishers have yet to adopt this testing tool for page optimization.
While there are specific challenges facing publishing sites in A/B testing, many of these can be overcome.
The software most commonly used for A/B testing is often designed for digital marketers, making it difficult for website owners to use. Here are some of the limitations publishers encounter.
Inability to Track Ad Clicks
Sites working with an ad network are unable to track ad clicks, for two primary reasons:
- Most ad networks include the creatives in the form of an iframe that does not support tracking ad clicks.
- Many ad networks have a program policy that prohibits the use of analytics or software to measure ad clicks directly.
No Support for Automatically Creating Variations
Typically, news websites will want to show three ad units on a page.
There are typically six or seven options for the ad placement, two to three size options for each of these locations, and a further five or six options for the ad's color scheme. When considering all these variables, the test options for each ad spot can easily number in the hundreds. Currently, very few software options allow users to create these variations automatically, making the process arduous and time-consuming.
Smaller websites, however, could cut down the number of test options for each page to save themselves time and money.
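To put those numbers in perspective, the combinatorics can be sketched in a few lines of Python. The option counts below are the illustrative figures from the text above, not fixed values for any real ad network.

```python
from itertools import product

# Illustrative option counts for a single ad spot (from the text above)
placements = 7       # "six or seven options for the ad placement"
sizes = 3            # "two to three size options" per location
color_schemes = 6    # "five or six options for the ad's color scheme"

# Every combination of placement, size, and color scheme is one variation
variations = list(product(range(placements), range(sizes), range(color_schemes)))
print(len(variations))  # 7 * 3 * 6 = 126 variations for one ad spot
```

At 126 candidate variations per ad spot, and three ad units per page, creating and scheduling these tests by hand quickly becomes impractical, which is why automatic variation generation matters.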
There is no doubt that publishing sites are a very different beast than ecommerce pages. The goal of content sites is not to move inventory but rather to capture visitors’ attention.
It is only through curating content that drives traffic that news sites can monetize their content. Typically, the decisions around what makes a newsworthy piece of content have been placed solely in the hands of the editors, whose role it is to select, organize, and maintain engaging, relevant, and newsworthy articles.
The mere idea of incorporating automation into this role has created tension with journalists across the globe, who have seen such an implementation as a threat to both their jobs and editorial integrity.
When publishers A/B test, rather than evaluating content quality, they test headlines and images. This has led to a rise in stories featuring clickbait-style headlines over in-depth investigative pieces, a further point of contention for editors seeking to maintain journalistic standards.
While these concerns are understandable, they are possibly outdated and are holding sites back from maximizing their potential.
When used in tandem with human decision-making, insights arising from A/B testing can improve site content based on user behavior.
As we can see above, there are several considerations for websites looking to adopt A/B testing, particularly from an editorial standpoint. It is really all about striking a balance between what interests readers and quality content.
Towards Multi-Armed Bandit Solutions
As we move into an increasingly automated digital space, multi-armed bandit solutions may be the key to continued content optimization.
Multi-armed bandit solutions are a more sophisticated form of A/B testing that uses machine learning algorithms to dynamically allocate more traffic to the better-performing variation of a site, while the weaker variation receives less.
While multi-armed bandit solutions are more computationally complex than A/B tests, they work in real time, making them potentially faster and more cost-effective.
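As a rough sketch of the idea (not any particular vendor's implementation), Thompson sampling is one common bandit strategy: on each request, sample a plausible CTR for every variation from its observed clicks, then serve the variation with the highest draw. The click data below is hypothetical.

```python
import random

def thompson_pick(stats):
    """Sample a plausible CTR for each variation from a Beta
    distribution over its observed clicks/skips, then serve the
    variation with the highest draw."""
    draws = {name: random.betavariate(clicks + 1, skips + 1)
             for name, (clicks, skips) in stats.items()}
    return max(draws, key=draws.get)

# Hypothetical click data for two ad layouts after 1,000 impressions each
stats = {"A": (30, 970), "B": (55, 945)}

random.seed(7)
picks = [thompson_pick(stats) for _ in range(1000)]
share_b = picks.count("B") / len(picks)
# Layout B, the stronger performer, automatically receives most of the traffic
```

Unlike a fixed 50/50 split, the allocation shifts continuously as evidence accumulates, so less revenue is sacrificed to the losing variation during the test.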
With so many external factors impacting how publishers create and prioritize content, utilizing A/B or multi-armed bandit testing to provide a superior user experience and drive ROI is a significant way for sites to get the jump on their competitors.
If you’re making more than $2,000 in monthly ad revenue, contact us today to learn more about how Publift can help increase your ad revenue and best optimize the ad space available on your website or app.