A/B Test for Success

Most marketers today understand the value of conducting an A/B test, but many don’t know how to get optimal value out of it. Some test everything, too often. Others call their tests too early, rendering the results statistically insignificant. Worst of all, some don’t know what business question they are trying to answer or what goal they hope to achieve with the test.

Here I present a recent A/B test (client details removed) to demonstrate how to carry out an effective test, and the striking results it can produce.

I Stand With Israel Bumper Sticker Campaign

The challenge: A once very successful lead acquisition campaign, built around collecting snail-mail addresses, was plummeting fast. At the start of the campaign we were averaging 100 leads per month for our client; between December 2017 and February 2018, that number had dropped to closer to 30.

Results: December 2017 to February 2018

Source / Medium                              Sessions   Bumper Sticker Orders   Conversion Rate
Third-Party List 1                                903                      39             4.32%
Client Facebook                                 4,426                      23             0.52%
I365 Daily Inspiration Email                      360                       8             2.22%
Client Website Pop-Up / Pop-Under Ad               87                       8             9.20%
Breaking Israel News Pop-Up / Pop-Under Ad        272                       4             1.47%
Direct Hits                                        46                       3             6.52%
Google (Organic)                                   24                       3            12.50%
Total                                           6,715                      92             1.37%

As you can see from the table above, the average conversion rate was 1.37%, with 92 bumper stickers ordered over a nearly three-month period.
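
As a quick sanity check, the conversion rates in these tables are simply orders divided by sessions; a one-line illustration in Python (the variable names are mine, not the campaign’s tooling):

    # Conversion rate = orders / sessions, the formula behind every rate in this post.
    sessions, orders = 6_715, 92
    print(f"Conversion rate: {orders / sessions:.2%}")  # -> 1.37%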

The marketing team identified two factors that could be contributing to the decreased conversion rate.

First, stale creative. The client had been using the same creative for the entire campaign, and it was getting old. It was also very “traditional Jewish” creative, and we knew we needed to reach a more diverse audience.

Second, the “free” factor. Ads told users the bumper stickers were free, when in fact each order carried a $5 shipping and handling fee. Leads were costing so much that the $5 collected was negligible. We wondered whether offering the bumper stickers “truly free” would encourage more users to order.

In March, we decided to run two simultaneous A/B tests.

First, we tested a new creative package: we developed a new ad set and swapped the images on the old landing page so they matched the new ads.

Second, we tested a new “truly free” landing page. Users still arrived at this page through the old ad set, and its images mirrored the old landing page. Keeping each test to a single variable (creative in one, offer in the other) meant we could attribute any lift to the change itself.

This is how our results looked in March:

Variant       Views   Conversions   Conversion Rate
Original        438            10              2.3%
New Images    1,816            82              4.5%
Truly Free      454            67             14.8%

What we learned is that many more people clicked on the new creative ad set, leading to more page views and the highest raw number of conversions. However, the conversion rate on that page increased only slightly. The highest conversion rate belonged to the “truly free” page.
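
The introduction warned against calling a test too early, so it is worth checking that a lift like this is statistically significant. Below is a minimal sketch of a standard two-proportion z-test in Python, run on the March numbers from the table above; the function is illustrative, not the tooling the team actually used.

    from math import sqrt, erfc

    def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
        # Pooled two-sided z-test for a difference in conversion rates.
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        return z, erfc(abs(z) / sqrt(2))  # two-sided p-value

    # Original page (10 of 438) vs. "truly free" page (67 of 454)
    z, p = two_proportion_ztest(10, 438, 67, 454)
    print(f"z = {z:.2f}, p = {p:.1e}")  # z is roughly 6.6, p far below 0.05

A p-value that small means the “truly free” lift is very unlikely to be noise, so calling this test after a month was safe.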

The Opportunity: The new creative was outperforming the old creative by a landslide (1,816 views versus the mid-400s). The “truly free” page was outperforming the old page by more than 12 percentage points (14.8% versus 2.3%). Hence, we surmised that pairing the new creative with the “truly free” landing page would bring even greater success.

And we were right. The number of leads skyrocketed, and the cost per lead dropped by more than 95%.

In February (old creative, old landing page), we had only 11 leads in the whole month and each lead cost an average of $50.42.

In March (new creative paired with the old landing page, plus old creative paired with the new “truly free” landing page), we got 32 leads at an average of $10.79 per lead.

In the first week of April (new creative to new landing page), we got 79 leads – yes, in one week! – at a cost of $2.38 per lead.
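
That “more than 95%” drop checks out against the quoted costs per lead; a quick illustrative calculation in Python (again, the variable names are mine):

    # Cost per lead: February (old/old) vs. first week of April (new/new), as quoted above.
    feb_cpl, apr_cpl = 50.42, 2.38
    print(f"Cost per lead fell {(feb_cpl - apr_cpl) / feb_cpl:.1%}")  # -> 95.3%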

Of course, other factors contributed, like how much time we spend optimizing our audiences and building new targeted lists. But what you can see is that a methodical, strategic A/B test can provide the data you need for success.

By Maayan Hoffman
