A/B Testing: You Will Know the Data and the Data Will Set You Free!

A major piece of website optimization and effective inbound marketing is the implementation of A/B testing. This glorious and ingenious process of improvement can make quite the difference in generating conversions. It certainly did with the 2012 Obama campaign. Of the $690 million raised to support the campaign, most came from donations generated through emails that underwent extensive A/B testing. In fact, an optimized variant of a single email subject line made the campaign an estimated $2.6 million. Think about that for a second… One phrase meant generating more money than I’ll probably ever see in my lifetime. Hmmmm… I wonder if maybe we should take this process seriously?

What is this sorcery!?

Simply put, A/B testing is the process of testing different variants of web pages or other digital media to see which perform best at generating conversions. These variants could be anything from the layout of page content to the color of the “subscribe” button. Testing works by showing different visitors different versions of a website, each containing a different variant of a particular aspect of that site. For example, one version might use a serif font while the other uses a sans serif font. The variation (the font, in this case) on the version that generates the most conversions wins! If the results are statistically significant, it can be inferred that the winning variant is what caused the increase in conversions and, therefore, that it should be implemented on the website. These tests generally take place before the variation is written into the site’s code. Once the winning variant is established, the code is modified to display that variant… and that is when the magic happens: more conversions!
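
So what does “statistically significant” actually look like? Here is a minimal sketch in Python (using the statsmodels library) of how a two-variant test might be evaluated. The visitor and conversion counts are invented purely for illustration:

```python
# A minimal sketch of evaluating an A/B test with a two-proportion
# z-test. All visitor and conversion counts here are hypothetical.
from statsmodels.stats.proportion import proportions_ztest

# Variant A (serif font) vs. Variant B (sans serif font)
conversions = [180, 240]    # conversions observed for each variant
visitors = [5000, 5000]     # visitors shown each variant

# Two-sided test: is the difference in conversion rates real,
# or could it plausibly be random noise?
stat, p_value = proportions_ztest(conversions, visitors)

rate_a, rate_b = (c / n for c, n in zip(conversions, visitors))
print(f"Variant A: {rate_a:.2%}  Variant B: {rate_b:.2%}")
print(f"p-value: {p_value:.4f}")

if p_value < 0.05:
    print("Statistically significant -- crown the winner.")
else:
    print("Not significant yet -- keep collecting data.")
```

In plain terms: the smaller the p-value, the less likely it is that the difference between the variants was just luck, and 0.05 is a common (if arbitrary) cutoff.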

The Sorcerer’s Apprentice

So, what can be learned from the masterminds who used A/B testing to successfully optimize the flow of donations for the 2012 Obama campaign? Well, for one, someone could probably come to the conclusion that A/B testing works! But of course there is plenty of other knowledge to be gleaned from their experience. One important lesson is that, although A/B testing works, it takes constant revision and continuous improvement to optimize the conversion rate. The analysts behind the campaign noted that certain variations that performed well initially eventually began to peter out. When novelty wears off, the thinking caps go back on and the creative juices must be churned in order to generate fresh optimizing content. Variations may not lose their luster in every situation, but in order to truly optimize, ways of continuously improving should always be considered.

Important lesson number two: listening to the data, as bizarre as the results may seem, is essential. Amelia Showalter, Director of Digital Analytics for the campaign, noted that in certain cases, ugly variants of the emails produced better conversion results than more aesthetically pleasing ones. In other words, people don’t always make sense. That is why it is important to make decisions based on what the data says and not on the intuition of Joe Blow the Marketing Manager.

Finally, a truly optimized campaign integrates all sources of data, even data collected outside the digital sphere. A/B testing is magical when it comes to optimizing digital media, but the Obama campaign wasn’t successful only because of some fancy emails. The team also used data retrieved door to door, through various surveys, and through other databases they had access to. Using this data, they predicted where it would be best to send Obama, Biden, Michelle, and Jill based on the needs and preferences of the demographics in each part of the United States. So, the moral of the story: listening to the data is effective, but to truly optimize a campaign or promotion, all sources of data need to be integrated and taken into consideration throughout.

Master of Sorcery

Apart from the Obama campaign, there are plenty of examples where A/B testing has worked wonders. Let’s look at one from an analytics company that I have come to greatly admire. comScore is a digital analytics research and consulting firm that offers advice to some of the world’s most prestigious enterprises. I had the privilege of hearing Courtney Steffy, a Senior Client Service Analyst at comScore and a Western alum, speak in my marketing research class last quarter, and ever since, my interest in the data science industry has been growing. comScore conducts a wide variety of research for the benefit of all, but in this case they used A/B testing on their own website to see if it could help them generate more leads. After seeing that demo requests on comScore’s product pages were lower than expected, Ferry Gijzel, the Director of Web Marketing, decided to test three variations of a product page to see which one would generate more leads. The variations differed in the layout and display of a client testimonial. The test revealed that displaying the client’s logo and testimonial vertically on the page, next to the product description, generated significantly more leads. In fact, this variant of the web page led to a 69% increase in the conversion rate! Am I the only one who thinks this is crazy? There you have it, folks… If done right, A/B testing can lead to amazing results.
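
To take some of the mystery out of that 69% figure: the lift is just the relative change between the old and new conversion rates. The case study reports the lift rather than the raw numbers, so the rates below are hypothetical, but the arithmetic is the same:

```python
# Hypothetical numbers -- the case study reports the lift, not the
# underlying rates, so these are invented for illustration.
def conversion_lift(old_rate: float, new_rate: float) -> float:
    """Relative improvement of new_rate over old_rate."""
    return (new_rate - old_rate) / old_rate

baseline = 0.013  # assumed conversion rate of the original page
winner = 0.022    # assumed rate for the vertical-testimonial layout

print(f"Lift: {conversion_lift(baseline, winner):.0%}")  # Lift: 69%
```

Notice how a small absolute change in the rate can translate into an eye-popping relative lift, which is worth remembering when reading case studies.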

Maybe I should include more company logos on my site…

Boom! Let the magic begin!
