Every year there seems to be a new emphasis in inbound marketing. In 2010 we were pushing hard to get found online via SEO, content and social media. This year we moved down the sales funnel a bit, looking for new ways to convert more visitors into leads via landing pages and calls to action. Where are we going in 2012? I believe we'll continue the progression down the funnel with conversion rate optimization (CRO): getting the highest possible conversion rates for sales as well as for leads. One of the key methodologies in CRO is A/B testing.
In an excellent post, "Why A/B Testing Is Essential To Your Marketing", Nathan Yerian describes the value of A/B testing. "The average website converts less than one percent of its traffic into tangible leads, so marketers must focus on critical details like button colors, email subjects and landing page visuals to increase conversions." By testing different versions of a call-to-action graphic, different layouts and content on a landing page or different required fields in a form, inbound marketers can determine with precision which approaches convert with the highest rates. What are some best practices for setting up and running A/B tests?
A simple A/B test is an either/or switch: change a single element between versions, such as the headline on a landing page or graphic, the primary color of a button or text, the placement of a form (left or right), or the inclusion of a required field like a phone number. If you change several things between test versions, it will be difficult to assess which factor had the largest impact on conversion rate. Yerian's example illustrates this: "For instance, imagine an experiment that simultaneously tests advertising copy and color schemes. Group A clearly demonstrates higher conversion rates than group B; however, the results are less than straightforward. What caused the outcome — the colors or advertisement copy? From a quantitative perspective, the answer to that question is inconclusive." Strategy is everything in designing a good A/B test, so give it some thought before you start.
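To make the single-variable idea concrete, here's a minimal sketch in TypeScript of how two test versions might be defined. The variant names and headline copy are hypothetical; the point is that only the headline differs between them.

```typescript
// A single-variable test: the two versions are identical except for the
// headline, so any difference in conversion rate points to that one change.
interface Variant {
  name: "A" | "B";
  headline: string; // the single element under test
}

const headlineTest: Variant[] = [
  { name: "A", headline: "Download Our Free Lead Generation Guide" },
  { name: "B", headline: "Get the Free Lead Generation Guide Now" },
];
```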
You can create your own A/B tests with any HTML editor, but how will you serve and analyze them? You need a technology that presents your different versions to visitors at random so the test isn't biased, and then lets you study the results in a side-by-side comparison between versions. Your toolkit should make it easy to create, set up, run and analyze A/B tests; without that easy workflow, you probably won't take the time to run them. Yerian lists Google Website Optimizer, Adobe Test & Target (Omniture), HubSpot and Optimizely as good candidates, and I agree with his choices. We use the Enterprise tools from HubSpot, since they integrate well with the entire inbound marketing platform.
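Under the hood, these tools typically assign each visitor to a version at random but keep that assignment stable on repeat visits, so one person never sees both versions. Here's a rough sketch of that idea, assuming a stable visitor ID is available (say, from a cookie); the function and parameter names are illustrative, not any vendor's actual API.

```typescript
// Deterministically bucket a visitor into version "A" or "B" by hashing a
// stable visitor ID: effectively random across visitors, but consistent for
// any one visitor across visits.
function bucketVisitor(visitorId: string, testName: string): "A" | "B" {
  const key = `${testName}:${visitorId}`;
  let hash = 0;
  for (let i = 0; i < key.length; i++) {
    hash = (hash * 31 + key.charCodeAt(i)) | 0; // simple 32-bit rolling hash
  }
  // Use the low bit of the hash for a 50/50 split.
  return (hash & 1) === 0 ? "A" : "B";
}

// The same visitor gets the same version every time:
console.log(bucketVisitor("visitor-123", "landing-headline"));
```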
You can start to draw conclusions right away as you run your A/B tests, but should you? Beyond the standard requirement that a test reach statistical significance before you act on it, you also want to be sure your results aren't skewed by outside factors such as seasonal traffic swings, shifts in traffic sources or day-of-week effects.
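When you do check significance, the standard approach for comparing two conversion rates is a two-proportion z-test. Here's a minimal sketch; the conversion counts in the example are made up.

```typescript
// Two-proportion z-test: is the difference between two conversion rates
// larger than random chance would explain?
function twoProportionZ(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA;
  const pB = convB / nB;
  const pPool = (convA + convB) / (nA + nB); // pooled conversion rate
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / nA + 1 / nB));
  return (pA - pB) / se; // |z| > 1.96 means significant at the 95% level
}

// Hypothetical results: 120 conversions of 6,000 visits vs. 150 of 6,000.
const z = twoProportionZ(120, 6000, 150, 6000);
console.log(`z = ${z.toFixed(2)}, significant at 95%: ${Math.abs(z) > 1.96}`);
```

Notice that in this made-up example, version B converts 25 percent better in relative terms (2.5% vs. 2.0%), yet the difference still falls short of significance at the 95% level. That's exactly why drawing conclusions too early is risky.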
Run your tests for several weeks, if not months, depending on your traffic. Don't tweak a test midstream, or you'll invalidate the results. Watch the trends and reap the rewards. Once all of your mid-funnel lead generation opportunities are optimized, you can expect not only more sales leads but also more conversions from leads to customers through targeted strategy and analytics.