Multivariate and A/B testing have been around for many years. The process is straightforward: make two or more variations of a webpage and randomly rotate them to visitors over time. Through analytics tracking, webmasters can be fairly certain which version of the page produces the most desirable outcome.
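As an illustration of the rotation step, here is a minimal sketch of how a visitor might be assigned to one of two variations, assuming hash-based bucketing on a visitor ID (the actual assignment method used by any given testing tool may differ):

import hashlib

def assign_variation(visitor_id: str) -> str:
    # Hash the visitor ID so the same visitor always sees the same variation.
    bucket = int(hashlib.md5(visitor_id.encode()).hexdigest(), 16) % 100
    return "A" if bucket < 50 else "B"  # 50/50 split between the two pages

print(assign_variation("visitor-12345"))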
The lessons learned through these tests over time allow inbound marketers to deploy the most desirable attributes on new landing pages with the intent of maximizing conversion rates. Sometimes the results offer up surprises by debunking what user interface (UI) and user experience (UX) experts preach. This is good, because it helps break up some of the inbound marketing groupthink that can be pervasive. Below are the tests we ran for Kuno, the lessons learned and the takeaways moving forward.
SEO Cheat Sheet
The SEO landing page test focused on four main areas: testimonial quotes, social sharing buttons, the call-to-action arrow and the title of the form. Aside from those four areas, the content remained the same on both landing pages. The test results are as follows:
                      Variation A    Variation B
Unique Visitors:          377            363
Conversions:              222            187
Visit-to-Lead Ratio:      58.9%          51.5%
With 95% confidence, the best-performing landing page was Variation A, which converted 14% better than Variation B.
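For readers who want to reproduce the confidence figure, here is a minimal sketch in Python, assuming a one-sided two-proportion z-test on the numbers above (the post does not say which statistical test its tool uses, so treat this only as an approximation):

from math import sqrt, erf

def ab_test_result(conv_a, visits_a, conv_b, visits_b):
    # Conversion rates for each variation.
    rate_a, rate_b = conv_a / visits_a, conv_b / visits_b
    # Pooled standard error under the null hypothesis of equal rates.
    pooled = (conv_a + conv_b) / (visits_a + visits_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (rate_a - rate_b) / std_err
    # One-sided p-value ("A beats B") from the standard normal CDF.
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))
    lift = (rate_a - rate_b) / rate_b
    return lift, p_value

lift, p = ab_test_result(222, 377, 187, 363)
print(f"lift = {lift:.1%}, p = {p:.3f}")  # roughly 14% lift, p ≈ 0.02 (< 0.05)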
Facebook Cheat Sheet
The Facebook landing page test focused on five main areas: testimonial quotes, social sharing buttons, the call-to-action arrow, the title of the form and the "phone" form field. The rest of the content remained the same on both landing pages. See the test results below:
                      Variation A    Variation B
Unique Visitors:          259            261
Conversions:              134            113
Visit-to-Lead Ratio:      51.7%          43.3%
With 95% confidence, the best-performing landing page was again Variation A, which converted 19% better than Variation B.
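Running the same sketch on the Facebook test numbers (again assuming a one-sided z-test) gives a similar picture:

lift, p = ab_test_result(134, 259, 113, 261)
print(f"lift = {lift:.1%}, p = {p:.3f}")  # roughly 19% lift, p ≈ 0.03 (< 0.05)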
Because each test was set up as an A/B test with multiple variables changed at once, it's impossible to declare any single page element a winner or loser with complete confidence. However, the results are still valuable: they provide a good list of variables that need further testing and offer general guidelines for improving landing page elements moving forward.
Both tests suggest that pointing website visitors to a form with an arrow does not improve conversion rates. The results also suggest that testimonial quotes on a landing page may help build trust and thereby improve conversion rates, and that social media sharing buttons may actually decrease a landing page's ability to convert. Additionally, the more descriptive a form title is, the more likely the form is to be filled out. Lastly, the idea that requiring people to include a phone number on a form reduces their desire to fill it out is debunked in the Facebook test.
While the conclusions above cannot be drawn with 100% certainty, they do point us in the right direction for future testing. For more multivariate and A/B testing lessons, we suggest subscribing to WhichTestWon.com and starting your own testing program today with HubSpot Enterprise inbound marketing software.
Join us for an exciting new webinar on Tuesday, January 31st at 12 PM EST (9 AM PST), where we'll discuss why traditional SEO campaigns aren't as important as they once were for web visibility and why they are quickly being replaced by inbound marketing.