Three Simple A/B Tests That Produced Surprising Results
With A/B and multivariate testing now fully on the radar of business owners and CMOs, it’s becoming clear that test results can have a measurable impact on eCommerce optimization, conversion, and personalization.
A/B testing lets you change almost anything on your website and observe the outcome before you make anything permanent. As we’ve mentioned before, your A/B tests should be driven by analysis and data. Otherwise, you’ll just be testing for the sake of testing, which can lead to unhelpful, even harmful, conclusions.
What happens when you do A/B testing correctly? Plenty of insights into conversion, average order value, and add-to-cart rates, among many other metrics.
Here are three A/B tests we performed for different clients’ websites and some of the improvements they achieved. You can conduct these tests using a variety of A/B testing platforms without technical expertise or development support.
Backstory: An eCommerce home goods retailer displayed every call-to-action button on their website in the same color, including the add-to-cart buttons on product pages. Not only that, but the buttons matched the retailer’s brand color, which appeared everywhere on the site.
As we researched the shopping behavior of customers, we noticed that many of them were arriving at product pages but not adding items to their carts as often as we expected.
Hypothesis: We determined that either 1) visitors were arriving at the wrong product page, causing them to abandon the site or 2) they simply didn’t notice the add-to-cart button (especially on mobile devices).
Test Conducted: While there are many tests we could have conducted on button color, messaging, and size, we wanted the call-to-action button to stand out as much as possible without getting in the way of the shopping experience.
We chose a simple color test and changed the call-to-action buttons on product pages to red, which we believed would stand out in direct contrast to the client’s branding.
Outcome: That simple color change produced a statistically significant 6.5 percent increase in add-to-cart rate and a 9 percent increase in overall conversion rate. The biggest conversion rate increase came from mobile users, at 13.5 percent.
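When we say a lift held up "with confidence," we mean statistical significance. As a rough illustration only (the traffic numbers below are hypothetical, not this client's data), a conversion-rate lift like this can be sanity-checked with a two-proportion z-test:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B convert better than A?"""
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    # Two-sided p-value from the normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: 10,000 sessions per arm, 8% vs. 8.8% conversion
z, p = two_proportion_z(conv_a=800, n_a=10000, conv_b=880, n_b=10000)
```

Most A/B testing platforms run a calculation like this for you; a p-value below 0.05 (roughly |z| > 1.96) is the usual bar before trusting a lift.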
Backstory: An eCommerce sporting goods retailer had high cart abandonment rates, especially considering their high average order value.
Upon deeper examination, we discovered that shoppers were stuck in a constant loop on the shipping page because they weren’t correctly filling out fields that the system required.
Trouble is, customers didn’t perceive these fields (like checking a box to confirm being at least 18 and agreeing to the retailer’s terms and conditions) as requirements for completing the shipping step.
As a result, they overlooked these fields entirely, especially on mobile devices, which caused the shipping page to refresh repeatedly and throw error messages when they tried to complete their orders.
Hypothesis: We determined that removing unnecessary fields would 1) advance more customers through the checkout process and 2) cause cart abandonment rates to plummet while increasing overall conversion rates, especially for new customers.
Test Conducted: We performed three sequential A/B tests, each adjusting one field on the checkout page:
- Removing the “I agree I’m at least 18 years old” checkbox
- Removing the “I agree to the terms and conditions” checkbox
- Pre-checking the “use same address for billing” box
Another option was a more complex multivariate test, but we preferred quick individual results over waiting for the volume of customer data required to test all three changes at once.
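The trade-off here comes down to sample size: a multivariate test of all three fields splits traffic across eight variant combinations (2 × 2 × 2) instead of two arms, so each combination accumulates data far more slowly. A rough normal-approximation sketch of the per-arm traffic requirement, using hypothetical numbers rather than this client’s figures:

```python
import math

def sample_size_per_arm(base_rate, rel_lift):
    """Rough sessions needed per arm to detect a relative conversion
    lift at 95% confidence with 80% power (normal approximation)."""
    p1 = base_rate
    p2 = base_rate * (1 + rel_lift)
    z_alpha = 1.96  # two-sided alpha = 0.05
    z_beta = 0.84   # power = 0.80
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p2 - p1) ** 2)
    return math.ceil(n)

# Hypothetical 8% baseline conversion and a 10% relative lift:
# an A/B test needs two arms of this size; a 2x2x2 multivariate
# test needs eight arms, so roughly four times the total traffic.
n = sample_size_per_arm(base_rate=0.08, rel_lift=0.10)
```

For a site without heavy traffic, that multiple is exactly why three quick sequential A/B tests can beat one slow multivariate test.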
Outcome: Each test showed an improvement in conversion rate and a drop in cart abandonment rate. After combining all three winning changes into one experience, we saw a 7.8 percent increase in overall conversion rate and a 5.8 percent decline in cart abandonment rate.
Backstory: A home goods client was experiencing low conversion rates for customers who used product categories to navigate the website. Considering the client’s AOV and industry, we had a hunch their low rates were somehow related to their product categories.
After further analysis, we found that while some customers did reach product pages from the main navigation just fine (and ended up converting), a fair number of them never found product pages at all.
Hypothesis: We determined that 1) changing the order of product categories on the navigation bar would help customers find categories more easily and 2) placing the most popular categories on the far left of the navigation bar (where customers normally look when beginning to shop) would improve add-to-cart and conversion rates.
Test Conducted: We moved the two most popular categories from the right side of the navigation bar to the far left, where customers’ eyes naturally gravitate.
Outcome: While we saw some movement with this simple navigation test, it didn’t produce the results we’d hoped for.
Conversion rates for new visitors remained flat, while overall conversion rates decreased by 1.8 percent, add-to-cart rates by 1 percent, and average revenue per session by 3 percent. We turned the test off completely once those declines reached statistical confidence.
It became clear that while the retailer’s original navigation wasn’t optimal from our viewpoint, it was better left alone for their customer base.
Not every A/B test you perform will be a home run, but with data-driven testing you can learn more about your customers’ behavior even when tests produce negative results. You can improve the shopping experience and your bottom line at the same time.
Patrick Cole is a digital strategy consultant and a core member of the insights and optimizations team at LYONSCG. He spends the bulk of his day mining for revenue-driving insights in a wide variety of digital platforms. During his tenure, Patrick has honed his ability to answer complex questions through a scientific approach. He can be found by the melodic and determined sound of his typing.