5 Web Page Testing Tips: Avoid Costly Mistakes
A/B testing, or comparing different versions of a web page to see which performs best, is a crucial part of running a website. Getting it right takes the guesswork out of deciding what will work best for your customers and, in turn, can increase your sales; getting it wrong can lead to disastrous results. To save you from conducting inaccurate or useless tests, here are five A/B testing mistakes to avoid, from the experts at LYONSCG.
Mistake 1: Not running A/B tests with your company’s messages, graphical elements and mailing list
“Many marketers will read survey results or case studies and assume those results apply to them as well,” says Steve Susina, LYONSCG Marketing Director. “But just because Company X’s emails performed best with an 8 a.m. distribution, or their landing pages convert better with a green call-to-action button doesn’t mean yours will too.” With built-in capabilities in marketing automation/email marketing platforms or the availability of external A/B testing platforms, firms have no excuse but to test everything.
Mistake 2: Following a hunch
“Don’t test things that won’t pay off!” says Jess Jenkins, Digital Analyst at LYONSCG. Collect user feedback through surveys and analyze customer behavior funnels to catch points of confusion or drop-off. Conduct your own audit of the conversion funnel and take note of your own responses to content placement, messaging, and creative. “If it confuses you, chances are it’s confusing a customer,” adds Jenkins. “Avoid tests that don’t speak to your goals or your business – will a new call-to-action button color really add value to your strategy?”
Mistake 3: Failing to test assumptions
“Challenge yourself to take a step back and test the most basic of your assumptions,” says Dan Hutmacher, Senior Digital Consultant at LYONSCG. For example, an eCommerce Manager may test a $50 vs. a $100 threshold for free shipping, but may never test whether or not current customers even value free shipping at all. Similarly, it is easy to test the color of a button without considering its size, shape, or location. Relying on third-party research is good for orientation, but every business is unique. “Think critically and make sure you are testing foundations before accents.”
Mistake 4: Testing what you can’t deliver
“Conducting an A/B test on promotions or top-notch user experiences can give you eye-opening results with lots of potential,” Jenkins notes, “but those insights are useless if you can’t act on what you’ve learned.” Make sure you’re testing items that are actionable. For example, video content might be your ticket to better engagement, but do you have the resources and a plan to tap into that potential? Ensure that you can deliver on what you learn from your tests.
Mistake 5: Not setting aside enough time for your testing
“One instance or one day is not long enough for reliable results,” says Hutmacher. With deadlines, quotas, and pressure to raise conversion and revenue, it’s tempting to call results prematurely. The proper timeline varies with the type of test, but you should run it long enough to rule out chance as an explanation. Like any good test, your results need to be repeatable, so a sufficient run time is as critical as comparable test groups.
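One common way to check whether an observed lift could plausibly be chance is a two-proportion z-test on conversion rates. The sketch below is a minimal illustration using hypothetical traffic numbers (not figures from LYONSCG); a real test program would also fix the sample size and run length in advance rather than peeking at the p-value daily.

```python
import math

def ab_test_significance(visitors_a, conversions_a, visitors_b, conversions_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis of no difference
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    # Standard error of the difference in proportions under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: 2,000 visitors per variant
z, p = ab_test_significance(2000, 100, 2000, 130)
print(f"z = {z:.2f}, p = {p:.3f}")
```

If the p-value is above your threshold (0.05 is a common, if arbitrary, choice), the honest conclusion is "keep running the test," not "the variants are equal."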
Always Be Testing
Executing a strong A/B testing strategy to gather empirical data will help you determine which marketing strategies work best. By running controlled tests and avoiding the mistakes above, you can figure out exactly what works for your company and your product, pleasing your customers and improving your sales along the way.