Avoiding Data-Driven Disasters
Data. Is. Beautiful.
I spend the bulk of my day in one data platform or another. For those of us who work with data, uncovering a subtle trend with a substantial strategic impact on the business feels euphoric. There is no greater feeling for data-nauts than translating raw information into valuable data-driven insights.
These joyous discoveries are precisely what makes data so dangerous. Premature action on a data point creates a dangerous situation where design decisions are made without validated insight. To help you avoid this pitfall, I’ve outlined some common mistakes that can quickly turn into larger data-driven disasters. When making data-driven decisions, specifically in regard to data-driven design, avoid these at all costs.
1. Lost In Translation
Metrics are constantly misused and misinterpreted. Even if a metric seems straightforward, taking the time to learn how the platform at hand defines the metric can provide valuable insight. Next, consider how that metric would apply to your audience’s behavior. For example, with site performance metrics, before you benchmark your data, think about how your site differs from other sites. How will this differentiation be reflected in the benchmark data?
If you’re analyzing a competitor’s conversion rate, and you know they drive a large volume of less-qualified traffic from banner ads, then you should expect your conversion rate to be higher regardless of the health of your site experience. Even though your competitor has a large amount of traffic, the determining factor for conversion is quality, not quantity. The raw traffic numbers don’t tell the whole story, and making a decision based on this initial translation could lead you in the wrong direction.
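The quality-over-quantity point comes down to simple arithmetic. Here is a minimal sketch with invented figures (both sites and all numbers are hypothetical, purely to illustrate the comparison):

```python
def conversion_rate(conversions, visits):
    """Conversion rate is simply conversions divided by visits."""
    return conversions / visits

# Competitor: a large volume of less-qualified banner-ad traffic.
# All figures below are invented for illustration.
competitor_rate = conversion_rate(conversions=1_000, visits=100_000)

# You: far fewer visits, but better-qualified traffic.
your_rate = conversion_rate(conversions=300, visits=10_000)

# Despite 10x the traffic, the competitor converts at a third of your rate.
print(f"Competitor: {competitor_rate:.1%}, You: {your_rate:.1%}")
```

Benchmarking your 3% against their 1% only means something once you know *why* the rates differ, which is exactly the translation step described above.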
2. Metric Myopia
Never look at a metric in a vacuum. With so many variables feeding into design metrics, it takes multiple data perspectives to determine why changes occur. If your conversion rate changes, you’ll need to determine the degree to which three elements, covering a handful of metrics, contributed: traffic composition (did I drive less qualified traffic?), buyer mentality (did seasonality or promotion/urgency messaging change?), and buyer journey (did my site experience, or the way visitors use the site, change?).
Another recommendation to broaden your data horizons is to avoid “super metrics”. These statistics combine a handful of metrics into one index. Unless you are going to analyze the individual metrics that feed into them, these aggregated statistics can lead you astray.
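To see how an aggregate can hide real movement, consider this sketch of a hypothetical “engagement index” (the metric names, weights, and numbers are all invented for illustration, not any platform’s actual formula):

```python
# Hypothetical "super metric": a weighted blend of component metrics.
# The component names and weights are invented for illustration.
def engagement_index(pages_per_visit, avg_minutes, return_rate):
    return 0.4 * pages_per_visit + 0.4 * avg_minutes + 2.0 * return_rate

before = engagement_index(pages_per_visit=4.0, avg_minutes=3.0, return_rate=0.30)
after = engagement_index(pages_per_visit=5.5, avg_minutes=1.5, return_rate=0.30)

# The index barely moves (both round to 3.4), yet visitors now click through
# more pages while spending half as long on each visit -- two opposite shifts
# that cancel each other out inside the aggregate.
print(round(before, 2), round(after, 2))
```

A flat index here would suggest nothing changed, when in fact the visit behavior changed substantially in two offsetting directions. That is the trap: the blend is only as informative as your willingness to decompose it.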
For example, a client reached out to me for conversion rate optimization solutions. They had changed their domain structure during a redesign a few months back, and afterward their conversion rate shrank rather dramatically. When I asked why they had performed the redesign in the first place, the rationale was to fix high bounce rates.
After looking at their historical data, it became clear that the high bounce rates resulted from a high proportion of visits landing directly on relevant product detail pages. Most websites drive initial visits to broader pages, then point visitors down a path to relevant content. This client’s bounce rate challenge arose because most visitors landed directly on the product page they wanted; after learning more, those visitors either added the product to their cart or left the site.
While the initial redesign improved bounce rates by making these visits path through more pages, it damaged conversion rates by adding unnecessary clicks to the shopping experience. Taking the time to understand why bounce rates were high would have saved this client considerable time and money.
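The diagnosis above amounts to segmenting bounce rate by landing-page type rather than reading it as a single site-wide number. A minimal sketch of that segmentation (the visit log, page types, and counts here are all hypothetical):

```python
from collections import defaultdict

# Hypothetical visit log: (landing_page_type, bounced) pairs,
# invented to mirror the client scenario described above.
visits = [
    ("product_detail", True), ("product_detail", True),
    ("product_detail", False), ("product_detail", True),
    ("home", False), ("home", False),
    ("category", False),
]

# page_type -> [bounces, total visits]
counts = defaultdict(lambda: [0, 0])
for page_type, bounced in visits:
    counts[page_type][0] += bounced
    counts[page_type][1] += 1

for page_type, (bounces, total) in counts.items():
    print(f"{page_type}: {bounces / total:.0%} bounce rate over {total} visits")
```

A breakdown like this would have shown that the “high” site-wide bounce rate was concentrated on product detail pages, where a bounce can simply mean the visitor found what they came for and decided against it, not that the experience was broken.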
3. The End of Intuition
The phrase “data-driven design” has turned into quite the buzzword. While it makes for great alliteration, its popularity may be leading to data-driven mistakes. I say this because the phrase implies that data should be the sole input into design decision-making.
If you only reference your data when making design decisions, your insights will devolve into a feedback loop around your current experience. Data can tell you what is broken. Data can tell you what works. Data can’t tell you what to do next. In essence, data illuminates problems much more easily than it does opportunities.
Even if you complement quantitative data with broad market research, you still remain in the aforementioned feedback loop of what your customers currently think. As Steve Jobs famously said, “It’s really hard to design products by focus groups. A lot of times, people don’t know what they want until you show it to them.” Making decisions based solely on available data won’t get you any closer to showing your customers what they’ve been looking for.
How, then, do we show customers what they want? We need data-informed design. Ideally, cost permitting, that data would include market research to fill the qualitative gaps, providing the why behind the what. With a holistic data set informing decisions, you free intuition to drive more aggressive strategies with substantially less risk. The longer you’re in a given field, the stronger your intuition. Don’t ignore your gut just because a dashboard might indicate otherwise. Data should help clarify decisions, not make them for you.
Data is a phenomenal tool for making informed strategic design decisions. As we move through the age of Big Data, all too often data makes decisions instead of informing them. When data is mistranslated, used in isolation, or substituted for intuition, dire strategic errors can follow. Keep these data traps in mind as you move forward, and make sure your data is working for you.
James McDonald is a senior digital consultant on the LYONSCG Digital Marketing team. As the Analytics team lead, James focuses on aggregating and mining data for insights to guide digital strategies and optimization. He has over five years of experience in digital analytics, strategy, and marketing, and will gladly join your trivia team if you ever need an additional player.