When building a new product or service, how do you know that it is any good? Is it better than what it replaces? The fairly obvious answer is: you measure the effect of the change and see whether it is positive. This is data-driven design!
Or, at least, it's design validated by data. Nearly all of the articles on data-driven design focus on what is really hypothesis-driven design. We, as the designers, craft a hypothesis that we believe will improve things, then create an experiment to test it, choosing the best outcome as the final design.
What they tend to skip over is the identification of issues and opportunities in the first place - the "data-driven" bit. Products and services should have data capture built in from the outset, so that issues and opportunities for improvement can be identified as a matter of course. The design is then truly driven by the data.
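As a concrete (and entirely hypothetical) sketch of what "data capture built in from the outset" might look like, the `track` helper and event names below are made up for illustration - in a real product the record would be shipped to an analytics pipeline rather than returned:

```python
import json
import time

def track(event: str, **props) -> dict:
    """Record a product event with a timestamp and arbitrary properties.

    In a real product this record would be queued and sent to an
    analytics pipeline; here it is returned (and printed) so the shape
    of the captured data is visible.
    """
    record = {"event": event, "ts": time.time(), **props}
    print(json.dumps(record))
    return record

# Instrument the key moments of the user journey from day one,
# so later questions can be answered from data you already have.
track("signup_completed", plan="free", referrer="newsletter")
track("first_report_exported", format="pdf")
```

The point is not the mechanics but the habit: if events like these are captured from launch, the "where should we improve next?" question becomes answerable from data rather than guesswork.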
Data-driven design is at its best when a data-driven roadmap is combined with design validation through data.
Not so fast...
While data-driven design sounds great, it is not without its pitfalls.
The proponents of A/B testing will tell you to narrow your hypotheses down to focused areas and optimise around smaller cohorts of your users. Targeting a particular user group lets you improve outcomes for that group without detriment to others.
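For concreteness, here is a minimal sketch of the statistics behind a typical A/B test: a two-proportion z-test on conversion rates between a control and a variant. The function name and the numbers are illustrative only:

```python
from math import sqrt

def ab_test_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test: how far apart are the conversion rates
    of variants A and B, in standard errors?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# 2.0% vs 2.6% conversion over 10,000 users each
z = ab_test_z(200, 10_000, 260, 10_000)
significant = abs(z) > 1.96  # two-tailed, ~95% confidence
```

With these made-up numbers the test comes out significant - which is exactly the trap the next section describes: a statistically significant lift on one narrow metric says nothing about whether it was the right metric.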
However, it is this test specificity that can cause issues. Optimising around a single metric can easily mean optimising for the wrong metric. For example, a product may be great at generating new leads but generate the wrong type of leads, or weaken the brand overall, delivering only short-term gain.
I think of this as the Turkey effect: at some point Turkey optimised its tourism for cheap beach holidays when, in fact, it has all the ingredients for an up-market (and much more profitable) luxury tourism sector. Unfortunately, it now seems stuck in the former.
Optimising for the wrong metric is not the only issue.
It's also possible to become stuck micro-optimising around the same design when really the design should be replaced in its entirety. I liken this to the classic machine learning problem in which a learning algorithm gets stuck in a local minimum when the global minimum would be so much better.
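The analogy can be made concrete with a toy example (the loss surface below is invented purely for illustration): gradient descent lands in whichever valley it starts near, just as iterative micro-optimisation refines whichever design you started with:

```python
def loss(x: float) -> float:
    # Invented double-well surface: a shallow minimum near x = +0.96
    # and a deeper (global) minimum near x = -1.04.
    return (x**2 - 1) ** 2 + 0.3 * x

def grad(x: float) -> float:
    # Analytic derivative of the loss above
    return 4 * x * (x**2 - 1) + 0.3

def descend(x: float, lr: float = 0.01, steps: int = 2000) -> float:
    """Plain gradient descent: repeatedly step downhill from x."""
    for _ in range(steps):
        x -= lr * grad(x)
    return x

stuck = descend(1.0)   # starts near the shallow well -> local minimum
best = descend(-1.0)   # starts near the deep well    -> global minimum
assert loss(best) < loss(stuck)
```

Both runs converge - each valley genuinely is a minimum - but only one is the best available. Incremental A/B testing behaves the same way: every iteration improves on the last, yet a fundamentally different design may never be discovered.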
Another thing to be wary of in data-driven design is the trap of confirmation bias. It's all too easy, especially in smaller teams, to pick metrics that support our hypotheses instead of objectively testing them.
Where to now then?
Data-driven design should be part of the designer's toolkit: a way to discover areas for improvement and then to test that new designs really are improvements.
In "Designing with Data"1 the authors advocate three levels of designing with data:
- Data-driven design - where the data is the only source of truth
- Data-informed design - where data informs decisions but does not dictate them
- Data-aware design - where designers know the data but feel free to ignore it
Each approach suits different situations. Data-driven design makes sense in an optimisation scenario, where you're trying to eke out as much benefit as you can from a design.
In my opinion, though, data-aware design makes the most sense in the majority of situations, especially since data-driven design is simply a subset of it. Talented designers should understand how to use data, but also be able to bring their experience to bear and apply both to the specific context they find themselves in.
Indeed, the ability to line up design with strategy is key to delivering sustainable growth. Aligning strategy and design through shared KPIs ensures that the two do not diverge. This is most easily achieved where the goals of the business match the goals of the users. The counter-example makes this clear: most of the big internet companies rely on advertising to generate revenue from their products. There, user and business goals are not aligned, so the designer is trying to optimise for two, usually competing, things.
1. Rochelle King, Elizabeth F. Churchill, and Caitlin Tan, Designing with Data, ISBN 1449334830