This is the age of data, and every company today takes pride in being "data-driven". They measure and track every interaction of their users, and they devise complex metrics to explain user behaviour and the impact it has on their business.
It is undoubtedly great to measure and track everything that a user does. After all, without observing and understanding what the users are doing, it is impossible to make meaningful improvements to their experience.
However, the incredible amounts of data generated in the process can turn out to be a distraction if the noise isn't properly filtered out.
There have been many experiments showing that humans (and animals) love to attach explanations to what they observe. This is how they manage to navigate the world. And this is how superstitions arise. The most famous experiment in this genre is B. F. Skinner's, where pigeons were given food at random times, which led to each of the pigeons picking up superstitions. For example, one would rotate its head in a particular way when it wanted food, as it had experienced such an action leading to food being delivered before.
Superstitions are essentially behaviours that we expect to lead to some consequences where we assume a causal relationship when there isn't actually one. Thus, we knock on wood when someone compliments us, or stop in our tracks when a black cat crosses our path.
Dealing with data leads to a lot of superstitions as well.
I've seen companies draw conclusions by explaining observed outcomes after the fact, then treating those explanations as 'validated with data'.
We only validate something with data when we hypothesize first, design an experiment to test that hypothesis, and then collect and analyze the data coming out of that experiment. When we explain outcomes after the fact with already-collected data, we are merely generating hypotheses.
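To make the contrast concrete, here is a minimal sketch of what the hypothesize-first loop looks like once the experiment has run: a two-proportion z-test comparing conversion between a control and a variant. All numbers and names here are hypothetical, and the test itself is just one common choice for this kind of A/B comparison, not a prescription.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a/n_a: conversions and sample size in the control group (A).
    conv_b/n_b: conversions and sample size in the variant group (B).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical experiment: does a new onboarding flow (B) lift signups?
# The hypothesis was written down BEFORE these numbers were collected.
z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(f"z = {z:.2f}, p = {p:.3f}")
```

The point is the ordering, not the arithmetic: the same calculation run on a metric you noticed moving after the fact tells you nothing about causation, because you picked that metric precisely because it moved.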
And yet, I've seen smart people confusing the two, making presentations and defining their future product strategy this way.
In product management, and in life, if we fail to cut through the noise and scientifically arrive at causal relationships, we may simply be reacting to that noise while believing we are data-driven, when we are only data-driven in air quotes.