What makes great data analysis?

I had this excellent question this week from a subscriber:

“What should someone on data insights be looking for? What should we aim to do? How do we make sense of little data? How and why should we clean our data before making any observations? … what makes a great data analysis?”

In fact, it is several related questions, so this week’s article will begin unpacking them so we can consider what is important, and share some experience to help their (and your) thinking on the subject.

We start with definitions.

Part of the issue we have as data professionals is one of bad definitions and lots of jargon. This starts with the job title and the outcome can be poorly defined roles and responsibilities. If you’re an airline pilot your role is clear. If your role involves data then are you an analyst, scientist, engineer, programmer, statistician, visual designer? All of them? None of them?

Are you even a “data professional” when your main job title is something else but you use data all the time (e.g. mechanical engineer, academic researcher, sports coach etc.)?

In larger organisations they’ve somewhat solved the issue. They’ve developed their own norms, languages and hierarchies so everyone has at least some clarity. These are, however, only partly portable to other organisations and industries, which causes further confusion. In smaller organisations, with smaller teams or maybe just a single data professional, what is expected can become an even bigger blur – a case of we’ll know it when we see it.

So the question “What is great analysis?” often gets either “it depends” or, worse, “It is obvious, isn’t it?” – which, by the way, I find it rarely is.

Perceptions and dealing with uncertainty

One underlying issue I’ve found is in our own perceptions, and in our expectations of others less technically comfortable than ourselves.

As “numbers” people we get used to a level of certainty with our work, i.e. 2+2 does equal 4. And even when we’re not sure, we take comfort from statistical significance instead (as at least we have a number).

Uncertainty and subjectivity are therefore unsettling for us. And unfortunately this can mean “great analysis” for one person is not the same as “great analysis” for someone else. If someone isn’t happy then what do you do? If you feel threatened by this then it is too easy to blame them, when perhaps we should be looking at our own approaches first.

Of course, we want to get everything we can from the data – it is a challenge, it is solving a puzzle that we know has an answer. If we can do this then surely that is great? No?

We want to do a great job but need to have others recognise this, irrespective of their subjective opinions.

Would it not be better if we had more assurance of what great is?

Say, a way to know that you were definitely doing great work even before you’d finished and presented the results?

It is not straightforward, and to answer each question fully would take longer than I suspect you’d be comfortable with in this post, but we will make a start.

A flavour of Critical Thinking

Try starting not with the data or technology, but by applying what some people call critical thinking.

Of the ancient Greek philosopher Socrates it has been written:

“He established the importance of asking deep questions that probe profoundly into thinking before we accept ideas as worthy of belief.”

In other words, apply the same rigour to your problem definitions that you do to your problem solutions.

By applying critical thinking to your organisation’s context, determine what would be valuable to know, and when, and then consider how you can use data, analysis and models to help you.

Too often it is “What can we do with this data?”, rather than, “What is really important to know for the business?”

Intelligent Naive Questions

To do this in practice, I recommend taking a consultative approach. Consultants, love them or loathe them, are really very good at this. We can learn from their practice.

I personally call this asking “Intelligent Naive” questions.

  • Do you really understand the core of how the organisation works?
  • Can you clearly define what success looks like for the organisation?
  • Have you any idea what sensitivities go into this?
  • What happens if we don’t do this?

What It Takes To Win

To give you an example from my day job: In professional sports we talk about a model of “What It Takes To Win” (WITTW).

You can see an example of the British Judo team’s WITTW model in the image below.

There are clearly flippant answers (beat the opposition?!) but you need to step down a level, and then break that down further. This then links everything you’re doing together logically.

For example, what does it take to beat the opposition? In this sport they need proficiency in fundamental technical and tactical skills. They need to be able to apply this in competition. They need to meet a specific physical profile. They need good nutrition and body composition. They also need to be strong in the mind. They’ve then broken each of these legs down into specifics.
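
One way to picture that structure is as a simple tree, with the goal at the top, the second-order legs beneath it, and measurable specifics under each leg. A minimal sketch – the legs and specifics below paraphrase the judo example above, not the team’s actual model:

```python
# A sketch of a WITTW model as a nested structure: the goal at the top,
# second-order "legs" beneath it, and measurable specifics under each leg.
# Entries paraphrase the judo example in the text, not an official model.
WITTW = {
    "goal": "Beat the opposition",
    "legs": {
        "technical_tactical": ["fundamental skills", "applied in competition"],
        "physical": ["meets the required physical profile"],
        "lifestyle": ["nutrition", "body composition"],
        "psychological": ["strong in the mind"],
    },
}

# Everything you measure should trace back up to one of these legs.
for leg, specifics in WITTW["legs"].items():
    print(leg, "->", ", ".join(specifics))
```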

Establishing these second-order answers is where the gold is to be found.

This is where you are combining unique experience and knowledge into something actionable and clear.

It is these second-order answers you want to focus your own Intelligent Naive questioning around. Once these are clear, you’ve a robust basis for your data and reporting to inform and support.

Deliver on this and there is no question your work will be described as great analysis – even if it is delivering news people don’t want to hear … 

In sport, once we understand (and agree on this!) we can plan all our activities accordingly – all our training, strategy and talent development aims become clear.

A Business Example

In business the same concepts can apply. Take for example the world of online SaaS businesses. It is not a world I personally know much about but you can use this approach to quickly establish an Intelligent Naive question set that will rapidly help your understanding, and from there develop your data and insights strategy to inform it.

So the flippant answer to What It Takes To Win in SaaS is, of course, that the business must make lots of profit.

The second-order questions might consider the more important question of when, i.e. when does the business need to make lots of profit? What is the trade-off between reinvestment (for growth, and therefore more profit in the future) versus taking the profit now?

You might break that down further by considering the customer in one of your WITTW legs. And to flesh that out, some Intelligent Naive questions you’d probably want specific answers to might include:

  • How costly is it to get new customers?
  • How can we put a price on that? (see the sketch after this list)
  • How precise is that?
  • What does that price include / not include?
  • What is the customer acquisition price sensitive to?
  • In what circumstances?
  • What can we / are we doing about that?
  • What is outside our control?
  • Do we really understand the customer buying journey?
  • What are they influenced by?
  • What are we doing to support their buying decisions?
  • How is this understanding of the customer influencing the product roadmap? The onboarding process? The value proposition?
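
To make those first two questions concrete: the standard way to put a price on acquisition is the customer acquisition cost (CAC) calculation. A minimal sketch, with entirely made-up figures:

```python
# Customer acquisition cost (CAC): total sales & marketing spend divided
# by the number of new customers won in the same period.
# All figures are made up for illustration.
sales_marketing_spend = 120_000  # one quarter's spend, say
new_customers = 400

cac = sales_marketing_spend / new_customers
print(f"CAC: {cac:.2f} per customer")  # -> CAC: 300.00 per customer
```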

Of course, once you have customers you then need to keep them (for quite a while in SaaS!) before they’ve repaid this initial acquisition cost.

My assumption is that most SaaS companies are therefore obsessed by churn (the proportion of customers cancelling or not renewing their subscriptions) and all the other metrics that ultimately indicate how well they keep their customers happy.
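
As a rough illustration of why (again, with invented numbers): compare how many months of gross margin it takes a customer to repay that CAC against the average lifetime implied by the churn rate.

```python
# Back-of-envelope check: months of gross margin needed to repay the CAC,
# versus the average customer lifetime implied by the churn rate.
# All figures are invented for illustration.
cac = 300.0            # acquisition cost per customer (from the sketch above)
monthly_margin = 25.0  # gross margin per customer per month
monthly_churn = 0.02   # 2% of customers cancel each month

payback_months = cac / monthly_margin    # 12 months to repay the CAC
avg_lifetime_months = 1 / monthly_churn  # ~50 months on average

print(f"Payback: {payback_months:.0f} months; "
      f"expected lifetime: {avg_lifetime_months:.0f} months")
# A customer only becomes profitable if they stay well past the payback point.
```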

CHI

A great example from HubSpot, a very successful SaaS company, is “CHI”, which stands for “Customer Happiness Index”. Here is a link to an entertaining video talk by Dharmesh Shah explaining it. He’s a great speaker and it is well worth a watch.

In essence, “CHI” is a second-order metric, like one of the legs in the WITTW model.

You might write it as: “To make high profit over the long term, we need to keep customers happy.”

You’d then establish what it takes to deliver this more specifically and then track that.

What is nice about CHI is that it aims to pull all the sub-parts together into an index.

When you take this approach it is great, as you can improve the index over time as you learn more, i.e. you can keep the same CHI metric reported out to the organisation but constantly improve the make-up of the metric to increase its organisational value.

With indexes you can think of versioning your metrics (just like your code!)
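
As a sketch of what a versioned index might look like in code (the component names and weights here are invented for illustration – they are not HubSpot’s actual CHI formula):

```python
# A minimal sketch of a versioned index metric. The component names and
# weights are invented for illustration -- NOT HubSpot's real CHI formula.

# Each version maps component metrics (scaled 0-1) to their weights.
INDEX_VERSIONS = {
    "v1": {"weekly_logins": 0.5, "features_used": 0.5},
    "v2": {"weekly_logins": 0.3, "features_used": 0.3, "support_tickets_ok": 0.4},
}

def chi(components: dict[str, float], version: str = "v2") -> float:
    """Weighted average of the component scores for the given version."""
    weights = INDEX_VERSIONS[version]
    total_weight = sum(weights.values())
    return sum(components[name] * w for name, w in weights.items()) / total_weight

# The metric name reported to the organisation stays "CHI";
# only the definition behind it changes between versions.
print(chi({"weekly_logins": 0.8, "features_used": 0.6, "support_tickets_ok": 0.9}))
```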

It is very powerful.

And it is a method we’ve used successfully with many sports teams, including motorsport teams.

The metric name stays the same but the content changes as you learn more and improve.

This is advanced stuff, but it has the most potential for high business value as it is more closely linked to a meaningful business outcome.

For now, the takeaway is this. The definition of the role of a data professional is vague. This leads to a poor understanding of business value contribution that is often subjective and frustrating to deal with. To tackle this, I’ve used the WITTW model together with a consultative “Intelligent Naive” questioning approach. I’d recommend doing this prior to looking at any data or technology, to unearth what success looks like for your organisation. Once you’ve done this, then establish the associated data and analysis framework from that.

Great data analysis is always linked to high organisational value and, in my experience, this approach ensures you achieve both.