What makes great data analysis?

I had this excellent question this week from a subscriber:

“What should someone on data insights be looking for? What should we aim to do? How do we make sense of little data? How and why should we clean our data before making any observations? … what makes a great data analysis?”

In fact, it is several related questions, so in this week’s article I will begin unpacking them, considering what is important and sharing some experience to help their (and your) thinking on the general subject of data analysis.

Perceptions & Uncertainty

One underlying issue I’ve found lies in our own perceptions, and in our expectations of others who are less technically comfortable than ourselves.

If you are a “numbers” person, you get used to a level of certainty with your work, i.e. 2 + 2 does equal 4. And even when you are not sure, you take comfort from statistical significance instead (at least you have a number).

Uncertainty and subjectivity are therefore unsettling. And unfortunately this can mean “great analysis” for one person is not the same as “great analysis” for someone else. If someone isn’t happy, then what do you do? If you feel threatened by this, it is too easy to blame them, when perhaps you should be looking at your own approaches first.

Of course, you want to get everything you can from the data – it is a challenge, it is solving a puzzle that you know has an answer. If you can do this then surely that is great?

You want to do a great job, but you need others to recognise this, irrespective of their subjective opinions. The proof is in the pudding, etc.

Critical Thinking

Try starting not with the data or technology, but by applying what some people call critical thinking.

Of the ancient Greek philosopher Socrates, it has been written:

“He established the importance of asking deep questions that probe profoundly into thinking before we accept ideas as worthy of belief.”

In other words, apply the same rigour to your problem definitions that you do to your problem solutions.

By applying critical thinking, you determine what would be valuable to know and when. In motorsport, for example, I’m thinking about giving driver feedback.

You can then consider how to use data, analysis and models, to help you.

Too often it is “What can we do with this data?”, rather than, “What is really important to know right now?”


Intelligent Naive Questions

To do this in practice, I recommend taking a consultative approach. Consultants, love them or loathe them, are really very good at this. We can learn from their practice.

I personally call this asking “Intelligent Naive” questions.

  • Do you really understand how things work?
  • Can you clearly define what success looks like for the driver?
  • Have you any idea what sensitivities go into this?
  • What happens if we don’t do this?

What It Takes To Win

To give you an example from my day job: in professional sport we talk about a model of “What It Takes To Win” (WITTW).

You can see an example of the British Judo team’s WITTW model in the image below:

This links everything you’re doing together logically.

For example, what does it take to beat the opposition?

In this sport they need proficiency in fundamental technical and tactical skills. They need to be able to apply this in competition. They need to meet a specific physical profile. Good nutrition and body composition are important. They also need to be strong in the mind.

Like a racing driver really.

They’ve then broken each of these legs down into specifics for Judo, and then broken that down further into things they can train and measure against. But everything is linked.

This is where you are combining unique experience and knowledge into something actionable and clear.

It is these second-order answers you want to focus your own Intelligent Naive questioning around. Once these are clear, you have a robust basis for your data and reporting to inform and support them.
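
To make that linkage concrete, here is a minimal sketch of how such a breakdown might be represented as a data structure. The leg names and measures are my own hypothetical examples, not the actual British Judo model:

```python
# A purely illustrative sketch (not the British Judo team's actual model) of how
# a WITTW breakdown might be represented so that data and reporting can hang
# off it. All leg names and measures here are hypothetical examples.
WITTW = {
    "outcome": "Win the contest",
    "legs": {
        "technical_tactical": {
            "second_order_answer": "Proficient in fundamental skills, applied in competition",
            "measures": ["throw success rate", "scores per contest"],
        },
        "physical": {
            "second_order_answer": "Meets the specific physical profile",
            "measures": ["strength test results", "body composition"],
        },
        "psychological": {
            "second_order_answer": "Strong in the mind",
            "measures": ["pressure training ratings"],
        },
    },
}

# Everything stays linked: each measurable item traces back to a leg,
# and each leg traces back to the outcome the team is trying to achieve.
for leg_name, leg in WITTW["legs"].items():
    for measure in leg["measures"]:
        print(f"{WITTW['outcome']} <- {leg_name} <- {measure}")
```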

Deliver on this and there is no question your work will be described as great analysis – even if it is delivering news people don’t want to hear … “You’re slow, here, here and here … !” 🙂

In sport, once we understand (and agree on this!) we can plan all our activities accordingly – all our training, strategy and talent development aims become clear.


A Business Example

In business the same concepts apply. Take, for example, the world of online SaaS businesses. It is not a world I personally know much about, but you can use this approach to quickly establish an Intelligent Naive question set that will help your understanding, and from there develop your data and insights strategy to inform it.

So the flippant answer to What It Takes To Win in a SaaS business is, of course, that it must make lots of profit.

The second-order questions might consider the more important question of when, i.e. when does the business need to make lots of profit? What is the trade-off between reinvestment (for growth, and therefore more profit in the future) versus taking the profit now?

You might break that down further by considering the customer in one of your WITTW legs. And to flesh that out, some Intelligent Naive questions you’d probably want specific answers to might include:

  • How costly is it to get new customers?
  • How can we put a price on that?
  • How precise is that?
  • What does that price include / not include?
  • What is the customer acquisition price sensitive to?
  • In what circumstances?
  • What can we / are we, doing about that?
  • What is outside our control?
  • Do we really understand the customer buying journey?
  • What are they influenced by?
  • What are we doing to support their buying decisions?
  • How is this understanding of the customer influencing the product roadmap? The onboarding process? The value proposition?

Of course, once you have customers you then need to keep them (for quite a while in a SaaS!) before they’ve repaid this initial acquisition cost.

My assumption is that most SaaS companies are therefore obsessed by churn (the proportion of customers cancelling or not renewing their subscriptions) and by all the other metrics that ultimately indicate how well they are keeping their customers happy.
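
To make the arithmetic behind acquisition cost and churn concrete, here is a minimal sketch in Python; all figures, names and the currency are hypothetical, chosen purely for illustration:

```python
# Hypothetical monthly figures for an illustrative SaaS business, not real data.
customers_at_start = 1_000          # paying customers at the start of the month
customers_cancelled = 30            # of those, how many cancelled during the month
acquisition_spend = 50_000.0        # total sales & marketing spend this month
new_customers = 100                 # customers acquired with that spend
monthly_margin_per_customer = 40.0  # gross margin each customer contributes per month

# Monthly churn rate: the proportion of existing customers you lost.
churn_rate = customers_cancelled / customers_at_start

# Customer acquisition cost (CAC): what it cost, on average, to win one customer.
cac = acquisition_spend / new_customers

# Payback period: how long a customer must stay before repaying their own
# acquisition cost.
payback_months = cac / monthly_margin_per_customer

print(f"Churn rate: {churn_rate:.1%}")               # 3.0%
print(f"CAC: £{cac:,.0f}")                           # £500
print(f"CAC payback: {payback_months:.1f} months")   # 12.5 months
```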

Using Indexes – CHI

A great example from a very successful SaaS company, HubSpot, is “CHI” – which stands for “Customer Happiness Index.” Here is a link to an entertaining video talk by Dharmesh Shah explaining it. He’s a great speaker and it is well worth a watch.

In essence, “CHI” is a second-order metric, like one of the legs in the WITTW model.

You might write it as: “To make high profit over the long term, we need to keep customers happy.”

You’d then establish what it takes to deliver this more specifically and then track that.

What is nice about CHI is that it aims to pull all the sub parts together into an index.

When you take this approach, it is great because you can improve the metric over time as you learn more, i.e. you can keep reporting the same CHI metric out to the organisation but constantly improve its make-up to increase its organisational value.

With indexes you can think of versioning your metrics. The metric name stays the same but the content changes as you learn more and improve. It is very powerful.
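
As a minimal sketch of what a versioned index might look like in code (the component names, weights and function below are my own illustrative assumptions, not HubSpot’s actual CHI definition):

```python
# Illustrative only: a versioned "happiness" index. The components and weights
# are hypothetical assumptions, not HubSpot's actual CHI formula.

# Each version keeps the same metric name but refines what goes into it.
INDEX_VERSIONS = {
    "v1": {"product_usage": 0.6, "support_tickets": 0.4},
    "v2": {"product_usage": 0.5, "support_tickets": 0.2, "feature_adoption": 0.3},
}

def happiness_index(scores, version="v2"):
    """Weighted sum of component scores (each normalised to 0-1), on a 0-100 scale."""
    weights = INDEX_VERSIONS[version]
    return 100 * sum(w * scores.get(name, 0.0) for name, w in weights.items())

# The metric reported to the organisation keeps its name; only the make-up changes.
customer_scores = {"product_usage": 0.8, "support_tickets": 0.9, "feature_adoption": 0.5}
print(f"{happiness_index(customer_scores, 'v1'):.1f}")  # 84.0
print(f"{happiness_index(customer_scores, 'v2'):.1f}")  # 73.0
```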

It is a method we’ve used successfully with many sports teams, including motorsport teams.

In Summary

This is advanced stuff, but it has the most potential for high value as it is more closely linked to a meaningful outcome.

For now the takeaway is this: I’ve used the WITTW model together with a consultative “Intelligent Naive” questioning approach to help teams use data analysis to improve performance.

Great data analysis is always linked to high organisational value and, in my experience, this approach ensures you achieve both.

I wrote this article originally for a more general data analysis audience, but I believe parts are still relevant given the prevalence of data in motorsport. Hopefully you found it interesting.