Tuesday, 9 October 2018

Analytical Maturity: why organisations should embed analytics across all departments, so that all departments grow together.

Last year, a friend of mine asked for my thoughts on the HBR article “If Your Company Isn’t Good at Analytics, It’s Not Ready for AI” (1). It was an interesting one and caused me to pause. I started writing a blog post about it but abandoned it halfway. Today the topic is more relevant than ever. So here goes.

If you read some of the thousands of articles about AI, or listen to people’s reactions to it, you would think everything is AI nowadays. But this isn’t the case; why not?

Similarly, the same friend was surprised to learn that organisations such as Tencent have huge teams of data scientists; so does Zhong An (2), by the way (at least 54% of its employees are engineers or technicians). That seems to indicate that the deeper you get into analytics/ML/AI, the more data scientists, machine learning engineers and so on you need, not fewer.

Why is this so? Why can’t an organisation “put all the data in one machine and have all the answers come out”, as one of my bosses once wanted? Can’t everyone just adopt AI?

If you build it, they will come (3)

I love the movie “Field of Dreams”. It’s about a wacky farmer who decides to put all his eggs in one innovation and goes whole hog, uprooting his previous business model and establishing a new one driven by his passion (and a voice in his head).

To me, the most memorable line of the movie is “If you build it, they will come”.

This is often the thinking behind transformation projects, where the idea is to transplant analytics (let alone AI) into an organisation. Whether it comes from a bunch of external consultants, or from hiring a relatively experienced head to build a team internally, the result is most often the same: there are only ghosts on the baseball field.

This is why I consider most insurance company big data labs failures; the men and women in white coats/uniforms play amongst themselves, and life goes on as normal for ordinary folks, like two distinct worlds. So what is the RoI of these labs? I would call this a failure, since I believe the main aim of analytics/”data science” is to generate RoI and benefit all parties: grow the pie so that everyone gets more.

So why do so many attempts at embedding analytics in organisations end in failure? Why do 85% of big data projects fail (4)? The technology is there. Sure, there are messy implementations, broken or choked pipelines, clogged processing engines, extremely dirty data where ETL works only sporadically, or knowledge that disappears with staff attrition… but the harder problems lie elsewhere.
Well, as the article (4) says: “More than 85 percent of respondents report that their firms have started programs to create data-driven cultures, but only 37 percent report success thus far. Big Data technology is not the problem; management understanding, organizational alignment, and general organizational resistance are the culprits. If only people were as malleable as data.”

Even if you build it, they may not come (for longer than the rah-rah show).

Basically, the production of analytical pieces is ‘easy’. Drop me, or any decent analytical person, into a data lake, throw in an SME and a good data engineer (my claim that analytics is a team sport (5)), and we are bound to catch some fish for the client, clean it and cook it for them; but they are unlikely to know how to keep fish in their diet on an ongoing basis unless the people on the client’s team are ready.

What is most often missing is the ability to consume the pieces of analytics in a consistent and ongoing manner.

Consumption of analytical/”data science”/AI output is not as obvious as you may think. And this is the part that most failed implementations have in common (and also why I have consistently refused to join any organisation trying to transform itself in terms of analytics when the role focuses on the IT/production side).

There can only be one, can’t there?

You could argue that it is only necessary to have one good consumer in the organisation: one department adopts analytics/”data science” and shows the benefits, dragging all the other departments along. After all, once a piece of analytics is successful, each head of department can choose to adopt analytics and enjoy the benefits at a much lower risk.

There are two flaws in this argument. Firstly, we are forgetting the ability to consume: wanting to consume is one thing, but being able to (being analytically mature enough) is not a given. Secondly, departments rarely exist in isolation within an organisation. A simple example will illustrate this.

A while ago, I was demonstrating how quickly a selection of customers based on their behavioural similarities could be made and readied for an experiment. I gave up when the customer informed me that it usually takes six months to run a campaign (even a mini one), and that that was the only way to run experiments. An organisation often moves at the pace of its slowest department.
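
For the curious, here is a minimal sketch of what that kind of behavioural selection can look like, assuming Python with pandas and scikit-learn; the file name and the behavioural columns are entirely hypothetical:

    import pandas as pd
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    # Hypothetical customer table; the file and column names are illustrative.
    customers = pd.read_csv("customers.csv")
    features = customers[["visits", "basket_size", "recency_days"]]

    # Standardise the features, then group customers by behavioural similarity.
    X = StandardScaler().fit_transform(features)
    customers["segment"] = KMeans(n_clusters=5, random_state=0).fit_predict(X)

    # Pick one behavioural segment and split it into test and control groups.
    segment = customers[customers["segment"] == 2].sample(frac=1, random_state=0)
    half = len(segment) // 2
    test, control = segment.iloc[:half], segment.iloc[half:]

The point is not the specific algorithm; it is that the selection itself takes minutes, while the organisation around it took months.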

This brings us to organisational analytical maturity.

I will admit that this is a topic that is very close to my heart and mind at the moment (hence the idea to revive the blog post from last year). I fundamentally believe that for an organisation to fully benefit from the advantages provided by analytics, or to eventually become data-driven, it is critical for all parts of the organisation to pull in the same direction, and preferably at the same speed. So how do I define analytical maturity?

To me, the easiest way to understand how mature an organisation is, is to look at the kinds of questions the people within it are trying to answer using data.



The questions that analytics can answer well range from “what has happened?” to “how can we make this happen?”. For simplicity, analytical maturity can be broken into four stages.

Descriptive Stage

The descriptive stage is the first encounter many organisations have with data. It often takes the shape of backward-looking reports: what has happened? How many of item X did I sell last month? This is the stage most organisations will be familiar with.
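
A descriptive question is typically just a count or a sum over a past period. As a rough illustration, in Python with pandas (the transaction table and its columns are hypothetical):

    import pandas as pd

    # Hypothetical transaction table: one row per sale, with a date,
    # an item and a quantity (all names are illustrative).
    sales = pd.read_csv("sales.csv", parse_dates=["date"])

    # "How many of item X did I sell last month?"
    last_month = pd.Timestamp("today").to_period("M") - 1
    sold = sales[sales["date"].dt.to_period("M") == last_month]
    print(sold.groupby("item")["quantity"].sum())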

Diagnostic Stage

After getting the hang of static reports, the next stage is the diagnostic stage, where hypotheses are formed. Questions are asked around “why”, and answering them often requires further slicing and dicing of the data to find potential answers.
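
Continuing the hypothetical sales table from the sketch above, the same data can be sliced by a plausible driver to chase a “why” question:

    import pandas as pd

    # Same hypothetical transaction table as in the descriptive sketch.
    sales = pd.read_csv("sales.csv", parse_dates=["date"])

    # Diagnostic question: why did sales of item X drop? Slice by a
    # candidate driver (region here) across months to localise the cause.
    item_x = sales[sales["item"] == "X"].assign(
        month=lambda d: d["date"].dt.to_period("M")
    )
    print(item_x.pivot_table(index="region", columns="month",
                             values="quantity", aggfunc="sum"))

A dip concentrated in one region, for instance, suggests a local cause rather than a product-wide one.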

Predictive Stage

The predictive stage is when the questions move from looking backwards to looking forwards. While concerns about the future may have been implicit in the diagnostic stage, it is in the predictive stage that specific tools, methodologies and algorithms are employed to uncover what is likely to happen, often how likely it is to happen, and what the drivers of the behaviour are.
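
As a minimal sketch of the kind of tooling involved, a simple logistic regression can answer a forward-looking question, assuming a hypothetical customer table with behavioural features and a known past outcome:

    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Hypothetical customer table with a known outcome column,
    # e.g. whether the customer bought again (all names illustrative).
    customers = pd.read_csv("customers.csv")
    X = customers[["visits", "basket_size", "recency_days"]]
    y = customers["bought_again"]

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression().fit(X_train, y_train)

    # Forward-looking answers: how likely is each customer to buy again,
    # and which behaviours drive that likelihood?
    probabilities = model.predict_proba(X_test)[:, 1]
    print(dict(zip(X.columns, model.coef_[0])))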

Pre-emptive/Pro-active stage

At this more advanced stage, instead of taking certain variables/inputs as given and trying to predict the outcome, the idea is to influence the variables and thereby cause a change in the behaviour/status. Nudging, behavioural economics and game theory are common strategies and approaches here.
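
One way to picture the shift: rather than scoring a customer as-is, score each candidate intervention for that customer and pick the one with the best predicted outcome. A sketch only, with an entirely hypothetical campaign history:

    import pandas as pd
    from sklearn.compose import make_column_transformer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import OneHotEncoder

    # Hypothetical history of past campaigns: behavioural features, the
    # nudge each customer received, and whether they responded.
    history = pd.read_csv("campaign_history.csv")
    X, y = history[["visits", "recency_days", "nudge"]], history["responded"]

    model = make_pipeline(
        make_column_transformer(
            (OneHotEncoder(handle_unknown="ignore"), ["nudge"]),
            remainder="passthrough",
        ),
        LogisticRegression(),
    ).fit(X, y)

    # For a new customer, score every candidate nudge and choose the one
    # with the highest predicted response, instead of taking behaviour as given.
    customer = pd.DataFrame({"visits": [12], "recency_days": [30], "nudge": [""]})
    candidates = ["discount", "reminder", "loyalty_points"]
    scores = {n: model.predict_proba(customer.assign(nudge=n))[0, 1]
              for n in candidates}
    print(max(scores, key=scores.get))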

A simple example, the “drain the swamp” (6) example, can illustrate the differences:

·         Descriptive Stage: How many people voted against me?
·         Diagnostic Stage: Why did these people vote against me?
·         Predictive Stage: Who is that person likely to vote for?
·         Pre-emptive/Pro-active Stage: How do I get that person to vote for me?

It is too easy to underestimate how difficult it can be for people to climb through the stages of analytical maturity; some never get to the pre-emptive/pro-active stage.

I believe that people usually do not want to make their lives harder than they are, hence the best way to make people in various parts of an organisation more analytically mature is to show them direct benefits to themselves. It is about change management.

At eternity’s gate (7)

For organisations with departments whose people are only used to static reports, or are even so busy that they don’t look at the reports at all, making descriptive analytics visual is a natural first step. To anyone interested in making reports relevant to people, creating meaningful dashboards and triggering people to think using numbers, I would recommend the books of Stephen Few (8); I had the opportunity to attend a course by the author a few years ago, would like to think I learnt a lot, and try to follow the guidelines as much as I can.

The great thing about these books is that the principles can be applied using most software, so you can start today.

One of the more logical approaches to (re-)introducing the use of simple reports in an organisation is to take stock of existing reports, gather business requirements, and do a gap analysis. In parallel, or even prior to that, it is good to have special-purpose pieces of work answering specific ad hoc business questions. Once immediate needs are met, the focus can switch to future needs, and the discussion can move more easily to dashboard design and the ability to drill, slice and dice.

Basically, the idea is to use ad hoc analyses and visualisations to encourage people to think about data, and to use data to try to solve their problems, moving from the descriptive stage to the diagnostic stage.

One of the important aspects of the diagnostic stage is a culture of experimentation. Hypotheses can be formed, and maybe even theoretically tested, but true learning comes from actual experimentation; this becomes even more important in the next stage.

Back to the future (9)

The move from backward-looking to forward-looking is a very important one. Creating hypotheses (as in the diagnostic stage) can still be done without, for example, knowledge of statistics, but evaluating them and making inferences requires some statistical knowledge, as does the evaluation of the results of experiments. This is even more so when one moves into the realm of predictive analytics.
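
To make that statistical step concrete: evaluating an experiment usually comes down to asking whether an observed difference could be chance. A minimal sketch, with made-up numbers, using statsmodels:

    from statsmodels.stats.proportion import proportions_ztest

    # Illustrative numbers only: 58 responders out of 1,000 customers in
    # the test group versus 41 out of 1,000 in the control group.
    stat, p_value = proportions_ztest(count=[58, 41], nobs=[1000, 1000])
    print(f"z = {stat:.2f}, p-value = {p_value:.4f}")

A small p-value says the difference is unlikely to be chance; deciding what counts as “small”, and what the test does and does not tell you, is exactly the statistical knowledge this paragraph is about.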


Why statistics? Well, I believe that a working knowledge of maths and stats allows one to understand many of the techniques used for predictive analytics. And I will, as usual, include my favourite data science diagram (10):



Advanced analytics/”data science” is concerned with predictions, and as can be seen above, knowledge of stats/maths is an important component of “data science”.

Once an organisation is comfortable in the world of creating hypotheses and testing them, the next step is to use predictions to guide the ‘best’ course of action. It is important to note that in order to maximise the impact of predictive analytics, the culture of the organisation must have evolved into one of experimentation.

Once the culture of experimentation is established, we have a learning organisation that can become data-driven. Again, it is important that experimentation permeates the organisation, and it is critical to understand that some experiments will not produce the expected results; learning from them is the point, and not learning is the real failure.

Minority Report: A Beautiful Mind (11)(12)

Predictive analytics takes the behavioural variables as given; pre-emptive/pro-active analytics attempts to change the behaviour. This falls in the realm of behavioural economics, game theory, nudging, precogs (11)… Most organisations are not there yet, and there may be some ethical implications (after all, the swamp hasn’t been drained yet, has it?).

In sum, analytical maturity is critical to ensuring the successful adoption of the more advanced tools of analytics/”data science” (to me, AI is a tool); to paraphrase the article quoted earlier (4), people are not ‘malleable’, putty is. So as long as we are dealing with people, change management, bringing people across an organisation up through the stages of analytical maturity, is important.

However, that is not to say that it is impossible for an organisation to leapfrog technologically. One of the interesting aspects of technology is that you do not need to understand it fully to use it to make decisions. As someone said at the Global Analytics Summit in Bali last year (you can find a piece I presented there in a previous blog post (13)), “managers who know how to use data to make decisions will replace managers who don’t”.

Once a technology reaches the bottom of the trough of disillusionment in the hype cycle (14), what brings it back up the slope of enlightenment is that it starts getting applied beyond the purely technical hype; real-life applications are what bring technologies to the plateau of productivity.




In Sum
To me, it is our job as analytics/“data science” practitioners to help organisations progress through the stages of analytical maturity. What about the new technologies to come, you may ask? The answer is that if an organisation is mature enough and has become data-driven, it will naturally seek to adopt new technologies and compete with data.

So, to answer my friend: yes, if an organisation is not doing analytics, it can’t simply adopt AI. However, it does not necessarily take that long to learn and become analytically mature, as long as there is a framework and commitment throughout to do so. And I would like to add that I certainly believe in technological leapfrogging; I am betting on it.


  1. https://hbr.org/2017/06/if-your-company-isnt-good-at-analytics-its-not-ready-for-ai
  2. https://asia.nikkei.com/Business/Chinese-online-insurer-leaves-traditional-rivals-in-the-dust
  3. https://en.wikipedia.org/wiki/Field_of_Dreams
  4. https://www.techrepublic.com/article/85-of-big-data-projects-fail-but-your-developers-can-help-yours-succeed/
  5. http://thegatesofbabylon.blogspot.com/2018/08/if-you-dont-have-phd-dont-call-yourself.html
  6. https://www.tampabay.com/florida-politics/buzz/2018/03/20/and-i-was-in-florida-with-25000-people-going-wild/
  7. https://www.imdb.com/title/tt6938828/
  8. https://www.goodreads.com/book/show/336258.Information_Dashboard_Design
  9. https://www.imdb.com/title/tt0088763/
  10. http://drewconway.com/zia/2013/3/26/the-data-science-venn-diagram
  11. https://www.imdb.com/title/tt0181689/
  12. https://www.imdb.com/title/tt0268978/
  13. http://thegatesofbabylon.blogspot.com/2018/01/
  14. https://en.wikipedia.org/wiki/Hype_cycle
