Two weeks ago, we ran the annual market trends webinar. As I’m dealing with the aftermath, or “postmortem”, I’m still overwhelmed by the response to it. It clearly struck a chord (or nerve), as shown by record attendance numbers and the sheer volume of interaction, both online and off. There were far too many questions, over 300, to cover in a single blog post.
If you listen to the webinar recording, you’ll see many of them displayed and answered. What I thought I’d do here is take the opportunity to define what I meant by postmodern analytics. To understand that, it’s worth rewinding and looking at what analytics platforms have looked like historically. Many of you have heard of the traditional and modern analytics platforms, as defined by Gartner.
Traditional BI platforms were the norm about 10 years ago. They were designed to support the development of IT-produced analytic content. Accessing their analytic capabilities required specialized tools and skills, significant upfront data modeling and a predefined metadata layer. It was all about the absolute truth of the author, a skilled “priesthood”.
Then came data discovery, which, as it scaled out, was later renamed the modern BI and analytics platform. This paradigm was, and is, more about IT-enabled analytic content development, defined by a self-contained architecture that enables nontechnical users to autonomously execute full-spectrum analytic workflows, from data access, ingestion and preparation to interactive analysis and the collaborative sharing of insights. The “truth” in the data became subjective to the reader, who was often a business analyst, app developer or power user.
What I’m proposing is that these two platforms, as they stand, aren’t equipped to meet the needs of the future, where organizations and society are heading. Guided by the four tenets of 3rd generation BI, 1) data democratization, 2) embeddability everywhere, 3) AI as a driver for data literacy, and 4) an open platform approach, it should be about more than just augmented analytics. It should be an augmented insight system that enables the discovery, generation, sharing and delivery of insights across all your data and all your people.

What distinguishes this system is that it’s bi-directional, i.e. inclusive of everybody, and it thrives on participation. Proactiveness is key. Common, auto-discovered or “crawled” catalogued data models will ensure that the data is easy to access, in a performant way, when you need it. It’s a distributed architecture, reflecting how environments look today, where workloads, data, processes and usage can stay where they are. Analytics is only an interim means to an end; the fabric of the system accelerates insight delivery and action as the ultimate outcome.

The postmodern system will give everyone the tools and the power to make critical decisions, and to be critical of the information they are being fed. It will be wide enough and diverse enough, with expertise embedded in it, to support the critical thinking needed to interrogate, investigate and truly understand what you are reading and seeing, and to work with it. It moves from reporting, to analysis, and on to an augmented insights system that thrives on participation.

In relation to this, the overarching question I’ve gotten a few times is, “As we move from truth to trust, is truth no longer worth pursuing?” It absolutely is. Think Wikipedia. The truth isn’t something that should be dictated by one entity. Rather, it’s something we should aspire to arrive at, in the context of where we are. A postmodern analytics system should help build more trust in the pursuit of truth because of the fabric of people, data and analytic services that surrounds it.