Industry Viewpoints

Calibrating For Crisis – Here Are the Data Trends Helping Businesses Thrive In 2023

Dan Sommer

5 min read

Whether it's inflation, global conflict, or supply chain disruption, 2022 has been full of significant events that have fundamentally reshaped the business landscape. The undercurrent is an accelerating de-globalization process. But this isn't just geo-political in nature; the re-distribution of power and the fragmentation of data are inextricably linked.

As a result, data-driven businesses are having to adapt to a more distributed world – fast. Whether it's because data professionals are harder to recruit, regulation is becoming more fragmented, or data is migrating to the edge, this shift presents a range of challenges and opportunities.

I had the privilege of joining Joe DosSantos on the Data Brilliant podcast again this year, where we discussed the implications of this increasingly ‘multi-polar world’. It was clear from our conversation that we’re going to have to find a way to deal with this trend much better than we have to date, particularly because businesses are expected to do more than ever with data amidst more change and resource constraints. We have to calibrate for crisis.

Calibrating the Decision

When we think about calibrating the decision, this is about aligning your business behind a modern way of solving a problem that is worth solving. Although many organizations have infrastructure in place to support real-time decision-making, the discipline is yet to reach its potential. It will be even more crucial next year – whether to move quickly in response to market changes, build contingency plans for potential crises, or capitalize on the opportunities presented by data processing at the edge.

Supply chain disruption has driven the need for real-time information, but the same need exists on a much smaller scale. For example, Netflix makes recommendations based on viewing preferences. This might seem simple, but a great deal of real-time data goes into achieving that accuracy. It's no longer enough to make decisions in a day, a week, or a month; some decisions must happen in the moment, or they become irrelevant.

This brings us to the second trend we discussed – pairing data velocity with decision velocity. The more data you have, the more repeatability you have, and the more opportunities there are to automate decisions. It's all about shortening the time between data and decision. If you can do this for thousands of employees across an organization, it will have a huge impact.

Calibrating the Integration

Once an organization has calibrated its decision, it needs to calibrate the integration. That is to say, define the technology that supports decision-making and moves data to where it needs to be accordingly. Against the backdrop of de-globalization, a distributed world, and all the investments we’ve made to keep the lights on during the pandemic, we now have to play catch up around connected governance, responsiveness, and rising cloud costs. In this context, business leaders want to keep things simple, buying into platforms rather than engaging with a ‘wild west’ of varied data companies.

Adapting data infrastructure so it can connect data from disparate sources that are potentially governed by different rules will be a defining goal of 2023. This has given rise to the concept of the data fabric, or as Joe and I discussed, the 'X Fabric.' We spoke about the need to have this 'semantic layer' of technology not only for distributed data, but also for applications, which are increasingly being built by regular business users. As businesses move towards the ultimate goal of universal metadata that everyone in the organization can access, the 'X Fabric' concept is emerging as a way to ensure governance for both the data and the analytics processes that underpin this architecture.

Elsewhere, we expect to see AI move deeper into the data pipeline. Imagine a world where analysts and other business users could spend more time analyzing data rather than preparing it. AI is making this possible by removing rote tasks and finding value in data sets you hadn't even thought to look for. This will make the most of skilled employees' time and keep them engaged.
