Active Intelligence Magazine

Schneider Electric: Navigating the fog

Optimizing data for analytics is helping Schneider Electric adapt to a world embracing sustainability

Martin Veitch

Editorial Director, IDG Connect

Schneider Electric is at the epicenter of the global shift to sustainability and digitization, so it’s logical that the company has put data at the heart of mapping its next moves, large and small. That suits Clint Clark, Vice President, Finance Performance Systems and Data, Global Finance, who is helping the company to fortify and integrate data pipelines to accelerate smart choices.

“Data is the power that determines how bright the signal is in the fog of uncertainty,” he says when we talk by video call. “When you build a pipeline that’s robust and has a strong current that’s real-time, you can illuminate those signals and people can make better decisions quicker. It allows us to be much more responsive to the ‘as is’ state and the changing currents.”

In finance, data can be used to deliver “truth-telling facts to help have uncomfortable conversations or to support and defend strategic initiatives”, explains Clark. But building an optimal data-handling culture isn’t easy. One challenge he identifies is the “tragedy of the commons”, the economists’ term for situations in which individuals acting in their own interest erode a shared resource rather than sustain it. There is also “the moral hazard of people accepting data when it helps to highlight their point or trying to dismiss the data when it does not support their view of the world”.

Clint Clark on …

Managing by hunch

“System 1 (intuitive) thought processes add value and teach you how to take heuristic shortcuts but when you have changes in your underlying assumption, the only way you’re going to see the groundswell is through the data.”


Measuring the right things

“The gap between customer expectations and where you measure yourselves can lead you down the wrong road. Sometimes it’s about poking around the data and asking yourself ‘what if’ to see if your fundamental underpinnings still hold true.”


Data mesh and DataOps

“A data mesh of domain-empowered teams coupled with DataOps where the data pipeline and analytical solution is embedded inside functional teams to empower them is the best practice I’ve seen.”

Data literacy

“To say that everybody is going to be a data scientist and understands how to tune hyperparameters is not a scenario we will ever live in. People will have different maturity levels.”

Analysis paralysis

“The downside [of modern data volumes] is that people can be paralyzed by the amount of information and understanding which of these are important signals and which are just noise.”

Clint Clark is Vice President, Finance Performance Systems and Data, Global Finance at Schneider Electric

Data is enormously powerful, but it needs corralling and handling with care. Logging activities and data governance are constant challenges. It can also be easy to use data to show you’re hitting targets, but do those targets tally with what customers want?

And then there’s the question of how reliable the core data being analyzed is. Decision-makers need to understand what to do with bad data or when something goes wrong in the process, Clark says. Building a robust data catalog is important for discovery, but it needs to work in tandem with transparency about the status and quality of the data.

Another ‘gotcha’ is bias. “You have to be willing to try to understand your hidden assumptions and hidden biases when they might pop up inside of the data,” Clark explains. But when data repeatedly illustrates you have an accurate view of the world and feeds great decisions, it becomes part of the fabric of a company.

And, after all the hard work of cleaning, integrating, analyzing and making decisions based on data, the prizes are large. Clark provides an example: in its North American finance operation, Schneider has created a toolkit that, for several years, has delivered forecasts that proved to be correct to within one percent. “When you can demonstrate that sort of predictability, and people know you’re going to deliver what you’ve said you’re going to deliver, that builds a lot of trust,” he says.
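What “correct to within one percent” means can be made concrete with a simple absolute-percentage-error check; the figures below are hypothetical, since the article does not publish Schneider’s numbers:

```python
# Illustrative check of forecast accuracy "within one percent":
# absolute percentage error of each forecast against the realized actual.
# All figures are made up for the sake of the example.

def pct_error(forecast: float, actual: float) -> float:
    """Absolute percentage error of a forecast against the actual outcome."""
    return abs(forecast - actual) / abs(actual) * 100

# (period, forecast, actual) in $M -- hypothetical values
periods = [
    ("Q1", 102.0, 101.4),
    ("Q2", 98.5, 99.2),
    ("Q3", 110.0, 109.3),
]

errors = {p: pct_error(f, a) for p, f, a in periods}
assert all(e < 1.0 for e in errors.values())  # every period within one percent
```

The point of such a check is the trust-building Clark describes: a forecast that is auditable against actuals, period after period, is one people stop second-guessing.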

Clark believes that today’s data leaders enjoy powerful new opportunities produced by cloud computing, the Internet of Things, graph databases and other new tools. As an example, he says: “Qlik’s change data capture allows us to pull data from our source systems at a lower cost than historical solutions with greater consistency, and we get the benefits of streaming at the same time. It’s just the best of all worlds.”
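The appeal of change data capture is that consumers replay a stream of row-level change events instead of repeatedly re-extracting whole tables. The sketch below illustrates the general idea only; the event format and `apply_change` helper are hypothetical and are not Qlik’s actual API:

```python
# Generic sketch of change data capture (CDC): a downstream replica is
# kept in sync by applying a stream of insert/update/delete events,
# rather than by re-pulling the full source table. Event shape is invented.

def apply_change(target: dict, event: dict) -> None:
    """Apply one CDC event to an in-memory replica keyed by primary key."""
    op, key = event["op"], event["key"]
    if op in ("insert", "update"):
        target[key] = event["row"]
    elif op == "delete":
        target.pop(key, None)
    else:
        raise ValueError(f"unknown operation: {op}")

replica: dict = {}
change_stream = [
    {"op": "insert", "key": 1, "row": {"account": "A-100", "balance": 500}},
    {"op": "update", "key": 1, "row": {"account": "A-100", "balance": 450}},
    {"op": "insert", "key": 2, "row": {"account": "A-200", "balance": 90}},
    {"op": "delete", "key": 2},
]
for event in change_stream:
    apply_change(replica, event)

print(replica)  # only the current state of key 1 survives
```

Because only deltas cross the wire, the replica stays close to real time at a fraction of the cost of full reloads, which is the “streaming at the same time” benefit Clark points to.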

Clark has also benefited from Schneider’s encouragement to be entrepreneurial and to try new things. “A lot of times, that meant the first thing fails,” he says. “The guidance I give to my team is ‘I expect you to fail; failure is your best lesson… just don’t fail at the same things for the same reasons repeatedly’.”

Ultimately, Clark says, data is “the lighthouse in the fog: you can make it to shore without it, but you may have to wait until the fog has lifted or it’s daylight”. He adds: “I love that it changes so quickly and that it keeps me up at night learning. I’m always seeing new things and saying ‘ha, that’s interesting’… and two, three months down the road you say ‘a-ha, that might be the solution’.”

Written By

Martin Veitch

Editorial Director, IDG Connect

Martin Veitch is an experienced business and technology journalist and is currently Contributing Editor to IDG Connect. He has edited publications including CIO, ZDNet and IT Week, and specializes in writing in-depth interviews with industry leaders including Michael Dell, Steve Ballmer and Scott McNealy.
