In “The Second Machine Age,” Brynjolfsson and McAfee note the rapid ascent of AI capabilities that, just a few years ago, were thought decades away from reality: self-driving cars, music composition, victory on Jeopardy. They describe how this is now possible because of three properties unique to the data economy:
- Digital—Data is the fundamental resource in the new economy, and it can be replicated and delivered anywhere in the world instantaneously.
- Exponential—Information technologies have doubled in power every 2 years for the past 50 years (Moore’s law), and we can now afford to apply complex analytics to everything we do.
- Combinatorial—New digital products can be created by combining existing products, allowing extreme personalization and optimization.
Most important, these properties are symbiotic, creating an accelerating vortex that spits out innovation at a shocking rate.
This virtuous cycle of innovation depends on data, metadata, and AI working in concert to create a system that gets smarter over time. And the surprising thing is that they need to interact more as an orchestra than a factory.
Data represent facts; metadata represent the story
It is popular to draw the pipeline from raw data to intelligent insights as a one-way flow, starting with raw data sources, through stages of refinement and preparation, into BI and machine learning algorithms that produce insights. But this “data factory” model misses the biggest insight from human intelligence: prediction relies on comparing the current situation to a vast store of memories, and the outcome of our actions goes back in the memory bank. It’s not what we predict, but what we remember that gives us valuable experience.
In computer systems, memory is not just the historical data; it’s the metadata providing context, organization, and nuance to the data. Where did this data come from? Who is using it? What does it represent? Can I trust it? The data tells the facts; the metadata tells the story.
In fact, I view predictive models and other types of machine learning as metadata. They are a concise summary of relationships among data—“if the values of these variables are combined via this algorithm, they infer the likely value of this other variable with x confidence.” So as data is used to drive decision-making and insights, metadata stores what is learned—what works, when to use it, what is still uncertain—and this system gets smarter. One of our customers is using a broad set of models as a data source to create a “meta-model” to optimize across all of them. The recursion runs deep, turtles all the way down.
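The idea of a model as a concise, self-describing summary of relationships can be made concrete. Below is a minimal sketch, not any customer's actual system: each registered model carries its own metadata (inputs, provenance, measured confidence), and a toy "meta-model" optimizes across them with a confidence-weighted average. All names (`ModelRecord`, `registry`, `meta_predict`) are illustrative.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class ModelRecord:
    name: str
    predict: Callable[[dict], float]  # the model itself: features -> prediction
    inputs: List[str]                 # which variables it combines
    source: str                       # where its training data came from
    confidence: float                 # measured accuracy on held-out data

registry: Dict[str, ModelRecord] = {}

def register(record: ModelRecord) -> None:
    registry[record.name] = record

def meta_predict(features: dict) -> float:
    """A toy meta-model: confidence-weighted average over all registered models."""
    total = sum(r.confidence for r in registry.values())
    return sum(r.confidence * r.predict(features) for r in registry.values()) / total

# Two toy models over the same prediction target
register(ModelRecord("linear", lambda f: 2.0 * f["x"], ["x"], "warehouse.sales", 0.9))
register(ModelRecord("flat",   lambda f: 5.0,          [],   "manual.baseline", 0.1))

print(meta_predict({"x": 3.0}))  # 5.9 — pulled toward the higher-confidence model
```

The point of the sketch is that the registry itself is metadata: the predictors, their lineage, and their track records live side by side, so a meta-model can reason over them as data.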
Achieving continuous acceleration
Next-generation financial firms like Ant Financial are building this ecosystem into every business process. Launched in 2014 by Alibaba, Ant is taking advantage of their greenfield opportunity to design a large-scale, digital business from the ground up. Yuan (Alan) Qi, a vice president and chief data scientist at Ant, says the company’s AI research is shaping its growth. “AI is being used in almost every corner of Ant’s business,” he says. “We use it to optimize the business, and to generate new products.” Their innovations include using new data streams, such as social networking data, to assess the creditworthiness of someone who does not have a bank account. That market alone is huge: 2 billion people worldwide. Only a fully automated, data- and AI-driven process can serve it affordably, because over half of that market consists of people with too few assets for a traditional bank account.
The move to a data-metadata-AI ecosystem also changes the makeup of the management team. Ant recently added Michael Jordan, a UC Berkeley professor and expert in statistics and machine learning, to its scientific board. Along with TD Bank, other traditional banks are getting into the fray. Royal Bank of Canada (RBC) is setting up AI labs in Toronto and Edmonton, as well as one in Montreal. Last January, RBC hired AI pioneer Dr. Richard S. Sutton as an academic adviser.
To achieve the continuous acceleration of the second machine age, the data ecosystem has to be built to automatically capture this virtuous cycle in metadata. Every action is recorded, outcomes are stored, and history becomes our teacher. This is not a solo performance: organizations have to act like orchestras, capturing their shared experiences and collaboration in memory and drawing on each other to reach their peak performance.
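The loop of "record every action, store the outcome, let history teach" can be sketched in a few lines. This is an illustrative toy, not a prescription: predictions and their eventual outcomes go into a log, and that log (the memory) is what re-estimates how much to trust the model going forward. The names (`record`, `trust`) and the error-to-trust formula are assumptions for the example.

```python
from statistics import mean

history = []  # the memory bank: (prediction, actual_outcome) pairs

def record(prediction: float, outcome: float) -> None:
    """Every action and its outcome goes back into memory."""
    history.append((prediction, outcome))

def trust() -> float:
    """Trust score in (0, 1]: shrinks as the average absolute error grows."""
    if not history:
        return 0.0
    avg_error = mean(abs(p - o) for p, o in history)
    return 1.0 / (1.0 + avg_error)

record(10.0, 10.0)  # a perfect prediction
record(8.0, 9.0)    # off by one
print(round(trust(), 3))  # errors 0.0 and 1.0 -> mean 0.5 -> trust 1/1.5 = 0.667
```

The system "gets smarter" only in the narrow sense the article describes: the metadata (here, the trust score derived from logged outcomes) improves with use, even though the underlying model never changes.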
Michael Rhodes, TD Bank’s group head of innovation, technology and shared services, said, “We’ve made enormous investments in data and data infrastructure over the past several years.” The combination of their rich data assets, agile data platform, and now an AI engine will give them a quantum leap in insight and innovation, and make them a formidable competitor in the data economy.