Connecting Data Applications and Transactions to Kafka in Real Time

How Generali, Skechers and Conrad Electronics Keep Data in Motion with Qlik and Confluent

Most enterprise data resides in traditional databases, locked inside legacy mainframe systems and applications such as SAP, Oracle and Salesforce. Organizations are looking to Apache Kafka to improve their data delivery speeds; however, many still rely on traditional batch replication processes that struggle to ingest the real-time, continuous changes occurring in those databases.

Keeping Data Fresh and Relevant

These challenges can be overcome with real-time change data capture (CDC) technology, which captures changes to data and metadata as they occur. These incremental changes, collected from the source as transactions, can be delivered into Kafka as a stream. CDC combined with Kafka lets you keep your streaming data in motion.
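To make the idea concrete, here is a minimal sketch of the producing side of such a pipeline, using Confluent's Python client (confluent-kafka). The broker address, topic name, event shape and the publish_change helper are all hypothetical illustrations; in practice, a CDC tool such as Qlik Data Integration performs this step for you.

```python
import json

from confluent_kafka import Producer

# Hypothetical broker address, for illustration only.
producer = Producer({"bootstrap.servers": "localhost:9092"})

def publish_change(change: dict) -> None:
    """Publish one captured change event (an insert, update or delete
    emitted by a CDC process) to a Kafka topic."""
    producer.produce(
        topic="orders.cdc",                        # assumed topic name
        key=str(change["order_id"]),               # keying by primary key keeps
                                                   # changes to one row ordered
        value=json.dumps(change).encode("utf-8"),
    )

# Example change event; the envelope shape is an assumption, not Qlik's format.
publish_change({"op": "update", "order_id": 42, "status": "shipped"})
producer.flush()  # block until the broker acknowledges delivery
```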

Once these changes are captured, they must be delivered as transaction data streams. But what if you need to persist the stream of transactions, or the stream needs to be consumed by multiple targets?
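Kafka itself answers both questions: topics are durable, replayable logs, and each consumer group tracks its own offsets, so any number of targets can read the full stream independently. Here is a minimal sketch, reusing the hypothetical orders.cdc topic above, with two made-up consumer groups standing in for an analytics engine and a data lake loader:

```python
from confluent_kafka import Consumer

def make_consumer(group_id: str) -> Consumer:
    # Each group.id tracks its own offsets, so every group receives the
    # complete stream without interfering with the others.
    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",  # assumed broker address
        "group.id": group_id,
        "auto.offset.reset": "earliest",        # replay the persisted log
    })
    consumer.subscribe(["orders.cdc"])          # hypothetical topic from above
    return consumer

analytics = make_consumer("streaming-analytics")  # made-up group names
lake_loader = make_consumer("data-lake-ingest")

msg = analytics.poll(timeout=5.0)
if msg is not None and msg.error() is None:
    print(msg.value())  # the same event is also delivered to lake_loader
```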

As infrastructures grow, they quickly require more complex architectures to support multiple data consumption scenarios. More than 80% of Fortune 100 companies use Kafka. However, Kafka is an open-source technology that demands significant time and highly skilled resources (users with the right technical training and experience). This is where Confluent helps: its managed data-in-motion platform, Confluent Cloud, runs in production at thousands of companies and addresses this common challenge of scaling and supporting real-time data delivery.

Companies like Generali, Skechers and Conrad Electronics are using Qlik and Confluent to get more out of their investment in Kafka, offering better experiences for their businesses and their customers by significantly reducing their data replication times and improving operational efficiencies, thereby saving time and money.

Generali Profile:

Global Insurance Leader, 1M+ Customers

Challenge

  • Traditional data management processes impacting business operations
  • Siloed & inconsistent data
  • Lack of integration across systems

Solution

  • Architect & deploy event streaming based on CDC and Kafka

Result

  • Improved processing speed & efficiency
  • Reduced data replication from days to seconds
  • Lowered cost of data operations

“We can replicate and stream data in just a few seconds. This could have taken days before. It’s a significant value to our business.” – Generali, Dir. of Platform Engineering & Operations

Skechers Profile:

Global Lifestyle & Performance Footwear

Challenge

  • On-prem legacy systems
  • Time-consuming data batch cycles
  • Long data development process

Solution

  • Shorten dev cycles by leveraging CDC and Kafka to hydrate data lakes in real time

Result

  • Improved data visibility for the business
  • Better data cost management
  • Easier to scale infrastructure

“A very important thing to realize is that Qlik makes business visibility, cost management and scalability easier.” – Skechers, Data Engineering & Management


Conrad Electronics Profile:

German Electronics Retailer

Challenge

  • Data silo proliferation
  • Limited business insights due to data access barriers & lack of data accuracy

Solution

  • Migrate SAP data into Kafka
  • Leverage CDC APIs for other data sources
  • Consolidate in Google BigQuery

Result

  • Real-time data delivery
  • Improved customer experience
  • Faster data analysis
  • Time savings from improved automations

“The connection of Qlik to SAP systems and Confluent Kafka works without any problems, which enables us to realize our real-time use cases.” – Conrad Electronics, Head of Big Data Platforms

Qlik and Confluent Enhance Kafka’s Value

When implemented well, CDC enables data integration by replicating database or data source changes with little or no impact on those sources. Qlik Data Integration for CDC Streaming is a simple, low-impact solution for streaming many sources, including databases and mainframes, into Kafka and Confluent in real time.

Qlik and Confluent can automatically produce real-time transaction streams into Kafka, enabling streaming analytics as well as streaming ingestion into data lakes and data warehouse platforms. They also unlock the potential of data locked in legacy systems through integration with microservices environments.
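As a rough sketch of what the consuming end of such a pipeline can look like, the code below applies change events from the stream to a target table with upsert and delete semantics. The JSON envelope is the same hypothetical format used in the earlier sketches, not the format Qlik actually emits, and the in-memory dict stands in for a real warehouse or lake table:

```python
import json

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed broker address
    "group.id": "warehouse-apply",          # made-up consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders.cdc"])          # hypothetical CDC topic

table: dict[int, dict] = {}  # stand-in for a warehouse or data lake table

while True:
    msg = consumer.poll(timeout=1.0)
    if msg is None or msg.error() is not None:
        continue
    change = json.loads(msg.value())
    # Inserts and updates become upserts; deletes remove the row, so the
    # target remains a continuously updated copy of the source.
    if change["op"] in ("insert", "update"):
        table[change["order_id"]] = change
    elif change["op"] == "delete":
        table.pop(change["order_id"], None)
```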

Watch this webinar to hear more about how Generali, Skechers and Conrad Electronics are using Qlik and Confluent to increase Kafka’s value. If you are facing challenges with Kafka, contact us to see how we can help you get the most out of your investment by:

  • Building a modern streaming foundation;
  • Streaming data from mission-critical systems with low latency and low impact;
  • Creating automated data pipelines for advanced streaming analytics use cases; and
  • Automating continuous data streams into Confluent without coding.

Don’t wait: Keeping your data in motion will help keep your company in motion and your customers happy.
