Data Ingestion to Kafka and Streaming Platforms

Publish live transactions to modern data streams for real-time insights

Challenge

You can create new business value by streaming database transactions into Kafka, Amazon Kinesis, Azure Event Hubs and other streaming systems. This enables advanced use cases such as real-time event processing, machine learning and microservices.

The challenge is unlocking this value by replicating database updates to message streams - at scale - without cumbersome scripting or production impact.

Solution

Attunity Replicate® addresses this challenge with change data capture (CDC) technology that provides efficient, real-time, low-impact replication from many source databases at once.

Real-time and low impact

With Attunity Replicate, IT organizations gain:

  • Real-time data capture. Feed live database updates to message brokers with low latency.
  • Agent-less solution. Our log-based change data capture architecture eliminates the need for software agents on source systems and does not impose additional database performance overhead.
Kafka and big data integration

  • Metadata updates. Support source schema evolution and integrate with schema registries.
  • Universal access. Feed message brokers that stream to sinks such as Hadoop, S3, Hive, Cassandra and MongoDB. A consumer sketch follows this list.
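
To illustrate what a downstream application might do with these feeds, the following minimal sketch consumes change events from a Kafka topic populated by a CDC pipeline. The topic name, broker address and JSON message layout are assumptions for illustration only, not the product's actual output format; the sketch uses the open-source kafka-python client.

    import json
    from kafka import KafkaConsumer

    # Subscribe to a topic that a CDC pipeline publishes change events to.
    # Topic name and broker address are placeholders.
    consumer = KafkaConsumer(
        "orders.cdc",                        # hypothetical topic per source table
        bootstrap_servers="localhost:9092",
        group_id="analytics-demo",
        auto_offset_reset="earliest",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    )

    for message in consumer:
        event = message.value
        # Assumed envelope: an operation type plus the changed row's columns.
        op = event.get("operation", "unknown")   # e.g. INSERT / UPDATE / DELETE
        row = event.get("data", {})
        print(f"{op}: {row}")

The same consumer pattern applies to downstream sinks: a stream-processing job or connector reads the change events from the broker and applies them to Hadoop, S3, Hive, Cassandra, MongoDB or another target.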

“Attunity is an important partner for both Confluent and the broader Kafka community. Their technology simplifies integration with Kafka, enabling customers to more quickly derive greater business value from their data with less effort.”

VP Business Development at Confluent, the company founded by the creators of Apache Kafka

Simple and high scale

  • No scripting. Rapidly configure, manage and monitor data flows with no manual scripting.
  • Scale. Support hundreds of sources, topics and targets.

Learn more about Attunity Replicate today.