DATABASE STREAMING

Database streaming, or DB streaming, is a form of data streaming designed to deliver real-time insight that can improve business competitiveness.

What is data streaming? Data streaming is the real-time processing of data from up to thousands of sources, such as sensors, financial trading transactions, e-commerce purchases, web and mobile applications, social networks and many more. By aggregating and analyzing these real-time data streams, enterprises can use database streaming to develop intelligence that improves agility, informs better decisions, fine-tunes operations, improves customer service and lets them act quickly on business opportunities.

Effective database streaming requires a sophisticated streaming architecture and a Big Data solution such as Apache Kafka. Kafka is a fast, scalable and durable publish-subscribe messaging system that supports data stream processing by simplifying data ingestion. Kafka can process more than 100,000 transactions per second, making it an ideal tool for enabling database streaming to support Big Data analytics and data lake initiatives.
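The publish-subscribe model at the heart of Kafka can be illustrated with a minimal in-memory sketch. The class and topic names below are purely illustrative, and real Kafka adds much more: a persistent, partitioned, replicated log, consumer groups and message replay.

```python
from collections import defaultdict

class MiniBroker:
    """Toy in-memory publish-subscribe broker.

    Shows only the topic -> subscriber fan-out pattern that
    Kafka's messaging model is built on; it does not persist
    or replicate messages the way Kafka does.
    """

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver the message to every subscriber of the topic.
        for callback in self._subscribers[topic]:
            callback(message)

broker = MiniBroker()
received = []
broker.subscribe("orders", received.append)        # a consumer of the "orders" topic
broker.publish("orders", {"order_id": 1, "amount": 99.95})
print(received)  # [{'order_id': 1, 'amount': 99.95}]
```

Because publishers and subscribers only share a topic name, sources and consumers stay decoupled, which is what lets a system like Kafka fan one stream of database changes out to many downstream analytics consumers.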

But using Kafka for database streaming can create a variety of challenges as well. Source systems may be adversely impacted. A significant amount of custom development may be required. And scaling efficiently to support a large number of data sources may be difficult. That's where Attunity can help.

Manage database streaming with Attunity Replicate

Attunity provides software solutions that promote heterogeneous data availability through data integration and Big Data management. Offering integration solutions for the industry's broadest array of platforms, we address the challenges of data ingestion and data replication in databases, data warehouses, Hadoop and SAP, as well as Kafka and other real-time messaging systems. Our solutions are designed for data residing on premises or in the cloud, as well as on legacy mainframe systems.

Benefits of database streaming with Attunity Replicate

Attunity Replicate software simplifies real-time data ingestion in Kafka to deliver significant benefits for database streaming.

  • Replicate enables real-time data capture by feeding live database changes to Kafka message brokers with low latency.
  • Administrators can use an intuitive and configurable graphical user interface to easily set up data feeds with no manual coding.
  • Attunity minimizes the impact of database streaming with log-based change data capture technology and a unique zero-footprint architecture that eliminates the need to install intrusive agents, triggers or timestamps on sources and targets.
  • Attunity Replicate provides an architecture and software that can scale to ingest data from thousands of databases.
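Log-based change data capture, mentioned above, reads the database's transaction log and turns each committed change into an event that can be published to a Kafka topic. The envelope below is an illustrative sketch of what such an event might look like; the field names and helper function are assumptions for the example, not Attunity's actual wire format.

```python
import json

def make_change_event(operation, table, before, after):
    """Build an illustrative CDC event envelope.

    operation: "insert", "update" or "delete"
    before/after: row images (None where not applicable,
    e.g. no "before" image for an insert)
    """
    return {
        "op": operation,
        "table": table,
        "before": before,
        "after": after,
    }

# An UPDATE captured from the transaction log becomes one event:
event = make_change_event(
    "update",
    "customers",
    before={"id": 42, "status": "trial"},
    after={"id": 42, "status": "paid"},
)
payload = json.dumps(event)  # the serialized message a producer would publish
print(payload)
```

Because the events are read from the log rather than from the tables themselves, no triggers or timestamp columns need to be added to the source database, which is what keeps the capture low-impact.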

In addition to Kafka, Attunity Replicate enables database streaming to Confluent, Amazon Kinesis, Azure Event Hubs and MapR-ES.