Resource Library

Qlik
Analyst Report
We lead in our industry so you can lead in yours. See why Gartner named Qlik a Leader for the 9th year in a row.
The BI Survey 2019
Analyst Report
Learn why BI users voted Qlik #1 for customer experience and performance satisfaction in the world’s largest BI survey, within the large international BI vendors peer group.
Qlik eBook - 2020 Data & BI Trends
eBook
The data and BI landscape is changing – fast. What’s next, and how will it impact you? In this eBook, Qlik® reveals the most important emerging trends.
On-Demand Webinar

How does AXA Belgium Replicate its operational data into its Data Lake?

To get value from your data, you must first be able to access it. That can be difficult when data is distributed across different systems and silos, which is why many companies choose to bring it together in a data lake. However, without governance or stewardship, these data lakes quickly turn into swamps and become a real brake on innovation. Join AXA Belgium to discover their journey from data silos to insights with Attunity and Cloudera. You will learn:
- What AXA's IT landscape looked like and the challenges it posed for the company
- How Attunity has simplified unlocking data from existing systems and integrating it into the data lake
- How Cloudera made it possible to transform the data lake into a real data hub
- How Attunity and Cloudera together generate value for AXA's insurance use cases

On-Demand Webinar

The Importance of DataOps in a Multi-Cloud World

There’s no denying that the cloud has evolved from an outlying market disruptor into a mainstream method for delivering IT applications and services. In fact, it’s not uncommon for enterprises to use the services of more than one cloud at the same time. However, while a multi-cloud strategy offers many benefits, it also increases data management complexity and consequently reduces data availability. This webinar defines DataOps and explains why it’s a crucial component of every multi-cloud approach.

On-Demand Webinar

Real-Time Data Pipeline Automation for Databricks

Attunity offers solutions for creating analytics-ready data sets on the Databricks Unified Analytics Platform. They are designed to automate streaming data pipelines so that data is seamlessly available to accelerate machine learning (ML), artificial intelligence (AI) and data science initiatives. Watch this on-demand webinar with experts from Databricks and Attunity to learn how your organization can benefit.
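
The webinar itself contains no code, but one common shape for such a pipeline can be sketched with PySpark Structured Streaming on Databricks: read change events from a Kafka topic and append them to a Delta table so they are analytics-ready. The broker address, topic name and storage paths below are illustrative assumptions, not details from the webinar.

```python
# Minimal sketch of a streaming pipeline on Databricks (PySpark Structured Streaming).
# Assumes the Kafka connector and Delta Lake are available (they are on Databricks).
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("cdc-to-delta").getOrCreate()

# Read change events published to Kafka (for example by a CDC tool) as a stream.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # assumption
    .option("subscribe", "orders_changes")             # assumption
    .option("startingOffsets", "latest")
    .load()
)

# Keep the payload as a string column; downstream jobs can parse the JSON.
events = raw.select(col("key").cast("string"), col("value").cast("string"))

# Append the events to a Delta table so ML and BI workloads can query them.
query = (
    events.writeStream.format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/tmp/checkpoints/orders_changes")  # assumption
    .start("/tmp/delta/orders_changes")                               # assumption
)

query.awaitTermination()
```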

On-Demand Webinar

Learn how Aggreko are delivering business value by using data to drive insight in Microsoft Azure

In this webinar, you will see how Aggreko, a world-leading provider of mobile modular power, temperature control and energy services, has designed an effective technology stack in Azure to accelerate the delivery of value using Microsoft and Attunity products. Topics include:
- How to drive business value from data to support business strategy
- Designing an effective data architecture to support insight-driven decision making
- Assessing and utilising the right technologies in the cloud to accelerate delivery of value
- Considerations, challenges and lessons learned along Aggreko's data journey

On-Demand Webinar

Maximizing the Value of Real-Time Data in Azure SQL Data Warehouse

Attunity and Microsoft have a solution to help data-driven organizations get more value from data. This webinar will explain how to:
- Implement a fully managed cloud data warehouse for enterprises, combining fast query performance with industry-leading data security
- Raise the bar on cloud data warehouse price-performance, as measured by industry analyst GigaOm
- Integrate real-time data into Azure SQL Data Warehouse (ADW) from the industry's broadest range of sources using change data capture (CDC) technology
- Automatically generate ETL code to help data architects rapidly configure and manage ADW

On-Demand Webinar

Express Scripts: Driving Digital Transformation from Mainframe to Microservices

Express Scripts is reimagining its data architecture to bring best-in-class user experience and provide the foundation of next-generation applications. The challenge lies in the ability to efficiently and cost-effectively access the ever-increasing amount of data. This online talk showcases how Apache Kafka® plays a key role within Express Scripts' transformation from mainframe to a microservices-based ecosystem, ensuring data integrity between two worlds. It will discuss how Attunity's change data capture (CDC) technology is leveraged to stream data changes to Confluent Platform, allowing a low-latency data pipeline to be built.
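
The talk is a case study rather than a tutorial, but the consuming end of the pattern it describes can be sketched with the confluent_kafka client: a microservice reads row-level change events from a Kafka topic on Confluent Platform that a CDC tool keeps populated. The broker address, consumer group and topic name below are assumptions.

```python
# Sketch: consuming CDC change events from Confluent Platform (Apache Kafka).
# Broker address, group id and topic name are assumptions for illustration.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # assumption
    "group.id": "orders-microservice",       # assumption
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["mainframe.orders.changes"])  # assumed topic fed by a CDC tool

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        # Each message carries one row-level change (insert/update/delete).
        change = json.loads(msg.value())
        print(change.get("op"), change.get("table"), change.get("key"))
finally:
    consumer.close()
```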

On-Demand Webinar

5 Key Requirements for DataOps Success

In this webinar, we will examine the five key steps to be successful with DataOps, including the process and cultural shift required. We will also discuss the results of enabling DataOps success, such as improved productivity, streamlined and automated processes, increased output and higher collaboration across teams. CDOs, data architects, data engineers and analytics teams should attend to learn how to:
- Better manage data flow across the data lifecycle, from ingestion to provisioning to analytics
- Derive tips from use cases involving data lakes, cloud and data warehousing for better business insights
- Increase collaboration, productivity and business value
- Leverage new technology and approaches to enable data pipeline automation

On-Demand Webinar

Getting Started with Apache NiFi Across Industries

Apache NiFi is an easy-to-use, powerful, and reliable system to process and distribute data. It provides an end-to-end platform that can collect, curate, analyze, and act on data in real time, on-premises or in the cloud, with a drag-and-drop visual interface. It's being used across industries on large amounts of data that had been stored in isolation, which made collaboration and analysis difficult. Join industry experts from Hortonworks and Attunity as they explain how Apache NiFi and streaming CDC technology provide a distributed, resilient platform for unlocking the value of data in new ways. During this webinar, you will:
- Discover the advantages of Apache NiFi
- See how to create a NiFi dataflow
- Learn how industries like yours are using NiFi
- Accelerate data replication
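
NiFi dataflows are normally built in the drag-and-drop UI, but everything the UI does goes through NiFi's REST API, which matters once a flow needs to be automated. As a rough, hedged sketch, the Python snippet below uses the requests library to schedule all components in a process group to RUNNING; the host, port and process-group id are assumptions for a local, unsecured NiFi instance and are not taken from the webinar.

```python
# Sketch: starting a NiFi process group via the NiFi REST API.
# Host, port and process-group id are assumptions; adjust for your instance.
import requests

NIFI_API = "http://localhost:8080/nifi-api"        # assumption: unsecured local NiFi
PROCESS_GROUP_ID = "0123-example-process-group"    # assumption: id copied from the NiFi UI

# Schedule every component in the process group to RUNNING.
resp = requests.put(
    f"{NIFI_API}/flow/process-groups/{PROCESS_GROUP_ID}",
    json={"id": PROCESS_GROUP_ID, "state": "RUNNING"},
)
resp.raise_for_status()
print("Scheduling request accepted:", resp.status_code)
```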

On-Demand Webinar

Generali real-time data streaming in the Insurance Industry - under 10 seconds from source to target!

Market leaders in every industry are beginning to see the power of real-time big data integration and how connected data can enable them to re-imagine their markets, customers, products, and business models. Christian Nicoll, Director of Platform Engineering & Operations at Generali Switzerland, guides us through their journey of setting up an event-driven architecture to support their digital transformation project.

On-Demand Webinar

Automating Modern Data Lake Pipelines

On-premises and cloud data lakes are popular for enabling analytics, big data storage, self-service data practices, and warehouse modernization. However, data management challenges can often make finding value extremely difficult. In fact, when data lakes first entered the market, many organizations simply dumped data into the lake, transforming them into entities more akin to swamps that were nearly impossible to leverage, navigate, or trust. The uncontrolled and undocumented swamp was the first generation of a data lake that had many failings, but it also provided numerous learning opportunities. Second-generation data lakes have better internal organization, are typically governed better, and make use of modern ingestion technologies that support all forms of data and metadata integration. In addition, they leverage automated data pipelines as a best practice. You will learn:
- The evolution and necessity of the modern data lake
- Options for efficient, real-time data ingestion at scale
- The importance of data pipeline automation

On-Demand Webinar

Modernizing Data Architecture With Streaming CDC

Expanding analytics requirements have increased the appetite for massive data volumes. However, these data flows can create bottlenecks, preventing timely and modern analytics innovations such as machine learning. This is where streaming change data capture (CDC) comes in. By using CDC to power next-generation streaming environments, we can reconfigure data architectures to enable efficient, scalable and real-time data integration that protects production workloads. In this webinar, we discuss the market need for streaming CDC, how it differs from other techniques, and use cases and case studies.
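
The webinar is conceptual, but the core idea of CDC is easy to illustrate: instead of re-reading entire source tables, a consumer applies a stream of row-level change events (inserts, updates, deletes) to a target. The event format in the sketch below is a simplified assumption, loosely modeled on what CDC tools typically emit.

```python
# Simplified illustration of applying CDC change events to a target table.
# The event structure is an assumption, loosely modeled on typical CDC output.
from typing import Any, Dict, List

def apply_changes(target: Dict[Any, Dict[str, Any]], events: List[Dict[str, Any]]) -> None:
    """Apply row-level change events to an in-memory 'table' keyed by primary key."""
    for event in events:
        key = event["key"]
        op = event["op"]
        if op in ("insert", "update"):
            target[key] = event["row"]   # upsert the latest row image
        elif op == "delete":
            target.pop(key, None)        # remove the row if present

# Example: three change events captured from a source system's transaction log.
table: Dict[Any, Dict[str, Any]] = {}
apply_changes(table, [
    {"op": "insert", "key": 1, "row": {"id": 1, "status": "new"}},
    {"op": "update", "key": 1, "row": {"id": 1, "status": "shipped"}},
    {"op": "delete", "key": 1},
])
print(table)  # {} -- the row was inserted, updated, then deleted
```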

On-Demand Webinar

Fanatics Ingests Streaming Data to a Data Lake on AWS

Fanatics, a popular sports apparel website and fan gear merchandiser, needed to ingest terabytes of data from multiple historical and streaming sources – transactional, e-commerce, and back-office systems – to a data lake on Amazon S3.
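
The case study does not include code, but the basic pattern of landing streaming records in an S3 data lake can be sketched with AWS's boto3 client and a Kinesis Data Firehose delivery stream configured with an S3 destination. The stream name, region and sample records below are illustrative assumptions, not details from the Fanatics implementation.

```python
# Sketch: sending streaming records toward an S3 data lake via Kinesis Data Firehose.
# The delivery stream (assumed to be configured with an Amazon S3 destination),
# the region and the sample records are assumptions for illustration.
import json
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")  # assumption

records = [
    {"order_id": 1, "source": "ecommerce", "amount": 59.99},
    {"order_id": 2, "source": "back-office", "amount": 120.00},
]

response = firehose.put_record_batch(
    DeliveryStreamName="fan-gear-orders",  # assumption
    Records=[{"Data": (json.dumps(r) + "\n").encode("utf-8")} for r in records],
)
print("Failed records:", response["FailedPutCount"])
```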