Data Integration Techniques

ETL, ELT, and reverse ETL data from any source to any destination. Stream data in real time. Use API integration and event-driven data integration to connect systems. We work with data in the cloud and on premises.

Choose the right technique for the task

Etlworks supports various data integration techniques, including ETL, ELT, reverse ETL, data streaming, API integration, and event-driven data integration. It works equally well with data in the cloud and on premises. Compare the available techniques and choose the right one for the task.

ETL

Extract, transform, load (ETL) is a three-phase process in which data is extracted from the source, transformed, and loaded into the destination. Etlworks supports any-to-any ETL: most of the connectors in Etlworks can be used as either a source or a destination.

When to use:
  • When you need to ETL data from any source to any destination
  • When you need to execute complex transformations
  • When you need to process sources (files, database tables) by a wildcard
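
For illustration, here is a minimal sketch of the ETL pattern in plain Python, not an Etlworks flow: an in-memory CSV stands in for the source connector, rows are cleaned in the transform step, and a SQLite table acts as the destination.

```python
import csv
import io
import sqlite3

# Hypothetical source data standing in for a CSV extracted from any connector.
SOURCE_CSV = """id,name,amount
1,Alice,10.50
2,Bob,7.25
3,Carol,
"""

def extract(text):
    """Extract: read rows from the source (here, an in-memory CSV)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: fix types and drop rows with a missing amount."""
    for row in rows:
        if row["amount"]:
            yield (int(row["id"]), row["name"].upper(), float(row["amount"]))

def load(records, conn):
    """Load: write the transformed records into the destination table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, name TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(SOURCE_CSV)), conn)
print(conn.execute("SELECT * FROM orders").fetchall())
```
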
ELT

Extract, load, transform (ELT) is a process in which the transformation step is moved to the end of the workflow: data is loaded into the destination immediately upon extraction and transformed there. Etlworks supports executing complex ELT scripts directly in the target database, which greatly improves the performance and reliability of data ingestion.

When to use:
  • When you need to efficiently transform data of any size or type
  • When you need to process structured and unstructured big data
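
A minimal sketch of the same idea in plain Python, not an Etlworks script: raw records are landed in a staging table as-is, and the transformation then runs as SQL inside the target database (SQLite standing in for a warehouse).

```python
import sqlite3

# Hypothetical raw records, landed straight from extraction with no cleanup.
RAW = [("1", "alice", "10.50"), ("2", "bob", "bad-value"), ("3", "carol", "7.25")]

conn = sqlite3.connect(":memory:")

# Load: write the data as-is into a staging table, no transformation yet.
conn.execute("CREATE TABLE staging (id TEXT, name TEXT, amount TEXT)")
conn.executemany("INSERT INTO staging VALUES (?, ?, ?)", RAW)

# Transform: run the transformation as SQL inside the target database,
# so the database engine does the heavy lifting.
conn.executescript("""
CREATE TABLE orders (id INTEGER, name TEXT, amount REAL);
INSERT INTO orders
SELECT CAST(id AS INTEGER), UPPER(name), CAST(amount AS REAL)
FROM staging
WHERE amount GLOB '[0-9]*';
""")
print(conn.execute("SELECT * FROM orders").fetchall())
```
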
Reverse ETL

Reverse ETL is the process of syncing data from a source of truth, such as a data warehouse, to a system of action, such as a CRM, an advertising platform, or another SaaS app, in order to operationalize the data. Etlworks supports reverse ETL from Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse Analytics, Greenplum, and any data warehouse built on top of a relational database.

When to use:
  • When you need to push data into more systems and get more value from your data
  • When you need to continuously sync data between different systems
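
A minimal sketch of the reverse ETL pattern: SQLite stands in for the warehouse, and the CRM endpoint (crm.example.com) is hypothetical; a real flow would make an authenticated POST to the SaaS app's API instead of printing the request.

```python
import json
import sqlite3

# Stand-in "warehouse": in practice this would be Snowflake, Redshift, BigQuery, etc.
wh = sqlite3.connect(":memory:")
wh.executescript("""
CREATE TABLE customer_ltv (email TEXT, lifetime_value REAL);
INSERT INTO customer_ltv VALUES ('a@example.com', 1200.0), ('b@example.com', 80.0);
""")

# Hypothetical CRM endpoint; the real URL and auth depend on the SaaS app.
CRM_URL = "https://crm.example.com/api/contacts"

def sync_to_crm(rows):
    """Map warehouse rows to the CRM's contact format and send them."""
    for email, ltv in rows:
        payload = {"email": email, "properties": {"lifetime_value": ltv}}
        # In a real flow this would be an authenticated HTTP POST to CRM_URL;
        # here we only print the request that would be sent.
        print("POST", CRM_URL, json.dumps(payload))

sync_to_crm(wh.execute("SELECT email, lifetime_value FROM customer_ltv"))
```
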
Streaming data integration

Streaming data integration is the continuous collection, in-stream processing, pipeline monitoring, and real-time delivery of data. Etlworks supports streaming data from and to message queues, CDC-enabled databases, MongoDB, and other data sources.

When to use:
  • When you need to collect and analyze information in real time
  • When the collected information is constantly changing (for example, a CDC stream)
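
A minimal sketch of in-stream processing in plain Python: a producer thread emits hypothetical change events the way a CDC connector or message queue would, and a consumer applies each event as it arrives instead of waiting for a batch.

```python
import json
import queue
import threading
import time

# Hypothetical change events, standing in for a CDC or message-queue stream.
EVENTS = [
    {"op": "insert", "table": "orders", "id": 1, "amount": 10.5},
    {"op": "update", "table": "orders", "id": 1, "amount": 12.0},
    {"op": "delete", "table": "orders", "id": 1},
]

stream = queue.Queue()

def producer():
    """Emit change events over time, like a CDC connector would."""
    for event in EVENTS:
        stream.put(event)
        time.sleep(0.1)
    stream.put(None)  # end-of-stream marker for this demo

def consumer():
    """Process each event as it arrives instead of waiting for a batch."""
    current_state = {}
    while (event := stream.get()) is not None:
        if event["op"] == "delete":
            current_state.pop(event["id"], None)
        else:
            current_state[event["id"]] = event["amount"]
        print("applied", json.dumps(event), "state:", current_state)

threading.Thread(target=producer, daemon=True).start()
consumer()
```
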
API integration

API integration is the connection between two or more applications via their APIs, allowing the systems to exchange data. Etlworks can connect to any REST, SOAP, or GraphQL API.

When to use:
  • When you need to collect data from APIs
  • When you need to send data to third-party APIs
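
A minimal sketch of pulling data from a paginated REST API, assuming a hypothetical endpoint (api.example.com) whose responses carry an "items" list and a "next" link; an offline stub stands in for the network call so the example runs anywhere.

```python
import json
import urllib.request

API_URL = "https://api.example.com/v1/contacts"  # hypothetical REST endpoint

def fetch_page(url, token):
    """GET one page of JSON from a REST API (real network call)."""
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def fetch_all(first_url, token, fetch=fetch_page):
    """Follow 'next' links until the API reports no more pages."""
    url = first_url
    while url:
        page = fetch(url, token)
        yield from page["items"]
        url = page.get("next")

# Offline demo: a stub replaces the network call so the sketch runs anywhere.
def stub_fetch(url, token):
    return {"items": [{"id": 1}, {"id": 2}], "next": None}

for record in fetch_all(API_URL, token="dummy", fetch=stub_fetch):
    print(record)
```
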
Event-driven data integration

Event-driven data integrations are triggered by an event in one system, and they trigger a predefined corresponding event in another. Etlworks supports triggering data integration flows by inbound HTTP requests from third-party systems.

When to use:
  • When you need to enrich data from multiple sources and expose it to third-party systems
  • When you need to ETL data sent by a third-party system into any destination
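
A minimal sketch of the event-driven pattern using Python's built-in HTTP server: an inbound POST from a third-party system triggers a placeholder flow. This illustrates the general webhook pattern, not the Etlworks trigger mechanism itself.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def run_flow(payload):
    """Placeholder for the data integration flow triggered by the event."""
    print("flow triggered with", payload)

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the event body sent by the third-party system.
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        run_flow(payload)        # an event in one system triggers the flow here
        self.send_response(202)  # acknowledge; processing may continue asynchronously
        self.end_headers()

if __name__ == "__main__":
    # POST a JSON body to http://localhost:8080/ to trigger the flow.
    HTTPServer(("localhost", 8080), WebhookHandler).serve_forever()
```
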
Working with on-premises data

Etlworks uses the Integration Agent to access on-premises applications and databases. The Integration Agent is a zero-maintenance, easy-to-configure, fully autonomous ETL engine that runs as a background service behind the company's firewall. It can be installed on Windows and Linux.

When to use:
  • When you need to run data integration flows that require access to on-premises data behind the firewall
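
For context, a common pattern for agents that run behind a firewall is to make only outbound connections to the cloud service and pull work from it, so no inbound firewall ports need to be opened. The sketch below shows that general pattern only; the URLs and job format are hypothetical and do not describe the actual Etlworks agent protocol.

```python
import time

# Hypothetical control plane and local database; only the outbound-only
# polling pattern is illustrated here.
CONTROL_PLANE_URL = "https://etl-cloud.example.com/api/agent/jobs"
LOCAL_DB = "postgresql://db.internal:5432/erp"  # reachable only inside the firewall

def poll_for_jobs():
    """Stand-in for an outbound HTTPS call asking the cloud for pending work."""
    return [{"flow": "sync_erp_orders", "source": LOCAL_DB}]

def run_locally(job):
    """The agent runs the flow next to the on-premises data, behind the firewall."""
    print("running", job["flow"], "against", job["source"])

def agent_loop(iterations=3):
    # Only outbound connections are made, so no inbound ports are exposed.
    for _ in range(iterations):
        for job in poll_for_jobs():
            run_locally(job)
        time.sleep(1)

agent_loop()
```
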
