Matillion ETL is used for data loading. It extracts data from various sources, stages it in a data warehouse environment, and then performs orchestration and transformation jobs to automate processes across different layers of the data warehouse.
We use it primarily for transferring data into our cloud data warehouse and conducting research. We rely on Matillion ETL for our data integration and transformation needs, finding its user-friendly interface and robust capabilities highly effective. Essentially, we've built a cloud-based data platform using Matillion ETL to seamlessly extract data from various sources, perform necessary transformations, and store it in our cloud environment. In our finance-focused projects, we predominantly utilize Matillion for data warehousing tasks, particularly in the realms of finance and micro-lending. While our projects may not involve big data volumes, they often entail handling intricate datasets that require sophisticated processing.
My primary use case involves handling standard ETL tasks. I work on processing both our company's data and third-party data sources. While I focus on these standard ETL tasks, my colleagues excel in more advanced pipeline work.
We primarily use Matillion ETL to manage the movement, ingestion, and transformation of data through pipelines. We have specific use cases that involve different types of data, but they all fall under the general bracket of data movement.
While loading data into Snowflake, I encountered an issue with the key due to the file's large size and a record count in the billions. Loading the data with a Python script was taking a long time, so I decided to explore other options. This is when I discovered Matillion ETL, which I had not heard of before. I learned more about it and used some of its features, including the Matillion Data Loader, to load the data into Snowflake. Using Matillion ETL, I was able to load around 770 million records in just five to ten minutes. This was a successful use case, and I have also used Matillion ETL for loading data from Amazon S3 to Snowflake and for other data-loading tasks, including connectivity to on-premises servers and different cloud platforms. I have used on-premises and cloud deployments of this solution.
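The speedup the reviewer describes comes from Snowflake's bulk `COPY INTO` path, which loads staged S3 files in parallel instead of inserting rows one at a time from a Python script. A minimal sketch of building such a statement is below; the table, stage, and file-format names are hypothetical examples, not taken from the review.

```python
# Sketch of the kind of bulk-load statement a tool like Matillion issues when
# copying staged S3 files into Snowflake. All object names here are
# hypothetical placeholders.

def build_copy_statement(table: str, stage: str, file_format: str) -> str:
    """Build a Snowflake COPY INTO statement for files in an external stage."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage} "
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}') "
        f"ON_ERROR = 'ABORT_STATEMENT'"
    )

sql = build_copy_statement("sales.orders", "s3_landing_stage", "csv_gzip")
print(sql)
```

In practice this string would be executed through a Snowflake session (for example, via the `snowflake-connector-python` package); the point is that one set-based `COPY INTO` over staged files replaces millions of individual `INSERT` round trips.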
Director of IT Operations at Broadridge Financial Solutions, Inc.
Real User
Top 5
Feb 21, 2023
We have some very unique use cases for the solution. We are interfacing between an on-premises database and a cloud database: Oracle and Snowflake. It is a very complex process, and we had to ask Matillion's engineers for help to build it out. We look up information in our on-premises database servers to build our cloud database. Once we get everything copied into Snowflake, we go back to Oracle on-premises. We have created this bridge to use until we switch to AWS full-time, but right now that move isn't on the books. So the best way forward is to take a solution that can bridge that gap, and that's Matillion.
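A bridge like the Oracle-to-Snowflake one described above is commonly built around watermark-based incremental extraction: each run copies only rows newer than the last recorded high-water mark. The sketch below illustrates that pattern under stated assumptions; `sqlite3` stands in for both databases, and all table and column names are hypothetical.

```python
import sqlite3

# Watermark-based incremental sync sketch. sqlite3 is a stand-in for the
# Oracle source and Snowflake target; table/column names are hypothetical.

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

source.execute("CREATE TABLE orders (id INTEGER, updated_at TEXT)")
source.executemany("INSERT INTO orders VALUES (?, ?)",
                   [(1, "2023-01-01"), (2, "2023-02-01"), (3, "2023-03-01")])

target.execute("CREATE TABLE orders (id INTEGER, updated_at TEXT)")
target.execute("CREATE TABLE watermarks (table_name TEXT, high_water TEXT)")
target.execute("INSERT INTO watermarks VALUES ('orders', '2023-01-15')")

def incremental_sync(src, tgt, table):
    """Copy only rows newer than the stored high-water mark, then advance it."""
    (mark,) = tgt.execute(
        "SELECT high_water FROM watermarks WHERE table_name = ?", (table,)
    ).fetchone()
    rows = src.execute(
        f"SELECT id, updated_at FROM {table} WHERE updated_at > ?", (mark,)
    ).fetchall()
    tgt.executemany(f"INSERT INTO {table} VALUES (?, ?)", rows)
    if rows:
        tgt.execute("UPDATE watermarks SET high_water = ? WHERE table_name = ?",
                    (max(r[1] for r in rows), table))
    return len(rows)

copied = incremental_sync(source, target, "orders")
print(copied)  # number of rows newer than the watermark
```

Re-running the function after the watermark has advanced copies nothing, which is what makes the bridge safe to schedule repeatedly while both databases stay live.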
Data Architect at Old Mutual Life Assurance Company (South Africa) Limited
Real User
Sep 6, 2022
It is cloud native and designed to run on cloud warehouses. It is compatible with many of the cloud data warehouses, including Snowflake and any SQL data warehouse, but it is not compatible with other ETL products.
Data & Analytics Practitioner (BIDW, Big Data) at Tech Mahindra Limited
Real User
Dec 27, 2021
I am using Matillion ETL for Snowflake. We are doing a migration project from SAP BODS to Matillion for Snowflake, migrating all the BODS data flows and workflows to Matillion jobs.
Managing Director at a tech services company with 51-200 employees
Real User
Feb 15, 2021
We are a consulting company and provide services to our customers. You can use this solution for every use case for which you would use an on-premises solution. You can use it for data warehousing in the cloud, and with cloud databases such as Snowflake or Synapse. We are using its latest version.
Matillion ETL is a powerful tool for extracting, transforming, and loading large amounts of data from various sources into cloud data warehouses like Snowflake. Its ability to load data dynamically and efficiently using metadata is a standout feature, as is its open-source ETL with good performance and high efficiency.
The solution has a graphical interface for jobs, is easily adjustable and extensible, and allows for scheduling and error reporting. Matillion ETL has helped...
The solution is used for data ingestion from multiple sources into Snowflake, as well as for extract and load.
We use the solution to make data transfers between our source systems and Snowflake. It's our data analytics architecture.
We are using Matillion ETL for extracting and integrating data from different applications and data sources, such as SQL databases.
We're populating a data warehouse, which is the primary focus, but we are also using it to populate our data lake with unstructured data.
* ETL process
* Data warehouse
Bringing data from different sources into our Snowflake data warehouse.
We use it to migrate data from in-house databases and other data sources into Amazon Redshift.
We use it for archiving and storing data.
Our primary use case is ETL.