Director - Emerging Technologies at Speridian Technologies
Real User
Top 20
2024-07-18T13:09:00Z
Jul 18, 2024
Azure Data Factory is primarily used to orchestrate workflows and move data between various sources. It supports both ETL and ELT processes. For instance, if you have an ERP system and want to make the data available for reporting in a data lake or data warehouse, you can use Data Factory to extract data from the ERP system as well as from other sources, like CRM systems. Data Factory allows you to pull data from multiple systems, transform it according to your business needs, and load it into a data warehouse or data lake. It also supports complex data transformations and aggregations, enabling you to generate summary and aggregate reports from the combined data. Data Factory helps you ingest data from diverse sources, perform necessary transformations, and prepare it for reporting and analysis.
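To make that workflow concrete: the copy step the reviewer describes is authored in Data Factory as pipeline JSON. Below is a minimal, hypothetical sketch of one such copy activity, built as a Python dict for illustration; the pipeline name, dataset references, and source/sink types are placeholder assumptions, not details from the review.

import json

# Hypothetical sketch of an ADF pipeline with a single Copy activity that
# moves an ERP table into a data lake as Parquet. Dataset names are
# placeholders; in a real factory they point at linked services you define.
copy_pipeline = {
    "name": "CopyErpToLake",
    "properties": {
        "activities": [
            {
                "name": "CopyErpOrders",
                "type": "Copy",
                "inputs": [
                    {"referenceName": "ErpOrdersTable", "type": "DatasetReference"}
                ],
                "outputs": [
                    {"referenceName": "LakeOrdersParquet", "type": "DatasetReference"}
                ],
                "typeProperties": {
                    # Source/sink types must match the datasets' linked services.
                    "source": {"type": "SqlSource"},
                    "sink": {"type": "ParquetSink"},
                },
            }
        ]
    },
}

# Print the definition; it could equally be deployed via the ADF UI,
# the REST API, or an ARM template.
print(json.dumps(copy_pipeline, indent=2))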
My main use case for Azure Data Factory is to pull data from on-premises systems. Most data transformation is done through Databricks, but Data Factory mainly pulls data into different services.
I can describe a scenario where I was tasked with developing a data ingestion and integration system. Azure Data Factory served as the data integration platform responsible for creating pipelines to extract data from various sources, including on-premises and cloud-based systems. Alongside Data Factory, we utilized Azure Logic Apps for orchestration and Azure Key Vault for securely storing connection secrets and other sensitive information. This combination enabled us to manage data extraction, transformation, and loading efficiently. The solution also involved Azure Data Lake for further data transformations, culminating in a comprehensive data processing engine that I played a key role in implementing.
Senior DevOps Consultant (CPE India Delivery Lead) at a computer software company with 201-500 employees
Real User
Top 20
2024-03-13T08:10:00Z
Mar 13, 2024
Azure Data Factory is an all-in-one solution for ETL in our company. My company doesn't use the product for development purposes. I use the solution in my company as an ETL tool and for orchestration.
Data Governance/Data Engineering Manager at National Bank of Fujairah PJSC
Real User
Top 10
2024-03-06T16:25:15Z
Mar 6, 2024
We mainly use it to migrate data from on-premises sources, such as Oracle and Cisco servers, to the cloud. It's a good solution for integrations within the Azure environment, and it connects well with other Azure data products. However, for external configurations, we use Informatica Cloud or Informatica Data Accelerator (IDA). For automation, we primarily rely on Snowflake and Informatica. Our strategy is not to depend on a single tool. When it's strictly on-premises to cloud, we use ADF. Otherwise, Informatica is more mature and integrates well with various third-party products. We also use Snowflake copy commands to load data into Snowflake. Azure Data Factory doesn't fully meet our automation requirements. We use Informatica for pipelines that were originally in SSIS, but for new pipelines and ETL processes, we choose either Informatica or Snowflake scripts.
My task involves extracting data from a source, performing necessary transformations, and subsequently loading the data into a target destination, which happens to be Azure SQL Database.
Senior Data Engineer at a photography company with 11-50 employees
Real User
Top 5
2023-07-17T20:50:00Z
Jul 17, 2023
In my company, we use Azure Data Factory for everything related to data warehousing. Depending on what my customer wants, I will use SSIS or Azure Data Factory. If my customers want Fivetran, I will use it for them. If a customer wants a suggestion from me on what they should use, I look at what they have today and what their skills are. Based on the inputs I receive from my customers, I recommend what makes more sense for that particular customer. You could call me software-agnostic.
I primarily use Data Factory to ingest data. For example, we pull data into our data warehouse from sources like Azure Event Hubs or salesforce.com.
Data Strategist, Cloud Solutions Architect at BiTQ
Real User
Top 5
2022-12-23T08:54:07Z
Dec 23, 2022
Our primary use case is traditional ETL: moving data from the web and other data sources to a data warehouse. I've also used Data Factory for batch processing and, recently, for streaming data sets. We have a partnership with Microsoft, and I am a cloud solution architect.
CTO at a construction company with 1,001-5,000 employees
Real User
Top 20
2022-12-22T07:08:44Z
Dec 22, 2022
Our company uses the solution as a data pipeline. We get information from outside the cloud, from our factory, such as data relating to production. We categorize it, clean it up, and transfer it to a database and data model. From there, we analyze the data using BI and other tools. We gather information in data lake products like Microsoft Synapse and Microsoft Data Lake. We have two to three administrators who use the solution in a quite standard, mainstream way, with nothing extreme. They handle administration, security, and development. It is difficult to define the total number of users because that depends on the number of Data Factory agents. We built the solution to have a different Data Factory agent for every customer. For example, if we have ten customers, then we have ten users. We hope to increase usage, but growth depends on our marketing efforts and how well we sell our products.
Senior Consultant at a computer software company with 1,001-5,000 employees
Consultant
Top 20
2022-11-25T11:59:02Z
Nov 25, 2022
Our primary use case is mainly ETL: transforming the data and then using it for Power BI. We are handling multiple projects with it; it is not just a single thing, but in the end it is mainly used for data analytics.
Engineering Manager at an energy/utilities company with 10,001+ employees
Real User
2022-10-11T14:23:14Z
Oct 11, 2022
We use this solution to ingest data from one of our source systems, SAP. From the SAP HANA view, we push data to our data pond and ingest it into our data warehouse.
The current use is extracting data from Google Analytics into Azure SQL Database as a source for our EDW. Extracting from GA was problematic with SSIS. The larger use case is to assess the viability of the tool for broader use in our organization: as a replacement for SSIS with our EDW, and as an orchestration agent to replace SQL Server Agent for firing SSIS packages. The initial rollout was to solve the immediate problem while assessing the tool's suitability for other purposes within the organization, and also to establish the development and administration pipeline process.
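For illustration, a schedule trigger is the Data Factory counterpart of a SQL Server Agent job schedule. Here is a minimal, hypothetical sketch of such a trigger's JSON, expressed as a Python dict; the trigger and pipeline names are placeholder assumptions, not details from this deployment.

import json

# Hypothetical sketch: an ADF schedule trigger standing in for a nightly
# SQL Server Agent job. It fires the referenced pipeline once a day at 02:00 UTC.
nightly_trigger = {
    "name": "NightlyGaLoadTrigger",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",
                "interval": 1,
                "startTime": "2024-01-01T02:00:00Z",
                "timeZone": "UTC",
            }
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "CopyGaToAzureSqlDb",
                    "type": "PipelineReference",
                }
            }
        ],
    },
}

print(json.dumps(nightly_trigger, indent=2))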
We use Data Factory for automating ETL processes, data management, digital transformation, and scheduled automated processes. My team has about 11 people, and at least five use Data Factory. It's mostly data engineers and analysts. Each data analyst and engineer manages a few projects for clients. Typically, it's one person per client, but we might have two or three people managing and building out pipelines for a larger project.
Azure Data Factory allows us to provide BI service. We pull the data and put it into Synapse. From there, we create our dimension fact tables that are being used for reporting.
I am a manager of a team that uses this solution. Azure Data Factory is primarily used for data integration, which involves moving data from sources into a data lakehouse built on Delta Lake.
IT Functional Analyst at an energy/utilities company with 1,001-5,000 employees
Real User
2022-03-31T20:05:00Z
Mar 31, 2022
We are currently using it as an ETL (Extract, Transform, and Load) tool: we connect to various information providers, or to various sources in general, extract data, and then insert it into our storage devices, databases, or data warehouses.
Lead BI&A Consultant at a computer software company with 10,001+ employees
Real User
2021-08-31T13:03:00Z
Aug 31, 2021
We had an old, traditional data warehouse. We decided to put it into the cloud, and we used Azure Data Factory to rework the ETL process from SQL Server Integration Services and extract the data.
Our customers use it for data analytics on large volumes of data. They're bringing data in from multiple sources and doing ETL: extraction, transformation, and loading. Then they do initial analytics, populate a data lake, and after that, they take the data from the data lake into more complex on-premises analytics. The version depends on a customer's environment. Sometimes we use the latest version, and sometimes we use previous versions.
Principal Engineer at a computer software company with 501-1,000 employees
Real User
2021-05-17T14:02:46Z
May 17, 2021
We primarily used this solution for getting data from a client's server, or online data, to an Azure Data Lake. We create pipelines to orchestrate the data flow from source to target.
.NET Architect at a computer software company with 10,001+ employees
Real User
2021-04-17T15:16:14Z
Apr 17, 2021
I use Azure Data Factory in my company because we are implementing a lot of different projects for a big company based in the USA. We're getting certain information from different sources, for example, files in Azure Blob Storage. We're migrating that information to other databases, and we are validating and transforming the data. After that, we put the data into databases in Azure Synapse and Azure SQL.
General Manager Data & Analytics at a tech services company with 1,001-5,000 employees
Real User
2021-03-10T08:56:59Z
Mar 10, 2021
The solution is primarily used for data integration. We are using it for the data pipelines to get data out of the legacy systems and provide it to the Azure SQL Database. We are using the SQL data source providers mainly.
Senior Manager at a tech services company with 51-200 employees
Real User
2021-02-14T15:56:02Z
Feb 14, 2021
My primary use case is getting data from sensors. The sensors are installed on various equipment across the plant, and they give us a huge amount of data; some of it is captured on a millisecond basis. We are able to feed that data into Azure Data Factory, and it has allowed us to scale up well. We are able to utilize that data for predictive maintenance of the equipment and for predicting breakdowns. Specifically, we use the data to look at predictions for future possible breakdowns. At least, that is what we are looking to build towards.
We are not using this product specifically as a data factory. We have taken Synapse Analytics as the entire component for the data warehousing solution. Azure Data Factory is one of the components of that, and we are using it for ETL.
Director at a tech services company with 1-10 employees
Real User
2020-12-09T10:31:00Z
Dec 9, 2020
Azure Data Factory is for data transformation and data loading. It works from your transaction systems; we are using it for our HRMS (Human Resource Management System). It picks up all the transactional data and moves it into the Azure Data Warehouse. From there, we would like to create reports on our financial position and our resource utilization by project. These are the reports that we need to build on the warehouse. The purpose of Azure Data Factory is more about transformations, so it doesn't need to have a good dashboard, but it has a fitting user interface for us to do our activities and debug actions. I think that's good enough.
Business Unit Manager Data Migration and Integration at a tech services company with 201-500 employees
Real User
2020-10-21T11:46:00Z
Oct 21, 2020
We use this solution for data integration. We use it to feed operational data into a data warehouse. We also use it for creating connections between applications. Within our organization, there are a few thousand users of Azure Data Factory. We believe that the number of customers and the usage of this product will grow over the next few years. For this reason, we invest a lot of resources in building skills, and we make sure to hire consultants who know their way around Data Factory.
The primary use case is integrating data from different ERP systems and loading it into Azure Synapse for reporting. We use Power BI for the reporting side of it. We also have customers who are migrating to Azure Data Factory and we are assisting them with making the transition.
CTO at a construction company with 1,001-5,000 employees
Real User
Top 20
2020-08-19T07:57:30Z
Aug 19, 2020
We are using this solution to gather information from SCADA systems, analyze it using AI and machine learning, and then send the results to our users. They receive and view the data using the Power BI interface.
Delivery Manager at a tech services company with 1,001-5,000 employees
Real User
2020-01-12T12:03:00Z
Jan 12, 2020
We are a tech services company and this is one of the tools that we use when implementing solutions for our clients. I am currently managing a team that is working with the Azure Data Factory. Our clients that use this solution are migrating their data from on-premises to the cloud. One of our clients is building an integrated data warehouse for all of their data, using this solution. It is used to extract all of the data from different servers and store it into one place.
Azure Technical Architect at a computer software company with 10,001+ employees
Real User
2019-12-31T09:39:00Z
Dec 31, 2019
It's an integration platform; we migrate data across hybrid environments. We have data in our cloud environment and in on-prem systems, so we use it when we want to integrate data across different environments. Getting data from different hybrid environments used to be a problem for us.
Sr. Technology Architect at Larsen & Toubro Infotech Ltd.
Real User
2019-12-09T10:58:00Z
Dec 9, 2019
There was a need to bring in a lot of CRM and marketing data for some P&L analysis. We are connecting to the Salesforce cloud; within it, there's a specific Salesforce Core CRM solution for the pharmaceutical industry. We are using the solution to connect to that, and we are bringing in the various dimensions and transactions from that data source.
Principal Consultant at a tech services company with 11-50 employees
Real User
2019-07-29T10:11:00Z
Jul 29, 2019
We are working on a data warehouse integration, which means that I am working on some big data projects. I'm preparing data for licensing. One of the projects involves preparing data in Azure Data Lake, running some transformation scripts, performing some ETL processing, and populating the staging layer of the data warehouse. In other words, I help with ETL use cases.
We used Azure Data Factory, Data Flow (private preview), and Databricks to develop data integration processes from multiple and varied external software sources to an OLTP application's Azure SQL database. The tools are impressively well integrated, allowing quick development of ETL, big data, data warehousing, and machine learning solutions, with the flexibility to grow and adapt to changing or enhanced requirements. I can't recommend it highly enough.
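As an illustration of that ADF-plus-Databricks pattern, here is a minimal, hypothetical sketch of a DatabricksNotebook activity that runs a transformation after an upstream copy step. The linked-service name, notebook path, and upstream activity name are placeholder assumptions, not details from the review.

import json

# Hypothetical sketch: an ADF activity that runs a Databricks notebook once
# the upstream "CopyRawData" activity succeeds. The linked service must point
# at a Databricks workspace configured in the factory.
transform_activity = {
    "name": "RunDatabricksTransform",
    "type": "DatabricksNotebook",
    "dependsOn": [
        {"activity": "CopyRawData", "dependencyConditions": ["Succeeded"]}
    ],
    "linkedServiceName": {
        "referenceName": "AzureDatabricksLs",
        "type": "LinkedServiceReference",
    },
    "typeProperties": {"notebookPath": "/etl/transform_orders"},
}

print(json.dumps(transform_activity, indent=2))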
Azure Data Factory efficiently manages and integrates data from various sources, enabling seamless movement and transformation across platforms. Its valuable features include seamless integration with Azure services, handling large data volumes, flexible transformation, user-friendly interface, extensive connectors, and scalability. Users have experienced improved team performance, workflow simplification, enhanced collaboration, streamlined processes, and boosted productivity.
We use the solution for building a few warehouses using Microsoft services.
The platform simplifies data access and visualization with minimal coding, catering to various data management needs across different client projects.
We use the product for data warehouses. It helps us to load data to warehouses.
We primarily use the solution for data warehousing. I'm integrating data from multiple sources.
The primary use case is to connect to various different data sets and do ELT into our data warehouse.
We primarily use the solution in a data engineering context for bringing data from source to sink.
Our primary use case for the solution is data integration and we deploy it only on Azure.
The primary use case of this solution is to extract, transform, and load (ETL) data, and to organize database synchronization.
I primarily use Data Factory for data ingestion and B2B transformation.
We mainly use this solution to carry out data movement and transformation.
Azure Data Factory is an integration tool, an orchestration service tool. It’s for data integration for the cloud.
We use Azure Data Factory for data transformation, normalization, bulk uploads, data stores, and other ETL-related tasks.
Depending on their pipeline, our customers use Azure Data Factory for their ELT or ETL transformation processes.
We use this solution to perform ELTs so that we do not need to keep code within a database.
My primary use case of Azure Data Factory is supporting the data migration for advanced analytics projects.
The primary use case of this solution is for data integration.
I use this primarily for ETL tasks.
I primarily use the solution for my small and medium-sized clients.
The use cases are mostly related to logistics, finance, and back-office activity.
Our primary use case is data loading.