Solution Architect at a computer software company with 1,001-5,000 employees
Real User
Top 20
Mar 19, 2024
The tool's most valuable features are its connectors. It has many out-of-the-box connectors. We use ADF for ETL processes. Our main use case involves integrating data from various databases, processing it, and loading it into the target database. ADF plays a crucial role in orchestrating these ETL workflows.
The workflow automation features in Azure Data Factory, particularly its low-code/no-code approach, are highly beneficial for accelerating development speed. This approach allows for quick creation of pipelines and offers customization options for integration needs, making it versatile for various use cases. Azure Data Factory supports a wide range of connectors, catering to the majority of integration needs. As for Azure Data Factory's monitoring capabilities, the visual interface makes the tool user-friendly and easy to teach, facilitating adoption within teams. While the monitoring capabilities are sufficient out of the box, they may not be as comprehensive as dedicated enterprise monitoring tools. The monitoring features are manageable for production use, with the option to integrate Log Analytics or create custom dashboards if needed.
The data flow feature in Azure Data Factory is valuable for data transformation tasks, especially for those who may not have expertise in writing complex code. It simplifies data manipulation and is particularly useful for individuals unfamiliar with Spark coding. While there is room for more flexibility, overall, the data flow feature effectively accomplishes its purpose within the Azure ecosystem.
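For readers unfamiliar with ADF, the orchestration pipelines described above are defined as JSON documents behind the visual designer. Below is a minimal sketch of a pipeline with a single Copy activity, built as a plain Python dict; all names (CopyFromStaging, SourceDataset, TargetDataset) are hypothetical placeholders, not details from the review:

```python
import json

# Minimal sketch of an ADF pipeline definition with one Copy activity.
# All names here (CopyFromStaging, SourceDataset, TargetDataset) are
# hypothetical placeholders for illustration only.
pipeline = {
    "name": "CopyFromStaging",
    "properties": {
        "activities": [
            {
                "name": "CopySourceToTarget",
                "type": "Copy",
                # Datasets describe the shape and location of source/target data.
                "inputs": [{"referenceName": "SourceDataset", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "TargetDataset", "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "AzureSqlSource"},
                    "sink": {"type": "AzureSqlSink"},
                },
            }
        ]
    },
}

print(json.dumps(pipeline, indent=2))
```

In practice this JSON is authored through the visual designer and deployed via the Azure Portal, ARM templates, or the SDKs; the dict above just shows the shape of what the designer generates.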
Data Governance/Data Engineering Manager at National Bank of Fujairah PJSC
Real User
Top 10
Mar 6, 2024
Its integration with SQL pools, its ability to work with Databricks, its pipelines, and its serverless architecture are the most effective features.
Integration Solutions Lead | Digital Core Transformation Service Line at Hexaware Technologies Limited
Vendor
Mar 2, 2023
In my view, ADF is one of the best tools for data-driven integrations and workflows to orchestrate data movement in hybrid (on-premises and cloud) as well as cloud-only environments.
Here are the features I like most:
Security: ADF's in-transit encryption of data between on-premises and cloud sources makes it one of the most secure platforms for data integration.
Scalability: ADF is designed to handle large volumes of data thanks to its built-in parallelism and time-slicing features. You can move gigabytes of data into the cloud in a matter of hours.
Low/no coding: The graphical flow development in ADF V2 helps you create components from the Azure Portal interactively, without much coding.
Pricing: ADF's pay-as-you-go pricing with no upfront cost makes it one of the most cost-efficient data integration solutions. Not to mention, you can always disable a particular flow when it is not in use to save costs.
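The pay-as-you-go point can be made concrete with a back-of-the-envelope estimate. ADF bills orchestration per activity run and data movement per DIU-hour; the rates in this sketch are illustrative assumptions, not Azure's current list prices:

```python
# Illustrative ADF cost estimate. The rates below are assumptions for
# demonstration only, not Azure's current list prices.
ACTIVITY_RUN_RATE = 0.001   # assumed $ per activity run (orchestration)
DIU_HOUR_RATE = 0.25        # assumed $ per DIU-hour (data movement)

def monthly_cost(runs_per_day: int, diu: int, hours_per_run: float, days: int = 30) -> float:
    """Estimate the monthly cost of a single copy pipeline."""
    orchestration = runs_per_day * days * ACTIVITY_RUN_RATE
    movement = runs_per_day * days * diu * hours_per_run * DIU_HOUR_RATE
    return round(orchestration + movement, 2)

# A nightly copy using 4 DIUs for 30 minutes:
print(monthly_cost(runs_per_day=1, diu=4, hours_per_run=0.5))  # → 15.03
```

Because there is no upfront cost, a pipeline that is disabled simply stops accruing the per-run and DIU-hour charges, which is what makes pausing unused flows an effective cost lever.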
It's cloud-based, allowing multiple users to easily access the solution from the office or remote locations. I like that we can set up the security protocols for IP addresses, like allow lists. It's a pretty user-friendly product as well. The interface and build environment where you create pipelines are easy to use. It's straightforward to manage the digital transformation pipelines we build.
It is very modular. It works well. We've used Data Factory and then made calls to libraries outside of Data Factory to do things that it wasn't optimized to do, and it worked really well. It is obviously proprietary, in that Microsoft created it, but it is pretty easy and direct to bring outside capabilities into Data Factory.
.NET Architect at a computer software company with 10,001+ employees
Real User
Apr 17, 2021
I think it makes it very easy to understand what data flow is and so on. You can leverage the user interface to do the different data flows, and it's great. I like it a lot.
Director at a tech services company with 1-10 employees
Real User
Dec 9, 2020
Azure Data Factory's most valuable features are the packages and the data transformation that it allows us to do, which is more drag and drop, or a visual interface. So, that eases the entire process.
Azure Technical Architect at Hexaware Technologies Limited
Vendor
Dec 31, 2019
From my experience so far, the best feature is the ability to copy data to any environment. We have 100 connectors, and we can connect them to the system and copy the data from the respective system to any environment. That is the best feature.
Sr. Technology Architect at Larsen & Toubro Infotech Ltd.
Real User
Dec 9, 2019
Data Factory itself is great. It's pretty straightforward. You can easily add sources, join and lookup information, etc. The ease of use is pretty good.
This solution will allow the organisation to readily improve its existing data offerings over time by adding predictive analytics, data sharing via APIs, and other enhancements.
Azure Data Factory efficiently manages and integrates data from various sources, enabling seamless movement and transformation across platforms. Its valuable features include seamless integration with Azure services, handling large data volumes, flexible transformation, user-friendly interface, extensive connectors, and scalability. Users have experienced improved team performance, workflow simplification, enhanced collaboration, streamlined processes, and boosted productivity.
The scalability of the product is impressive.
The most valuable aspect is the copy capability.
I can do everything I want with SSIS and Azure Data Factory.
It makes it easy to collect data from different sources.
We have been using drivers to connect to various data sets and consume data.
Data Factory's best features are simplicity and flexibility.
The UI is easy to navigate, and I can retrieve VTL code without in-depth knowledge of coding languages.
The Data Factory agent is quite good, and programming or defining the values of jobs, processes, and activities is easy.
We haven't had any issues connecting it to other products.
I am one hundred percent happy with the stability.
Nothing to install on the client side and lots of adapters built in.
You do NOT want to ask what I don't like about it ;-)
An excellent tool for pipeline orchestration.
We have found the bulk load feature very valuable.
The solution includes a feature that increases the number of processors used, which makes it very powerful and adds to its scalability.
Azure Data Factory became more user-friendly when data-flows were introduced.
Data Factory's best features include its data source connections, GUI for building data pipelines, and target loading within Azure.
Data Factory's best features are connectivity with different tools and focusing data ingestion using pipeline copy data.
This solution has provided us with an easier, and more efficient way to carry out data migration tasks.
The trigger scheduling options are decently robust.
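As context for the scheduling comment above, ADF triggers are also defined as JSON behind the UI. Below is a minimal sketch of a schedule trigger that fires a pipeline daily at 02:00 UTC; the trigger and pipeline names are hypothetical placeholders:

```python
import json

# Minimal sketch of an ADF schedule trigger. The names (DailyAt0200,
# CopyFromStaging) are hypothetical placeholders for illustration only.
trigger = {
    "name": "DailyAt0200",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",
                "interval": 1,
                "startTime": "2024-01-01T02:00:00Z",
                "timeZone": "UTC",
                # Fire at 02:00 each day.
                "schedule": {"hours": [2], "minutes": [0]},
            }
        },
        # One trigger can start one or more pipelines.
        "pipelines": [
            {"pipelineReference": {"referenceName": "CopyFromStaging", "type": "PipelineReference"}}
        ],
    },
}

print(json.dumps(trigger, indent=2))
```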
The solution is okay.
The most valuable feature of Azure Data Factory is that it has a good combination of flexibility, fine-tuning, automation, and good monitoring.
I enjoy the ease of use for the backend JSON generator, the deployment solution, and the template management.
I like that it's a monolithic data platform. This is why we propose these solutions.
Microsoft supported us when we planned to provision Azure Data Factory over a private link; the support we received was excellent.
The most valuable feature of this solution would be ease of use.
The two most valuable features of Azure Data Factory are that it's very scalable and that it's also highly reliable.
The overall performance is quite good.
It's extremely consistent.
In terms of my personal experience, it works fine.
The most valuable feature is the copy activity.
The initial setup is very quick and easy.
The solution can scale very easily.
The best part of this product is the extraction, transformation, and load.
It has built-in connectors for more than 100 sources, enabling onboarding of data from many different sources to the cloud environment.
The flexibility that Azure Data Factory offers is great.
It is easy to integrate.
The security of the agent that is installed on-premises is very good.
The most valuable feature is the ease in which you can create an ETL pipeline.
On the tool itself, we've never experienced any bugs or glitches. There haven't been crashes. Stability has been good.
It is easy to deploy workflows and schedule jobs.
It is a complete ETL Solution.
Powerful but easy-to-use and intuitive.
The most valuable features are data transformations.
From what we have seen so far, the solution seems very stable.
The user interface is very good. It makes me feel very comfortable when I am using the tool.
The solution has a good interface and the integration with GitHub is very useful.