Data Integration Engineer at a retailer with 51-200 employees
Real User
Oct 17, 2024
My customers use Denodo for data integration. Before using Denodo, they did not have a single platform from which to control all of their data sources. Denodo allows them to connect different data sources for analysis and reporting.
I use the tool to integrate and standardize data. It converts all kinds of data. It can convert non-relational data to relational data. It can create a standardized way to consume data.
I use the solution in my company as a virtualization tool. The tool helps users connect to many databases and resources in the data lake. With the tool, I can create views for the front end of Tableau and other BI tools. I combine data from various sources to create views, which are also useful for analysis purposes. On top of the views I create with the tool, I also build dashboards.
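To make the "non-relational to relational" point above concrete, here is a minimal sketch, outside of Denodo itself, of flattening nested JSON documents into a relational, tabular shape; the records and field names below are hypothetical.

```python
# Minimal sketch (not Denodo's own API): flatten non-relational JSON records
# into a relational, tabular shape, which is the kind of standardized,
# consumable form the reviewer describes. Field names are hypothetical.
import pandas as pd

documents = [
    {"id": 1, "customer": {"name": "Acme", "region": "EMEA"},
     "orders": [{"sku": "A-100", "qty": 2}]},
    {"id": 2, "customer": {"name": "Globex", "region": "APAC"},
     "orders": [{"sku": "B-200", "qty": 5}]},
]

# json_normalize flattens nested objects into columns; record_path explodes
# the nested order list into one row per order.
flat = pd.json_normalize(
    documents,
    record_path="orders",
    meta=["id", ["customer", "name"], ["customer", "region"]],
)
print(flat)  # columns: sku, qty, id, customer.name, customer.region
```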
Principal Architect at a manufacturing company with 10,001+ employees
Real User
Top 10
May 3, 2023
We use the solution for data virtualization to publish data as a product. Different teams each have their own database, and if a team needs any data, they get it from Denodo.
Currently, we're working for one of our industrial resources clients. It is a hospital-based company that does resources and mining worldwide. We use Denodo to create views over multiple sources so that clients can review them within two seconds.
Senior Application Developer at a financial services firm with 10,001+ employees
Real User
Sep 20, 2022
We use Denodo in our organization whenever a web service is needed quickly, or where using another technology (such as Java) would take too much time. From our standpoint, Denodo is used in such a way that any consumer can build their own web service based on their data points. There's no need to ask the provider, "I need a web service for X, Y, or Z". Instead, you simply ask, "Hey, I just need the data points". For example, for a table, all the consumer needs is the table name, and they can build their own web service on top of it for whatever they need. It takes minimal effort to accomplish this with Denodo and it's an extremely quick process, especially when compared to doing the same thing with Java or any of the other technologies we are using. In fact, you can build a web service with Denodo within 10 minutes.

Our web services can also be consumed via different methods where there are multiple possible levels of responses, and each web service can be duplicated to provide for each different level of response. So within 10 minutes, you can build a web service with different variations out of the box. If you wanted to incorporate just one method in Java or AWS, it would take at least a day to deploy it and have it set up to provide responses. With Denodo, it only takes a few clicks to instantly deploy the web service, and then you are free to use that resource.

As for our environment, there are two editions that we have the option of using: one is the cloud-based version, and the other is a client version that we can download onto our systems. Most of our organization is using the cloud-based version, as all the patches are pushed directly to the cloud. Overall, we have about 100+ company members using Denodo, most of whom are product developers.
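To illustrate the consumer side of the quickly built web services described above, here is a minimal sketch of calling a view that has been published as a REST endpoint. The host, port, database, view name, and credentials are hypothetical, and the /denodo-restfulws path and $-style query parameters assume Denodo's generic RESTful web service, so verify them against your own installation and version.

```python
# Minimal consumer-side sketch: call a Denodo view published as a REST web
# service. Host, port, database, view, and credentials are placeholders; the
# path and $-style parameters assume the generic RESTful web service.
import requests

BASE = "http://denodo-host:9090/denodo-restfulws/reporting/views/customer_orders"

resp = requests.get(
    BASE,
    params={"$format": "json", "$filter": "region = 'EMEA'"},
    auth=("report_user", "********"),  # placeholder credentials
    timeout=30,
)
resp.raise_for_status()

data = resp.json()
# The payload key below is an assumption; inspect the actual response shape.
for row in data.get("elements", []):
    print(row)
```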
We use this solution as an access layer and virtualization layer. We connect to a number of sources and then use Denodo as the main access layer for consumers or for reporting. I have also personally used it for integrating sources for some of our critical reports, which was a use case outside of the regular virtualization capability. We calculated some of the key metrics for our critical components. I'm a data solutions architect and we are partners with Denodo.
Data Science and Business Analytics at Neusol - Global IT services
Real User
Sep 20, 2022
I am using Denodo as a data integration and management tool. The concept behind Denodo is data virtualization, which makes data retrieval faster. The availability of data in real time is much faster compared to directly accessing the data from the web.
We use Denodo to improve data governance and self-service BI. We had several data environments, one data lake, one data warehouse, and another instance of data in a separate silo, and we needed to deliver this data to our end users in an organized manner. Denodo was very powerful because we could deliver this data in a short time and with some data governance: you deliver it through a data catalog, you have the data lineage, and you can apply role-level security and column-level security. It was very helpful for us, and the end users were very happy to have it.
Currently, we are using it on Azure. We have around four VMs with Denodo, divided into development, quality, and production environments, with various configurations. I don't really work on developing the queries in Denodo; it's more on the administration side. I have managed the scalability, the configurations, and providing the environment.

Currently, we use Denodo mostly to query data from SAP. We have a very old SAP system inside of Schaeffler that doesn't have encryption, so when other people need to retrieve and use the same data, we use Denodo to provide that encryption. Then we have a few databases, and we can create base views based on the SAP tables. In some use cases, they also create web services based on these tables. They use it a bit as an ETL tool, and they use it for data scouting. For example, we have one tool that was developed inside of Schaeffler called Discovery, where we used Denodo to get the metadata from basically all the different systems that we have.
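As a rough illustration of how a downstream script might query one of those base views, here is a minimal JDBC sketch from Python. The driver class, jdbc:vdb:// URL, port, virtual database, and view/column names are assumptions based on the standard Denodo JDBC driver and an SAP MARA-style table; confirm them in your own environment before relying on this.

```python
# Minimal sketch of querying a Denodo virtual database over JDBC from Python.
# Driver class, URL format, port, database, view, and columns are assumed.
import jaydebeapi  # pip install jaydebeapi

conn = jaydebeapi.connect(
    "com.denodo.vdp.jdbc.Driver",              # Denodo VDP JDBC driver class (assumed)
    "jdbc:vdb://denodo-host:9999/sap_views",   # hypothetical virtual database
    ["svc_reader", "********"],                # placeholder credentials
    "denodo-vdp-jdbcdriver.jar",               # path to the driver jar shipped with Denodo
)

try:
    cur = conn.cursor()
    # bv_mara is a hypothetical base view created over an SAP material table;
    # MATNR (material number) and MTART (material type) are standard MARA fields.
    cur.execute("SELECT matnr, mtart FROM bv_mara WHERE mtart = 'FERT'")
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```

Going through JDBC like this keeps consumers on plain SQL against the virtual layer, regardless of whether the data behind the view lives in SAP or elsewhere.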
Lead Solution Architect at an insurance company with 10,001+ employees
Real User
Sep 7, 2022
I work with an insurance company and our main reason for using Denodo is to bring together all our data into one platform in the cloud. The company has very diverse data sources including data stored in the cloud, XML files, Db2 and SQL databases, and SAM / VSAM files on legacy mainframe platforms. Thus, management decided that they wanted all the data in one place by connecting these different data sources for better visualization and reporting. It's really working well for us and we are using it for both of our claims centers with our claims management solution as well as our premium management solution. One instance of Denodo is for the underwriting team and the other is for the actuary team. In total, we have around 45 people using it. We were originally using the solution on-premises but we are now using the cloud version deployed on Azure.
We use Denodo to fetch ad hoc reports. The users can join tables or views and fetch the reports on their own. It is not a soft-controlled environment or an OCC-controlled environment, so they can do whatever they need. They have some room to play.
Senior Manager Data Analytics at a recreational facilities/services company with 10,001+ employees
Real User
Sep 23, 2022
We are currently using it in our company, and it's our data virtualization tool. We fall under the federated analytical model. Basically, we have data that moves into our warehouse, and when we want to build our reports from the warehouse, we use Denodo. So, we are mainly using it for reporting purposes at this point. Instead of building aggregated views in our warehouse, we put that on the Denodo layer; we aggregate the data in our Denodo layer. Because Denodo is a virtualization tool, we try not to cache the data, but we've cached it whenever needed.

Our dashboards usually run once or twice a day, so we refresh the data once or twice a day. Sometimes, we also refresh every hour, but mostly, we refresh once or twice a day. For that, we don't need the data materialized in a view; we don't need the data to be there. We just create these views in Denodo. We have Power BI and Qlik Sense, and we use those tools to pull the data from Denodo.

There are other uses as well that we are thinking of as a company. We have business users who do want to look at the backend of what's going on. They may just want to run something like a quick analysis; for example, for a question, they just want to go take a look at the actual data. For that purpose, we can give these analysts access, and they can go in and take a look. Eventually, we want to get to a place where we enable self-service analytics for our business users, and we are thinking that Denodo could help us in that direction. I know different companies use Denodo for different purposes, but we are using it on the IT side. Currently, we are using it mainly for reporting purposes, but we could also move towards self-service and make it available to our business users as well. That's for the future; we haven't done that yet. For now, we are only doing it at the BI layer.
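As a sketch of the kind of pull a scheduled BI refresh makes against the Denodo layer, assuming an ODBC DSN configured with the Denodo ODBC driver and a hypothetical derived view, the aggregation lives in the query and the virtual view rather than in a materialized warehouse table:

```python
# Minimal sketch of a BI-style refresh pull against the Denodo layer over ODBC.
# The DSN, credentials, view, and column names are hypothetical; it assumes a
# DSN has already been configured with the Denodo ODBC driver.
import pyodbc  # pip install pyodbc

conn = pyodbc.connect("DSN=denodo_prod;UID=bi_reader;PWD=********")
cur = conn.cursor()

# The aggregation happens in the query / virtual view, so nothing needs to be
# materialized between the once- or twice-daily refreshes.
cur.execute(
    """
    SELECT venue_id, visit_date, SUM(ticket_revenue) AS revenue
    FROM dv_daily_visits            -- hypothetical derived view in Denodo
    GROUP BY venue_id, visit_date
    """
)
for venue_id, visit_date, revenue in cur.fetchall():
    print(venue_id, visit_date, revenue)

conn.close()
```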
I work for a company that provides financial solutions to banks and advisors. We need to send data every day to different third parties in different file formats, such as pipe-delimited, comma-separated, fixed-width, and XML. Previously, we used to generate those files with PL/SQL. After looking at the performance issues and the complexity, we started using Denodo. We run scheduler jobs in Denodo, and each job will in turn run the VQLs. The VQLs gather the data based on the business rules and business logic. Finally, the job spools the data into a predefined file format, and then the file is delivered to the respective recipient. Basically, we generate interfaces using Denodo.
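The Denodo Scheduler and VQL do the heavy lifting in the flow above; purely as an illustration of the final spool-to-file step, here is what the equivalent logic looks like in plain Python, with hypothetical column names and file naming:

```python
# Minimal sketch of the "spool query results into a delimited interface file"
# step. This is plain Python, not the Denodo Scheduler; the columns, delimiter,
# and file name are hypothetical.
import csv
from datetime import date

# In the real flow these rows would come from a VQL query run by a Scheduler job.
rows = [
    {"account_id": "A001", "advisor": "Smith", "balance": "10250.75"},
    {"account_id": "A002", "advisor": "Jones", "balance": "98200.10"},
]

outfile = f"advisor_balances_{date.today():%Y%m%d}.txt"
with open(outfile, "w", newline="") as fh:
    writer = csv.DictWriter(
        fh,
        fieldnames=["account_id", "advisor", "balance"],
        delimiter="|",  # pipe-delimited format for this particular recipient
    )
    writer.writeheader()
    writer.writerows(rows)

# The file would then be delivered (e.g. via SFTP) to the respective recipient.
print(f"wrote {outfile}")
```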
Denodo can be deployed on-premises or in the cloud. Denodo is used mostly for creating views, for cache enablement, for remote table creation, and for pushing views into the target databases. The solution can extract data from multiple sources, schedule jobs, create the views, and push them to the target.
Senior BI Developer at a tech vendor with 51-200 employees
Real User
Dec 22, 2021
I am currently a Senior BI Developer. We are partners with Denodo and market it to our customers as a solution to migrate data virtually. Our customers have an average of 50 users. They use Denodo to connect to multiple sources and in their retail processes.
My primary use case is for data virtualization. I'm working in the pharma domain, so there are large amounts of data coming in from different sources, which I aggregate into Azure SQL, some other web services, and SAP applications like CRM, POS, and others. Denodo acts as a virtualization layer, where we are collecting and creating views for analytical purposes. So we use Denodo to integrate and transform. It is deployed on-premises.
Solution Architect at a tech services company with 1-10 employees
Real User
Mar 31, 2021
Our primary use case for this product is data virtualization, and there is now a pilot project for a very small installation, which might be extended in the future. We are partners with Denodo and I'm a solutions architect.
Data and Analytics Strategist at Climate Action Services Inc.
Real User
Nov 27, 2020
I am a consultant for an organization. We are using this solution for data anonymization, and the particular use case where it provides an advantage is the chain of custody of data. That is something that is really critical today, and this helps break down that barrier: data can remain where it is and still be shared.
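As a generic illustration of the anonymization idea (not Denodo's own masking feature), the sketch below replaces a direct identifier with a keyed hash and coarsens a quasi-identifier before a record is shared, while the source data stays where it is; the field names and key are hypothetical.

```python
# Generic anonymization sketch: pseudonymize a direct identifier with a keyed
# hash and coarsen a quasi-identifier before sharing. Not Denodo's masking
# feature; field names and the secret key are hypothetical.
import hmac
import hashlib

SECRET_KEY = b"rotate-me-and-keep-out-of-source-control"

def pseudonymize(value: str) -> str:
    """Deterministic keyed hash, so the same person always maps to the same token."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

record = {"member_id": "M-10293", "postal_code": "K1A 0B1", "kwh_saved": 412}

shared_view = {
    "member_token": pseudonymize(record["member_id"]),  # identifier replaced
    "postal_area": record["postal_code"][:3],            # coarsened, not exact
    "kwh_saved": record["kwh_saved"],                     # measure kept as-is
}
print(shared_view)
```

Using a keyed hash rather than a plain hash means the token mapping cannot be reversed by brute-forcing known identifiers without the key.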
Deputy General Manager at a comms service provider with 5,001-10,000 employees
Real User
Sep 27, 2020
We primarily use the product for data virtualization and visualization. We actually have it integrated with some other virtualization tools as well. We design the systems and hand them over to our customers.
ETL/BI Senior Consultant at a consultancy with 51-200 employees
MSP
Apr 26, 2020
We're building a case for data virtualization so that people can use this solution to assist in problem solving. We don't sell the product; we partner with Denodo, and I am an ETL/BI senior consultant.
Head of Data Service Department at a government with 201-500 employees
Real User
Aug 25, 2019
We use an entirely on-premises deployment model. Our primary use case of this solution is for the virtualization or unification of a data interface that produces national statistics.
Denodo is a tool used for data virtualization. Our company provides services to our customers. One of our customers is on the Denodo platform.
Our ETL team was preparing the data warehouse, and we used Denodo as an intermediate solution.
I mainly use Denodo for data modification.
We have around 100 people using this solution in my company. The solution is deployed in the cloud and on-premises.
We mainly use this solution to virtualize items that are stored on old node servers.
I am a consultant and I use Denodo for my clients.
It was used for an integration tool setup to follow and as temporary housing of integration lens.
We primarily use the solution with Office 365 for cloud migration as well as for connecting to data lakes.