We use Dremio for financial data analytics and as a data lake. We connect Dremio to Oracle and MySQL, run it in Docker, and use it as a source for Power BI. We also use it to process data from MongoDB, although we face occasional challenges with NoSQL integration.
We have been using it to build one of our frameworks. We primarily use Dremio to create a data framework and a data queue. It's being used in combination with DBT and Databricks.
Dremio is a platform that enables high-performance queries directly against a data lake, and it helps you manage that data in a sophisticated way. The use cases are broad, but essentially it lets you make extremely good use of the data in a data lake: it gives you data warehouse capabilities on top of data lake data.
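The warehouse-on-lake pattern described above can be sketched as a single federated query that joins lake files with a relational source directly in Dremio. The source and table names below are hypothetical, not taken from any reviewer's deployment:

```sql
-- Hypothetical federated query: join a data lake table with an
-- Oracle dimension table in one Dremio statement, no data movement.
SELECT o.customer_id,
       c.customer_name,
       SUM(o.amount) AS total_spend
FROM   lake."sales"."orders"    AS o   -- Parquet files in the lake
JOIN   oracle."crm"."customers" AS c   -- relational source
  ON   o.customer_id = c.customer_id
GROUP BY o.customer_id, c.customer_name
ORDER BY total_spend DESC
LIMIT 10;
```

Both sources appear as schemas in one SQL namespace, which is what makes warehouse-style analytics over lake data possible without copying it first.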
I can visualize traffic from BI tools such as Tableau on the same page and have my tables and schema on the same page. The data lake comprises everything. If I want one structure, I connect it to a big table in Hive, and the data team that reads my SQL can work on my tables, schemas, and table structures all in one place. Dremio is as good as any other Presto engine.
I have used this solution as an ETL tool to create data marts on data lakes for bridging. I have used it as a query layer for ad-hoc queries and for services that do not require sub-second latency to read data from very big data lakes. I have also used it for simple ad-hoc queries, similar to Athena, Presto, or BigQuery. We do not have a large number of people using this solution because it is mainly set up as a service-to-service integration. We migrated a big workload when we started using Dremio, and this was very expensive. The migration is still in progress; as soon as it is finished, we plan to migrate ad-hoc queries from our analytics team.
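The data-mart layer mentioned above is typically built in Dremio as virtual datasets: saved views over raw lake files that downstream consumers query instead of the raw data. A minimal sketch, with all space and table names being illustrative assumptions:

```sql
-- Hypothetical virtual dataset acting as a lightweight "data mart"
-- layer over raw lake files; consumers query this view, not the files.
CREATE VDS marts."finance"."daily_revenue" AS
SELECT event_date,
       SUM(amount) AS revenue
FROM   lake."sales"."orders"   -- raw Parquet in the data lake
GROUP BY event_date;
```

An analyst or service can then run `SELECT * FROM marts."finance"."daily_revenue"` without knowing where or how the underlying files are stored.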
Dremio is a data analytics platform designed to simplify and expedite the data analysis process by enabling direct querying across multiple data sources without the need for data replication. This solution stands out due to its approach to data lake transformation, offering tools that allow users to access and query data stored in various formats and locations as if it were all in a single relational database.
At its core, Dremio facilitates a more streamlined data management experience.
We use Dremio for data engineering.