We use this solution for finding anomalies and applying the rules to the streaming data.
There are around 50 people using this solution in my organization, including data scientists.
The ability to stream data and the windowing feature are valuable. There are a number of targeted integration points, which is a difference between Stream Analytics and Databricks; the input and output integrations are better in Databricks. It's easy to use Python or even Java, and I can deploy and use third-party libraries.
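To illustrate the windowing idea mentioned above (this is a plain-Python conceptual sketch, not the Stream Analytics or Databricks API; the function name and event shape are invented for the example), a tumbling window groups a stream's events into fixed, non-overlapping time buckets and aggregates within each bucket:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=10):
    """Count (timestamp, device_id) events per fixed, non-overlapping window."""
    counts = defaultdict(int)
    for ts, device in events:
        window_start = (ts // window_seconds) * window_seconds  # bucket the timestamp
        counts[(window_start, device)] += 1
    return dict(counts)

# Events at t=1s and t=4s land in the [0, 10) window; t=12s and t=13s in [10, 20).
events = [(1, "a"), (4, "a"), (12, "a"), (13, "b")]
print(tumbling_window_counts(events))  # → {(0, 'a'): 2, (10, 'a'): 1, (10, 'b'): 1}
```

In both products the same grouping is expressed declaratively (for example, a windowed `GROUP BY` over the stream) rather than by hand like this.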
Support for Microsoft technology and compatibility with the .NET Framework are somewhat missing. Databricks is based on open source, so there should be better interoperability between the two. If it were more in sync with Microsoft technologies and their programming languages, it would be better. Python support is good, but .NET compatibility would be a great help.
I would like to have better support for Microsoft technology and better language components.
With Azure services such as Cosmos DB, I can store data lakes or time-series data tables. That would be a great help for real-time analytics.
I have been using Databricks for eight months.
The scalability is fine. We had thousands of devices and were sending data infrequently, so that worked for us. If the amount increases, the windowing function and job schedule may not perform as expected.
I would rate technical support 4 out of 5. We had some issues with setup; they were eventually solved, but only after following up a few times.
Azure Stream Analytics is easy to use and easy to deploy; it's a little bit better in that respect. Databricks still has some stability issues. Azure Stream Analytics has a number of built-in input and output sources, and it scales to all types of third-party interfaces.
Setup was complex. There were some issues with setting up a database and installing the third party component on top of services. I would rate the setup 3 out of 5.
Implementation was done in-house.
The cost is around $600,000 for 50 users.
I would rate the price 2 out of 5.
I would rate this solution 8 out of 10.
We use the solution for reliability engineering, where we apply machine learning and deep learning models to identify failure patterns across different geographies and products.
Databricks is hosted on the cloud, so it is very easy to collaborate with other team members working on it. The code is production-ready, and scheduling jobs is easy.
Databricks could have more collaborative features than it has, and it should allow more customization for jobs. Also, its dashboarding tool is average; if they added more advanced features, we wouldn't depend on other BI tools to build dashboards. We currently use Tableau to create dashboards. If Databricks had more advanced features, we could use Databricks entirely.
I have been using Databricks for one year.
The product is stable. It has been giving consistent outputs without any major issues.
The solution is hosted on the cloud. It supports high scalability features.
10-20 users are using this solution.
There was a training session from Databricks where they explained how to use it. We never had to contact them because they had already given us proper training on the platform.
I have used Alteryx before. We switched to Databricks because it can turn your code into production-ready code in very few seconds. Also, the stability is relatively high.
The initial setup is easy.
We have a dedicated team for the deployment.
Delta Lake is free. We primarily work on data that we get from Snowflake, and the model outputs from Databricks are written back to Delta Lake. It is easy for us to collaborate using Delta Lake, and its computation speed is also quite high.
The learning curve for Databricks is not very steep. It's pretty easy, and you will find a lot of materials online. So, if you are comfortable coding in Python, it's very straightforward. There is nothing to worry about when using Databricks.
Overall, I rate the solution a ten out of ten.
We are using Databricks for machine learning workloads specifically.
Databricks aligns well with our skillset and overall approach. We sought out their solution specifically for a big data application we are currently working on, as we needed a platform capable of handling large amounts of data and building models. Additionally, the fact that they use open-source software and can integrate data warehouse and data lake systems was particularly appealing, as we have encountered such issues in the past. We determined that Databricks would be an effective solution for our needs.
The most valuable feature of Databricks is the integration of the data warehouse and data lake, and the development of the lake house. Additionally, it integrates well with Spark for processing data in production.
The solution could be improved by adding a feature that would make it more user-friendly for our team. The feature is simple, but it would be useful. Currently, our team is more familiar with R, but Databricks requires the use of Jupyter Notebooks, which primarily support Python. We have tried using RStudio, but it is not a fully integrated solution, so to fully utilize Databricks, we have to use the Jupyter interface. One feature that would make it easier for our team to adopt the Jupyter interface is the ability to select a specific variable or line of code and execute it within a cell. This feature is available in Jupyter Notebooks outside of Databricks and in our own IDE, but it is not currently available within Databricks. If it were added, the transition to Databricks would be much smoother for our team.
The most important feature other than the Jupyter interface would be to have the RStudio interface inside Databricks. This would be perfect.
We have been using Databricks for approximately one year.
The stability of Databricks is good.
I rate the stability of Databricks a nine out of ten.
Databricks is scalable.
I rate the scalability of Databricks a nine out of ten.
I have been receiving responsive answers from Databricks' support and have been pleased with it.
I rate the support from Databricks a ten out of ten.
Positive
The initial setup of Databricks is simple. I did not experience any challenges. The time it takes for the deployment is approximately four hours.
I rate the initial setup of Databricks positively.
We did the deployment of the solution in-house, with three people involved: a data engineer, a data analyst, and a machine learning engineer.
We have only incurred the cost of our AWS cloud services. This is because during this period, Databricks provided us with an extended evaluation period, and we have not spent much money yet. We are just starting to incur costs this month, I will know more later on the full cost perspective.
We only pay standard fees for the solution.
We use a data engineer, data analyst, and machine learning engineer for the maintenance of the solution.
I rate Databricks a nine out of ten.
I would like to see the integration between Databricks and MLflow improved. It is quite hard to train multiple models in parallel in a distributed fashion; you hit rate limits on the clients very fast.
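A common workaround for those client-side rate limits is to wrap tracking calls in exponential backoff. The sketch below is generic plain Python, not MLflow's API; the helper name, the `RuntimeError` standing in for an HTTP 429, and the example call are all invented for illustration:

```python
import time

def with_backoff(call, max_retries=5, base_delay=0.5):
    """Retry `call` with exponential backoff when it raises a rate-limit error."""
    for attempt in range(max_retries):
        try:
            return call()
        except RuntimeError:  # stand-in for an HTTP 429 from a tracking server
            time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...
    return call()  # final attempt; let the error propagate if it still fails

# Example: a tracking call that fails twice with a rate limit, then succeeds.
attempts = {"n": 0}
def log_metric():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("429 Too Many Requests")
    return "logged"

print(with_backoff(log_metric, base_delay=0.01))  # → logged
```

Backoff spreads the load but does not remove the underlying limit, which is why better native support for parallel runs would still be welcome.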
I have been using Databricks for three years.
I would rate the stability of this solution a nine out of 10, with one being not stable and 10 being very stable.
I would rate the scalability of this solution an eight out of 10, with one being not scalable and 10 being very scalable.
There are three people using this solution in our organization.
I would rate the available customer service a three. It's worth mentioning that this is Microsoft and not Databricks itself. I haven't spoken to Databricks people directly, but I know the people who have and they have been a lot more pleased.
Negative
I would rate their pricing plan a six (on a scale of one to 10, with one being cheap and 10 being expensive). I think the prices could be lowered a little bit.
Overall, I would rate this solution an eight out of 10, with one being quite poor and 10 being excellent. It is fast, it's scalable, and it does the job it needs to do.
We use this solution for the Customer Data Platform (CDP). My company works in the MarTech space, and we usually implement custom CDPs.
The Delta Lake format has been the most useful part of this solution. Delta Lake is an open-source storage format invented by Databricks, and it is the most important element of the solution. Databricks also offers exceptional performance and scalability.
The data visualization in this solution could be improved. They have started to roll out a data visualization tool inside Databricks, but it is in the early stages and not comparable to a solution like Power BI, Looker, or Tableau.
In a future release, we would like to have a better ETL designer tool to assist in the way we move data from one place to another.
We have been using this solution for four years.
This is a stable solution.
This is a scalable solution.
The initial setup is very easy. It is a managed solution inside Azure so you just need to search for Databricks. There are a couple of pages to follow in the setup wizard and Databricks is up and running.
We implement this solution on behalf of our customers who have their own Azure subscription and they pay for Databricks themselves. The pricing is more expensive if you have large volumes of data.
When we first started using Databricks in 2018, there were not many comparable solutions to consider. Right now there are many solutions to consider, including Snowflake, Azure Synapse, Redshift, and BigQuery.
Databricks continues to be our solution of choice, but Snowflake does have a better overall user interface and makes it easier to work with data pipelines.
I would advise others to first define a strong data strategy and then choose which data platform suits your needs.
I would rate this solution a nine out of ten.
We build data solutions for the banking industry. Previously, we worked with AWS, but now we are on Azure. My role is to assess the current legacy applications and provide cloud alternatives based on the customers' requirements and expectations.
Databricks is a unified platform that provides features like streaming and batch processing. All the data scientists, analysts, and engineers can collaborate on a single platform. It has all the features you need, so you don't need to go for any other tool.
I like that Databricks is a unified platform that lets you do streaming and batch processing in the same place. You can do analytics, too. They have added something called Databricks SQL Analytics, allowing users to connect to the data lake to perform analytics. Databricks also will enable you to share your data securely. It integrates with your reporting system as well.
The Unity Catalog provides data lineage and metadata capabilities. These are some of the unique features that fulfill the requirements of the banking domain.
Every tool has room for improvement. Normally what happens is that a solution will claim it can do ETL and everything else, but you encounter limitations when you actually start. Then you keep interacting with the vendor, and they continue to upgrade it. For example, we haven't fully implemented Databricks Unity Catalog, a newly introduced feature. We need to check how it works, and then there may be improvements in that area as well.
Databricks may not be as easy to use as other tools, but if you simplify a tool too much, it won't have the flexibility to go in-depth. Databricks is completely in the programmer's hands. I prefer flexibility rather than simplicity.
I have been using Databricks for a year.
Databricks relies on scalability and performance. Every cloud vendor prioritizes scalability, high availability, performance, and security. These are the most important reasons to move to the cloud.
Deploying Databricks on the cloud is straightforward. It's not like an on-premise solution, where you must create a cluster and all those other prerequisites for big data.
I don't think it's challenging to maintain, but you need an expert programmer because Databricks isn't GUI-based. With GUI-based tools, building ETLs is drag-and-drop. Databricks entirely relies on coding, so you need skilled programmers to build your code, ETLs, etc.
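To give a sense of what "ETL in the programmer's hands" means in practice (a minimal plain-Python sketch, not Databricks code; the column names and helper are invented for the example), a code-based pipeline spells out every extract, transform, and load step explicitly instead of dragging boxes:

```python
import csv, io

def run_etl(raw_csv):
    """Extract rows from CSV, drop incomplete records, derive a cents column."""
    rows = csv.DictReader(io.StringIO(raw_csv))          # extract
    cleaned = []
    for row in rows:
        if not row["amount"]:                            # transform: drop bad rows
            continue
        cleaned.append({"id": row["id"],
                        "amount_cents": int(round(float(row["amount"]) * 100))})
    return cleaned                                       # a load step would write this out

raw = "id,amount\n1,2.50\n2,\n3,0.99\n"
print(run_etl(raw))  # → [{'id': '1', 'amount_cents': 250}, {'id': '3', 'amount_cents': 99}]
```

The trade-off the review describes is exactly this: every rule is visible and arbitrarily flexible, but someone has to write and maintain the code.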
The price of Databricks is based on the computing volume. You also need to pay storage costs for the cloud where you're hosting Databricks, whether it is AWS, Azure, or Google.
I rate Databricks nine out of 10. Databricks is one of the best tools on the market.
We use Databricks for batch data processing and stream data processing.
Databricks provides a consistent interface for data engineers to work with data in a consistent language on a single integrated platform for ingesting, processing, and serving data to the end user.
The flexibility of Databricks is the most valuable feature. It gives us the ability to write analytics code in multiple languages.
There is a single workspace for different data roles like data engineers, machine learning engineers, and the end user, who can connect to the same system.
Databricks separates compute from storage, so you are not coupled to the underlying data sets, allowing multiple processes and programs to run against the same data.
I would like to see improvement with the UI. It is functional and useful, but it's a bit clunky at times. It should be more user-friendly.
In future releases, Databricks would benefit from enhanced metrics and tighter integration with Azure's diagnostics.
I have been using Databricks for eight months.
Databricks is very stable.
The scalability of this solution is good. In our organization, users include analysts, data engineers, and data scientists.
I would give Databrick service and support a four and a half out of five overall.
Positive
Prior to using Databricks, we used Azure Stream Analytics. We made the switch because of the scalability and integrated platform.
The initial setup of Databricks is more complex. I would rate it a four out of five on the complexity of the setup. It took two days to deploy the solution.
We used a third party for some of the implementations of Databricks. The number of staff required to deploy and maintain this solution depends on the number of processes you have. Due to the cloud nature of the technology, it is easy to deploy and maintain.
The licensing of Databricks is a tiered licensing regime, so it is flexible. I feel their pricing is a five out of five.
Databricks is a one-stop shop for everything data related, and it can scale with you.
I would rate this solution a 9.5 out of 10 overall.
We use Databricks to define tool data and have many use cases to analyze and distribute the data.
Data is open to everyone; they can access it through many channels, including notebooks or SQL. That on its own democratizes the data.
I like cloud scalability and data access for any type of user.
It would be better if it were faster. It can be super fast for big data, but for small data, even a response of close to a second can be considered slow.
In the next release, I would like to have automatic creation of APIs because they don't have it at the moment, and I spend a lot of time building them.
I have been using Databricks for roughly one and a half years.
Stability is excellent.
Databricks is scalable. You can use the power of the cloud to scale your cluster size, either CPU or memory. It doesn't work like a standard database: the data is based on files, so you don't copy the data around. It's super scalable; it's only the compute that you have to scale with the data.
We probably have 40 users with roles like developers, business analysts, and data scientists. We have big plans to increase the usage and have more departments using it.
Technical support has helped us.
On a scale from one to ten, I would give technical support a five.
Positive
We used Cloudera before switching to Databricks.
The initial setup was fairly okay. It takes about two minutes to deploy this solution. It's all code, so we click a button, and then it's done.
On a scale from one to five, I would give the initial setup a four.
We set up and deployed this solution.
On a scale from one to five, I would give our ROI a three.
We only pay for the Azure compute behind the solution. If you want compute, you need the database layer and Azure underneath.
On a scale from one to five, I would give their pricing a two.
We looked at other options such as Snowflake and Cloudera on the cloud.
I would tell potential users that they need proper cloud engineers and a cloud infrastructure team to use this solution.
On a scale from one to ten, I would give Databricks a nine.