Financial Analyst 4 (Supply Chain & Financial Analytics) at Juniper Networks
MSP
Top 5
Mar 28, 2024
Databricks is hosted in the cloud, so it is very easy to collaborate with other team members working on it. The code is production-ready, and scheduling jobs is easy.
Specifically for data science and data analytics purposes, it can handle large amounts of data in less time. I can compare it with Teradata. If a job takes five hours with Teradata databases, Databricks can complete it in around three to three and a half hours.
Databricks makes it really easy to use a number of technologies to do data analysis. In terms of languages, we can use Scala, Python, and SQL. Databricks enables you to run very large queries, at a massive scale, within really good timeframes.
Principal at a computer software company with 5,001-10,000 employees
Real User
Top 20
Dec 16, 2022
What I like about Databricks is that it's one of the most popular platforms that give access to folks who are trying not just to do exploratory work on the data but also go ahead and build advanced modeling and machine learning on top of that.
Databricks has improved my organization by allowing us to transform data from sources to a different format and feed that to the analytics, business intelligence, and reporting teams. This tool makes it easy to do those kinds of things.
Head of Business Integration and Architecture at Jakala
Real User
Oct 21, 2022
Delta Lake has been the most useful part of this solution. Delta Lake is an open-source storage format that was invented and implemented by Databricks.
Associate Principal - Data Engineering at LTI - Larsen & Toubro Infotech
Real User
Jul 17, 2022
I like that Databricks is a unified platform that lets you do streaming and batch processing in the same place. You can do analytics, too. They have added something called Databricks SQL Analytics, allowing users to connect to the data lake to perform analytics. Databricks also will enable you to share your data securely. It integrates with your reporting system as well.
Databricks is a unified solution that we can use for streaming. It supports open-source languages, which are cloud-agnostic. When I do database coding, if another tool uses a language similar to SQL or Excel, I can reuse the same knowledge, limiting the need to learn new things. It also supports a lot of Python libraries, some of which I can use very easily.
Manager, Customer Journey at a retailer with 10,001+ employees
Real User
May 18, 2022
I like how easy it is to share your notebook with others. You can give people permission to read or edit. I think that's a great feature. You can also pull in code from GitHub pretty easily. I didn't use it that often, but I think that's a cool feature.
Director - Data Engineering expert at Sankir Technologies
Real User
Mar 18, 2022
Databricks has a scalable Spark cluster creation process. The creators of Databricks are also the creators of Spark, and they are the industry leaders in terms of performance.
Machine Learning Engineer at a tech vendor with 51-200 employees
Real User
Dec 25, 2019
The most valuable aspect of the solution is its notebook. It's quite convenient to use, both in terms of research and development and in the final deployment; I can declare Spark jobs simply by loading tables.
Databricks is utilized for advanced analytics, big data processing, machine learning models, ETL operations, data engineering, streaming analytics, and integrating multiple data sources.
Organizations leverage Databricks for predictive analysis, data pipelines, data science, and unifying data architectures. It is also used for consulting projects, financial reporting, and creating APIs. Industries like insurance, retail, manufacturing, and pharmaceuticals use Databricks for data...
It is a cost-effective solution.
Databricks has helped us have a good presence in data.
The initial setup phase of Databricks was good.
The database's processing capacity is tremendous.
The solution is very simple and stable.
Databricks' most valuable feature is the data transformation through PySpark.
The ease of use and its accessibility are valuable.
The setup is quite easy.
The most valuable feature of Databricks is the notebook, data factory, and ease of use.
We can scale the product.
The solution's features are fantastic and include interactive clusters that perform at top speed when compared to other solutions.
Easy to use and requires minimal coding and customizations.
The most valuable feature of Databricks is the integration with Microsoft Azure.
Databricks covers the end-to-end data analytics workflow in one platform; this is the best feature of the solution.
Databricks' most valuable features are the workspace and notebooks. Its integration, interface, and documentation are also good.
The ability to stream data and the windowing feature are valuable.
The technical support is good.
Databricks is a scalable solution. It is the largest advantage of the solution.
I like the ability to use workspaces with other colleagues because you can work together even without seeing the other team's job.
It can cut across the entire ecosystem of open-source technology, adding an extra layer to the data transformation process.
The solution is easy to use and has a quick start-up time due to being on the cloud.
The main feature of the solution is its efficiency.
The initial setup is pretty easy.
The solution is very easy to use.
It can send out large amounts of data.
Databricks gives you the flexibility of using several programming languages independently or in combination to build models.
Databricks helps crunch petabytes of data in a very short period of time.
The integration with Python and the notebooks really helps.
One of the features provides nice interactive clusters, or compute instances that you don't really need to manage often.
It's great technology.
The fast data loading process and data storage capabilities are great.
Ability to work collaboratively without having to worry about the infrastructure.
It's easy to increase performance as required.
The most valuable feature is the ability to use SQL directly with Databricks.
Imageflow is a visual tool that helps make it easier for business people to understand complex workflows.
I haven't heard about any major stability issues. At this time I feel like it's stable.
The time travel feature is the solution's most valuable aspect.
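The time travel the reviewer mentions is Delta Lake's ability to query earlier versions of a table. A minimal sketch of the SQL it uses, with a hypothetical table name and version number:

```python
def time_travel_query(table: str, version: int) -> str:
    # Delta Lake lets you query an older snapshot of a table with the
    # "VERSION AS OF" clause (or "TIMESTAMP AS OF" for a point in time).
    return f"SELECT * FROM {table} VERSION AS OF {version}"

# "sales.orders" and version 12 are purely illustrative:
query = time_travel_query("sales.orders", 12)
print(query)  # SELECT * FROM sales.orders VERSION AS OF 12
```

On Databricks, this string would be passed to `spark.sql()` against a Delta table; the version history itself is available via `DESCRIBE HISTORY`.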
I work in the data science field and I found Databricks to be very useful.
Databricks is based on a Spark cluster and it is fast. Performance-wise, it is great.
Automation with Databricks is very easy when using the API.
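As a sketch of the kind of automation the reviewer means, the snippet below builds a minimal create-job payload for the Databricks Jobs API 2.1; the job name, notebook path, cluster spec, and cron schedule are all hypothetical:

```python
import json

def build_job_payload(name: str, notebook_path: str, cron: str,
                      node_type: str = "i3.xlarge") -> dict:
    """Build a minimal Databricks Jobs API 2.1 create-job payload.
    All values here are illustrative, not a prescribed configuration."""
    return {
        "name": name,
        "tasks": [{
            "task_key": "main",
            "notebook_task": {"notebook_path": notebook_path},
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": node_type,
                "num_workers": 2,
            },
        }],
        "schedule": {
            "quartz_cron_expression": cron,
            "timezone_id": "UTC",
        },
    }

payload = build_job_payload("nightly-etl", "/Repos/team/etl", "0 0 2 * * ?")
body = json.dumps(payload)
# POST this body to /api/2.1/jobs/create on your workspace URL
# with an Authorization: Bearer <token> header.
```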
We are completely satisfied with the ease of connecting to different data sources or Parquet files.
The built-in optimization recommendations cut query times in half and allowed us to reach decision points and deliver insights very quickly.