Overview
Product Information on Databricks Data Intelligence Platform
What is Databricks Data Intelligence Platform?
Databricks Data Intelligence Platform Pricing
Overall experience with Databricks Data Intelligence Platform
“Assistance from Databricks Resolved Technical Challenges in Data Engineering Workflow”
“Constant Updates Create Confusion Amid Impressive Capabilities and Lagging Documentation”
About Company
Company Description
Databricks is a global company focused on data and AI. At the core of Databricks is the Databricks Data Intelligence Platform, which allows entire organizations to use data and AI to power a wide range of business use cases. It's built on a lakehouse to provide an open, unified foundation for all data and governance, and is powered by a Data Intelligence Engine that understands the uniqueness of each organization's data. Databricks simplifies and accelerates enterprises' data and AI goals by unifying data, analytics and AI on one platform. Its key mission is to help data teams solve some of the world's most challenging problems.
Company Details
Key Insights
A Snapshot of What Matters - Based on Validated User Reviews
User Sentiment About Databricks Data Intelligence Platform
Reviewer Insights for: Databricks Data Intelligence Platform
Deciding Factors: Databricks Data Intelligence Platform Vs. Market Average
Performance of Databricks Data Intelligence Platform Across Market Features
Databricks Data Intelligence Platform Likes & Dislikes
There are many things I find useful. It's a single data platform, and for a small operation that means fewer moving parts to cover. It's easy to see where money is spent, and then it's easy to ask good questions to optimize performance. Only two languages are needed for most work: Python and SQL. The performance of many SQL tasks has been extremely good. It's relatively easy to configure most things. Generally, the documentation is up to date and valid for answering questions.
Unity Catalog is amazing for bundling multiple types of data like Delta tables, Models, Functions and Volumes for unstructured data. UC also enables simple connections to External Locations that can be configured to connect to a cloud provider such as S3. Finally, Workflows for jobs and pipelines are essential for optimizing runs of code in easy-to-manage objects.
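To make the Unity Catalog objects this reviewer mentions concrete, here is a minimal sketch; the catalog, schema, credential and S3 path names are placeholders, not values taken from the review.

```python
# Minimal sketch of the Unity Catalog objects the reviewer describes.
# All names (main, demo, my_storage_cred, the S3 URI) are hypothetical.

# External Location: a governed pointer to cloud storage such as S3.
spark.sql("""
  CREATE EXTERNAL LOCATION IF NOT EXISTS raw_landing
  URL 's3://my-bucket/landing/'
  WITH (STORAGE CREDENTIAL my_storage_cred)
""")

# A Delta table, a Volume for unstructured files, and a SQL function,
# all registered under the same catalog.schema namespace.
spark.sql("CREATE SCHEMA IF NOT EXISTS main.demo")
spark.sql("CREATE TABLE IF NOT EXISTS main.demo.orders (id BIGINT, amount DOUBLE)")
spark.sql("CREATE VOLUME IF NOT EXISTS main.demo.raw_files")
spark.sql("""
  CREATE FUNCTION IF NOT EXISTS main.demo.net_amount(amount DOUBLE)
  RETURNS DOUBLE RETURN amount * 0.9
""")
```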
It could do with better dashboard capabilities and the ability for customers to do more with it, like dynamically skinning a dashboard depending on the user. A greater array of charting objects would be great.
Lack of transparency in what is being developed and what has been released. Documentation is very slow to follow after new features are released.
On the downside, there is some complexity in enabling all the features on External tables rather than the Managed tables handled by Databricks (which do not connect to external cloud data like S3). There are also some downsides to selecting between the different types of Compute: SQL Warehouses have functionality that Clusters do not have. Similarly, certain Compute modes are restricted to a Dedicated single-user cluster for accessing ML runtimes, while only Standard-mode clusters can work with Shallow Clones properly.
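For readers unfamiliar with the external-table and shallow-clone pattern this reviewer contrasts with managed tables, a rough sketch follows; the S3 path and table names are hypothetical, and the Standard-mode caveat is the reviewer's observation, not verified here.

```python
# An external (unmanaged) table whose data files live in your own S3 bucket,
# as opposed to a managed table whose storage Databricks controls.
spark.sql("""
  CREATE TABLE IF NOT EXISTS main.demo.events_external
  USING DELTA
  LOCATION 's3://my-bucket/landing/events/'
""")

# A shallow clone copies only metadata, not data files, which is handy for
# dev/test copies; per the review, it behaves best on Standard-mode clusters.
spark.sql("""
  CREATE TABLE IF NOT EXISTS main.demo.events_dev
  SHALLOW CLONE main.demo.events_external
""")
```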
Top Databricks Data Intelligence Platform Alternatives
Peer Discussions
Databricks Data Intelligence Platform Reviews and Ratings
- Data & IT Security Manager, IT Services, <50M USD
Assistance from Databricks Resolved Technical Challenges in Data Engineering Workflow
I had previously done training several years ago and decided to use Databricks for some requirements in data engineering and data analysis. I was hitting a wall on an issue, so I reached out to a connection on LinkedIn who worked at Databricks to see if they could help me get some assistance. A few days later I was advised that he had found someone, and I was contacted officially by Databricks. We had a consultation on the issues I was having, and they then got straight to work helping me resolve them. Several days later the problem had been resolved, and I am now working with them on additional project problems for other work, determining best approaches and solving issues as they arise.
- Engineer, Services (non-Government), 50M-1B USD
Unity Catalog Simplifies Data Governance, Though Some Compute Mode Limitations Remain
The Unity Catalog environment enables efficient governance and permissions management. At the core, the Delta Tables are very efficient for housing big data, and leveraging UC workflow jobs has easily modernized our company's approach to creating and engaging with data.
- BI Developer, Energy and Utilities, 50M-1B USD
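As a rough illustration of the permissions management this reviewer credits to Unity Catalog, the snippet below grants read access to a group; the catalog, schema, table and group names are placeholders, not taken from the review.

```python
# Illustrative only: typical Unity Catalog grants for a reporting group.
# "main", "demo", "orders" and "data_analysts" are hypothetical names.
spark.sql("GRANT USE CATALOG ON CATALOG main TO `data_analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.demo TO `data_analysts`")
spark.sql("GRANT SELECT ON TABLE main.demo.orders TO `data_analysts`")
```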
Databricks ETL Pipelines UI Simplified, but Frequent Updates Can Feel Overwhelming
Regarding the recent updates, Databricks has simplified its ETL pipelines UI, which makes it easier to use. We have also recently started using Power BI tasks, which simplify the integration between our Data Lakes and downstream reports. Beyond that, I think its approach to centralized security through Unity Catalog is truly revolutionary and commendable.
- Sr Manager, Data Science, Retail, 10B+ USD
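To give a flavor of the kind of ETL pipeline this reviewer is describing, here is a minimal sketch using the Delta Live Tables Python API; the table names and landing path are assumptions, and the actual pipelines behind the review may look quite different.

```python
import dlt
from pyspark.sql import functions as F

# Bronze: ingest raw files from a (hypothetical) landing path.
@dlt.table(comment="Raw orders ingested from the landing zone")
def orders_bronze():
    return spark.read.format("json").load("s3://my-bucket/landing/orders/")

# Silver: cleaned and typed, ready for downstream Power BI reports.
@dlt.table(comment="Cleaned orders for reporting")
def orders_silver():
    return (
        dlt.read("orders_bronze")
        .where(F.col("amount") > 0)
        .withColumn("order_date", F.to_date("order_ts"))
    )
```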
High Computing Power and Versatility in Databricks' Spark Clusters
I use Databricks within the Azure cloud. The seamless integration of Databricks with other Azure components is a big plus. My team consists of data scientists, data engineers and ML engineers. The Spark clusters have delivered fast computing power to our big data environment, allowing us to implement multiple ML models and process billions of data points in reasonable execution times. Coding in PySpark was easy to learn, and promoting jobs from lower environments to production was easy to manage.
- Data Engineer, Software, <50M USD
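As a small sketch of the PySpark work described above (large-scale aggregation feeding ML jobs), the snippet below builds a simple per-user daily feature table; the table names are placeholders, not details from the review.

```python
from pyspark.sql import functions as F

# Hypothetical source table; in practice this would be a large Delta table.
events = spark.read.table("main.demo.clickstream")

# Aggregate billions of raw events into compact per-user daily features.
daily_features = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("user_id", "event_date")
    .agg(
        F.count("*").alias("events"),
        F.approx_count_distinct("session_id").alias("sessions"),
    )
)

# Write the result back as a Delta table for downstream ML training jobs.
daily_features.write.format("delta").mode("overwrite") \
    .saveAsTable("main.demo.daily_user_features")
```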
High Reliability and Collaboration Features Stand Out in Databricks Data Engineering
As a Data Engineer who has built a full data platform on Databricks, I find it highly reliable, scalable, and developer friendly. The tight integration with Spark, Delta Lake and orchestration tools makes it easy to build and maintain robust data pipelines.


