Resources and insights
Our Blog
Explore insights and practical tips on mastering the Databricks Data Intelligence Platform and the full spectrum of today's modern data ecosystem.
Managed Iceberg Tables
Learn when to choose Apache Iceberg over Delta tables in Databricks. A complete guide covering manifest files, CDC limitations, liquid clustering, and table properties, with practical examples.
Unity Catalog to Azure Key Vault: No more dbutils.secrets()
Learn how to securely connect Azure Databricks to Key Vault using Unity Catalog Service Credentials for enterprise-grade secret management and governance.
Stop ELT Headaches: Why We Partner with Fivetran + Databricks
Discover how SunnyData overcame ELT challenges by partnering with Fivetran and Databricks, creating reliable data pipelines that eliminate late-night fixes and accelerate insights.
Databricks AI Assistant: SQL Review
Databricks' AI Assistant is now generally available and excels at SQL tasks, leveraging Unity Catalog metadata. It consistently delivers working SQL queries and follows instructions well, outperforming alternatives like ChatGPT for SQL code generation. Watch the video for a closer look.
Elevating the Notebook Experience with Databricks' Latest Upgrade
Databricks' latest notebook upgrade offers superior design and performance, versatile language support, and improved user experience, making it a standout product for data analysis and exploration.
Databricks AI/BI Series: AI/BI Dashboards
Databricks is making strides in AI/BI Dashboards with enhanced data prep and intuitive UI. Discover its pros, cons, and future potential in our latest blog.
Navigating Data Governance with Unity Catalog: A Practical Exploration
This comprehensive guide delves into the essential features of Unity Catalog by Databricks, highlighting its role in enhancing security, automating data documentation, and streamlining ML and AI governance. Learn practical tips on integrating this powerful tool into your data strategy to boost productivity and ensure compliance. Whether you're scaling up or enhancing data discoverability, this article is your roadmap to leveraging data for innovation while maintaining robust security.
Unity Catalog and Enterprise Data Governance Tools: How Should They Fit in Your Stack?
In this blog, we address whether Unity Catalog can replace existing enterprise catalogs when integrated with Databricks. We clarify that while Unity Catalog excels at centralizing governance and enhancing data management within Databricks, it complements rather than replaces established catalogs like Alation or Collibra if they already add significant value.
With extensive experience in data solutions, our CEO, Kai, notes that Unity Catalog is indispensable for effectively managing permissions and access across data, ML, and AI assets. For broader governance needs, however, using it alongside other data catalogs ensures comprehensive management across all data systems.
Navigating Data Governance with Unity Catalog: Enhancing Security and Productivity
Unity Catalog from Databricks is revolutionizing how businesses manage their data, providing a unified governance platform that centralizes control over data and AI assets. It enhances productivity, bolsters security, and streamlines compliance by offering a single, searchable repository for all data assets.
The platform automates data documentation with Generative AI, easing the workload on data stewards and enriching data management with semantic searches and interactive visualizations. Additionally, Unity Catalog's Lakehouse Federation integrates data across multiple platforms, ensuring seamless data accessibility. Its advanced data lineage capabilities offer clear visibility into data movements, crucial for compliance and informed decision-making, making it a strategic asset for any data-driven organization.
Databricks Model Serving for end-to-end AI life-cycle management
In the evolving world of AI and ML, businesses demand efficient, secure ways to deploy and manage AI models. Databricks Model Serving offers a unified solution, enhancing security and streamlining integration. This platform ensures low-latency, scalable model deployment via a REST API, perfectly suited for web and client applications. It smartly scales to demand, using serverless computing to cut costs and improve response times, providing an effective, economical framework for enterprises navigating the complexities of AI model management.
What Is Photon in Databricks, and Why Should You Use It?
Photon is a native vectorized query engine, written in C++, that boosts performance by optimizing SQL and Spark queries, speeding up SQL workloads and cutting costs. This blog explains Photon's role in the Databricks platform so that, by the end, you'll understand why it matters.