The SunnyData Blog
Explore insights and practical tips on mastering the Databricks Data Intelligence Platform and the full spectrum of today's modern data ecosystem.
Cost allocation, cloud tags, and other relevant things on Databricks
In this three-part video series, Databricks MVP Josue Bogran and Greg Kroleski from Databricks’ “Money Team” discuss cost allocation, tagging best practices, cloud innovations, and performance enhancements. The series explores Databricks' efforts to optimize cost reporting, leverage AI for insights, and empower businesses to make smarter, growth-driven decisions in 2024 and beyond.
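As a small taste of the tagging practices the series covers, here is a minimal sketch of attaching cost-allocation tags to a cluster spec via the Clusters API's `custom_tags` field. The tag keys, team names, and node types below are illustrative assumptions, not recommendations from the video:

```python
def build_tagged_cluster_spec(team: str, cost_center: str, env: str) -> dict:
    """Return a Databricks cluster spec carrying cost-allocation tags.

    custom_tags propagate to the underlying cloud resources, so cloud
    billing exports and Databricks usage reports can group spend by tag.
    """
    return {
        "cluster_name": f"{team}-{env}-etl",
        "spark_version": "15.4.x-scala2.12",  # example DBR version
        "node_type_id": "i3.xlarge",          # example AWS node type
        "num_workers": 2,
        "custom_tags": {
            "team": team,
            "cost_center": cost_center,
            "env": env,
        },
    }

spec = build_tagged_cluster_spec("analytics", "cc-1234", "prod")
print(spec["custom_tags"]["cost_center"])  # cc-1234
```

Consistent tag keys across all workspaces are what make the downstream cost reports useful, which is a recurring theme of the discussion.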
5 Reasons Why We Recommend Databricks
Discover why Databricks stands out as the leading data platform in this insightful blog. From unified data management to cost efficiency, unmatched performance, and robust analytics, Josue Bogran explains the top 5 reasons Databricks excels in the competitive landscape. Learn how Databricks balances innovation, user-centric design, and industry versatility to deliver exceptional results.
Performance, Benchmarks, and Optimization Tips for Databricks Users
Josue Bogran interviews Jeremy Lewallen from Databricks’ Performance Team, exploring benchmarks, storage cost optimization, rightsizing SQL Serverless Compute, and common compute mistakes. Discover why Databricks continually enhances performance, tips for using the latest DBRs, and how their innovations provide a fast, efficient, and developer-friendly data platform.
Why startups should consider Databricks as a top choice for analytics, AI, and data management
Databricks is a top choice for startups seeking an all-in-one data platform for analytics, AI, and data management. Its cloud-native, scalable, and cost-efficient architecture allows startups to begin small, grow, and avoid complex migrations as needs evolve.
Why no one migrates from Databricks to Snowflake
This blog explores why companies are increasingly migrating from Snowflake to Databricks, highlighting Databricks’ integrated platform, cost-efficiency, and comprehensive data and AI capabilities. The post examines the pitfalls of Snowflake's pricing and its struggle to replicate Databricks' functionality.
Why Migrate to Databricks?
Migrating to Databricks offers companies unmatched benefits, including cost efficiency, scalability, and advanced AI/ML capabilities. With a unified platform and robust data governance, Databricks empowers businesses to modernize their data estates, ensuring they’re future-ready and positioned for growth.
Databricks Myths vs. My Own Personal Experience
Transitioning to Databricks reduced costs by $19K/month and streamlined data operations. Learn how Databricks' unified platform simplifies data engineering and boosts efficiency in our latest blog.
Evaluating Databricks' Cost Control Features: A Closer Look at Budgets and Cost Dashboard
Evaluating Databricks' cost control features reveals strengths in granular tracking and user-friendly budgeting, but highlights areas for improvement in automated alerts and comprehensive expense tracking. Explore our insights on Databricks' budgeting tools.
Winter Is So Over: Quick Guide on the 18 Big Announcements
Here we present the 18 biggest announcements from Databricks Summit 2024, including Generative AI tools, AI-powered BI dashboards, open-source Unity Catalog, no-code data ingestion, serverless compute, secure data collaboration, built-in AI functions for data warehousing, and the all-new Delta Lake 4.0. Discover how these advancements can unlock the full potential of your data. Get the details and see how SunnyData can help - read now!
Cost Saving Best Practices For Databricks Workflows
Discover how to manage pipeline costs effectively with Databricks Workflows. This article offers practical tips to reduce total cost of ownership without sacrificing performance, and provides insights into understanding your costs better. Learn strategies like using job compute and spot instances, setting warnings and timeouts, leveraging task dependencies, and implementing autoscaling and tagging. Optimize your resource usage and get the most out of your Databricks environment. Read on for actionable advice to streamline your data processes.
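The strategies above can be sketched as a single Jobs API-style settings payload. This is a hedged illustration under assumed names (notebook paths, tag values, and node types are placeholders); it combines a shared job cluster with autoscaling, spot instances with on-demand fallback, a run timeout, task dependencies, and tags:

```python
def build_job_settings(name: str, notebook_path: str) -> dict:
    """Sketch of Jobs API settings applying common cost-saving practices."""
    return {
        "name": name,
        "timeout_seconds": 3600,  # stop runaway runs after 1 hour
        "tags": {"team": "data-eng", "env": "prod"},  # illustrative tags
        "job_clusters": [{
            "job_cluster_key": "etl_cluster",  # shared across tasks
            "new_cluster": {
                "spark_version": "15.4.x-scala2.12",
                "node_type_id": "i3.xlarge",
                # autoscaling keeps the cluster rightsized for the workload
                "autoscale": {"min_workers": 1, "max_workers": 4},
                # spot instances with on-demand fallback cut compute cost
                "aws_attributes": {
                    "availability": "SPOT_WITH_FALLBACK",
                    "first_on_demand": 1,
                },
            },
        }],
        "tasks": [
            {
                "task_key": "ingest",
                "job_cluster_key": "etl_cluster",
                "notebook_task": {"notebook_path": notebook_path},
            },
            {
                "task_key": "transform",
                # dependency ensures transform only runs after ingest succeeds
                "depends_on": [{"task_key": "ingest"}],
                "job_cluster_key": "etl_cluster",
                "notebook_task": {"notebook_path": notebook_path},
            },
        ],
    }

settings = build_job_settings("daily-etl", "/Workspace/etl/main")
```

Reusing one job cluster across dependent tasks, rather than spinning up a cluster per task, is one of the simplest levers for lowering total cost of ownership.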
A Roadmap to a Successful AI Project: Planning, Execution & ROI
Disappointed by the high failure rate of AI initiatives? You're not alone. This post unveils the secrets to planning and executing a successful AI project. We detail our proven methodology for ensuring AI success, focusing on identifying high-ROI use cases and navigating the Proof-of-Concept stage effectively. Learn how to choose the right supplier, avoid common pitfalls, and ensure your AI project transitions smoothly from pilot to production, delivering real business results and a competitive edge.
Databricks Model Serving for end-to-end AI life-cycle management
In the evolving world of AI and ML, businesses demand efficient, secure ways to deploy and manage AI models. Databricks Model Serving offers a unified solution, enhancing security and streamlining integration. This platform ensures low-latency, scalable model deployment via a REST API, perfectly suited for web and client applications. It smartly scales to demand, using serverless computing to cut costs and improve response times, providing an effective, economical framework for enterprises navigating the complexities of AI model management.
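To make the REST API integration concrete, here is a minimal sketch of building the JSON body a client application would POST to a serving endpoint's invocations URL. The feature names are hypothetical; the `dataframe_records` wrapper is one of the input formats Model Serving endpoints accept:

```python
import json

def build_scoring_request(records: list) -> str:
    """Serialize input rows into a Model Serving scoring request body.

    The resulting JSON would be POSTed to
    /serving-endpoints/<endpoint-name>/invocations with a bearer token.
    """
    return json.dumps({"dataframe_records": records})

# Hypothetical feature rows for a deployed model.
body = build_scoring_request([{"feature_a": 1.0, "feature_b": "x"}])
```

Because the endpoint scales to zero under serverless compute, clients pay only while requests like this are actually being served.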
What is Photon in Databricks, and why should you use it?
Photon, a native C++ vectorized engine, boosts query performance by optimizing SQL and Spark queries, speeding up SQL workloads while cutting costs. By the end of this blog, you'll understand Photon's role in enhancing Databricks and why it matters.
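On clusters, Photon is toggled through the Clusters API's `runtime_engine` field rather than by changing any query code. The sketch below is a minimal illustration of that toggle (the input spec is a placeholder, not a complete cluster definition):

```python
def enable_photon(cluster_spec: dict) -> dict:
    """Return a copy of a cluster spec with the Photon engine enabled.

    The Clusters API accepts "PHOTON" or "STANDARD" for runtime_engine;
    queries themselves are unchanged, Photon accelerates them transparently.
    """
    spec = dict(cluster_spec)  # shallow copy; leave the original untouched
    spec["runtime_engine"] = "PHOTON"
    return spec

base = {"cluster_name": "etl", "runtime_engine": "STANDARD"}
photon = enable_photon(base)
```

Because Photon is a drop-in engine, flipping this one setting is usually the entire migration; the blog goes into when the speedup outweighs the higher DBU rate.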