Reaching New Heights with Databricks: A Feature Roundup for Data-Driven Success

AI & Machine Learning, Data & AI Strategy, Data Architecture, Data Engineering, Data Migrations, Databricks

Welcome to our latest Databricks New Feature Roundup! In this edition, we’re exploring some of the exciting new capabilities that Databricks has rolled out recently. From a revamped SQL Editor designed for better productivity and real-time collaboration, to cutting-edge integrations that bring more intelligence and automation to your analytics workflows, these updates aim to make your experience on Databricks more powerful, seamless, and cost-efficient. Dive in to discover how these new features can elevate your data and AI initiatives.

New SQL Editor Public Preview

Databricks has rolled out a new SQL Editor that’s currently in public preview, designed to help users interact with SQL-based data more effectively. The new editor introduces multiple features aimed at improving productivity, collaboration, and the user experience. Among the most notable updates are:

  • Multiple Statement Results: The ability to run multiple SQL queries at once and see their results side by side, which is helpful for comparison.
  • Real-Time Collaboration: Users can collaborate on SQL code, making teamwork more seamless when interacting with shared datasets.
  • Enhanced Integration with Databricks Assistant: The editor now supports AI-driven features, such as auto-generated filters and suggestions from the Databricks Assistant.
  • Command Palette and File Browser: These features streamline access to key functionalities and navigation, making it easier to interact with files stored in workspaces.
  • Editor Productivity Enhancements: Features like Quick Fix, code folding, and theme customization have been introduced to simplify debugging, improve readability, and allow for a more personalized coding environment.


These enhancements significantly elevate the experience of working with SQL within Databricks, offering not just better productivity but also ease of use for less technical stakeholders.

Learn more here

General Availability of Databricks Assistant Autocomplete

The Databricks Assistant Autocomplete has now reached general availability, supporting Python and SQL code on all major cloud platforms. This feature leverages AI-powered suggestions that are context-aware, providing developers with real-time code completions as they type.

This autocomplete functionality has been fine-tuned specifically for the Databricks ecosystem, and it comes with the benefit of being integrated directly with Unity Catalog. This ensures that all code suggestions respect the existing governance and security protocols of the platform, which is especially crucial when working with sensitive data or regulated environments. Additionally, the system uses retrieval-augmented generation, combining the strengths of an LLM with domain-specific retrieval to make suggestions not just accurate but also highly relevant.
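Databricks has not published the internals of Assistant Autocomplete, but the retrieval-augmented idea it describes can be sketched in miniature: retrieve the schema snippets most relevant to the code being typed, then hand them to the model alongside the completion prompt. Everything below (the snippet corpus, the word-overlap retriever) is a made-up toy, not the real system.

```python
# Toy illustration only: Databricks has not published Autocomplete's
# internals, so this sketches the general RAG pattern with made-up
# schema snippets and a naive word-overlap retriever.

schema_snippets = [
    "TABLE sales.orders (order_id INT, customer_id INT, amount DOUBLE)",
    "TABLE sales.customers (customer_id INT, region STRING)",
    "TABLE hr.employees (emp_id INT, name STRING, salary DOUBLE)",
]

def tokenize(text: str) -> set:
    """Split text into lowercase word tokens, ignoring punctuation."""
    for ch in ".,()":
        text = text.replace(ch, " ")
    return set(text.lower().split())

def retrieve(partial_code: str, corpus: list, top_k: int = 2) -> list:
    """Rank snippets by word overlap with the code being typed."""
    query = tokenize(partial_code)
    return sorted(corpus, key=lambda s: -len(query & tokenize(s)))[:top_k]

partial_code = "SELECT region, SUM(amount) FROM sales.orders JOIN sales.customers"
context = retrieve(partial_code, schema_snippets)

# The retrieved schema is prepended to the completion prompt.
prompt = "Relevant schema:\n" + "\n".join(context) + "\n\nComplete:\n" + partial_code
print(prompt)
```

In a production system the overlap heuristic would be replaced by embedding similarity, and the retrieval corpus would come from Unity Catalog metadata, which is what lets suggestions respect existing governance.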

Combined with context-awareness and Unity Catalog governance, Assistant Autocomplete gives both SQL analysts and Python developers a meaningful productivity boost without stepping outside the platform's security model.

Genie Space Enhancements: Benchmarks and Request Review

Databricks has added two features—Benchmarks and Request Review—to improve user confidence in the AI-driven insights provided by the Genie Space. The Benchmarks feature allows creators to establish a set of test questions or challenges for their Genie implementations. By doing so, they can continuously measure and validate the reliability of the outputs and make iterative improvements. This creates a structured method to ensure the quality and accuracy of AI responses.
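The benchmark workflow above can be sketched in miniature. Databricks runs this inside the Genie UI rather than as a public API, so the `ask_genie` stub and the questions below are purely hypothetical stand-ins for the real service; the point is the pattern of scoring known-good answers to track reliability over time.

```python
# Hypothetical sketch: ask_genie() stands in for a real Genie space;
# a benchmark is simply a set of questions with known-good answers.

def ask_genie(question: str) -> str:
    """Stand-in for a Genie space answering a natural-language question."""
    canned = {
        "How many orders shipped in Q3?": "1,204",
        "Which region had the highest revenue?": "EMEA",
        "What was the average order value?": "$87.50",
    }
    return canned.get(question, "I don't know")

benchmark = [
    ("How many orders shipped in Q3?", "1,204"),
    ("Which region had the highest revenue?", "EMEA"),
    ("What was the average order value?", "$87.50"),
    ("How many customers churned last month?", "312"),  # not answerable yet
]

# Score every question and compute an overall accuracy figure.
results = [(q, ask_genie(q) == expected) for q, expected in benchmark]
accuracy = sum(ok for _, ok in results) / len(results)

print(f"Benchmark accuracy: {accuracy:.0%}")
for question, ok in results:
    print(f"  {'PASS' if ok else 'FAIL'}: {question}")
```

Re-running a fixed benchmark like this after each change to the Genie space's instructions or data model is what turns "it seems to answer well" into a measurable quality signal.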

The Request Review feature, meanwhile, lets users who interact with AI-generated outputs ask for expert validation of the results. If users spot something they are unsure about, they can flag it for review by a human expert. This human-in-the-loop verification enhances the transparency and reliability of insights. Both features are about making Genie more trustworthy, involving stakeholders at multiple levels to verify the value of the AI.

Learn more here

General Availability of Power BI Integration with Unity Catalog

Databricks has announced the general availability of publishing directly from Unity Catalog to Microsoft Power BI. This feature simplifies analytics workflows, enabling users to publish reports built on their Unity Catalog data with minimal manual effort. Unity Catalog now supports schema-level synchronization to Microsoft Power BI, which means data relationships are kept intact throughout the publishing process.

In addition, the integration supports Microsoft Entra ID (formerly Azure AD) for authentication and Single Sign-On (SSO), offering a seamless experience while maintaining security and governance compliance. This makes it easier for organizations to bring governed data into their reporting environment, creating always up-to-date BI dashboards, without the friction that previously existed in manual dataset export/import processes.

Learn more here

Cost Attribution for Serverless: Budget Policies Public Preview

Cost management is often a complex challenge in multi-user environments, particularly when utilizing serverless compute. To address this, Databricks has introduced Budget Policies for serverless workloads, which are now in Public Preview. This feature aims to offer transparency and accountability in tracking serverless costs, allowing administrators to define policies that automatically tag serverless resources.

Through automatic tagging, organizations can now attribute costs to different departments, projects, or individual users, making it easier to understand where budgets are being spent and to assign chargebacks effectively. This helps not only in optimizing cost efficiency but also in establishing a culture of financial accountability among various teams. Budget policies work seamlessly with existing governance tools and provide alerts for budget overruns, enabling proactive cost control.
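In Databricks, the policy-applied tags land on usage records (surfaced through the `system.billing.usage` system table), and the attribution itself is just a group-by over those tags. The sketch below mimics that step with plain Python over invented sample records; the `$0.70` DBU price and the tag values are assumptions for illustration only.

```python
# Illustrative sketch: budget policies tag serverless usage automatically;
# attributing spend is then a simple aggregation over those tags. The
# records and the dbu_price below are invented for this example.
from collections import defaultdict

usage_records = [
    {"dbus": 12.0, "tags": {"team": "marketing", "project": "attribution"}},
    {"dbus": 30.5, "tags": {"team": "data-eng", "project": "ingest"}},
    {"dbus": 7.25, "tags": {"team": "marketing", "project": "campaigns"}},
    {"dbus": 18.0, "tags": {"team": "data-eng", "project": "ingest"}},
]

def attribute_costs(records, tag_key, dbu_price=0.70):
    """Sum estimated spend per value of a given tag key."""
    totals = defaultdict(float)
    for rec in records:
        owner = rec["tags"].get(tag_key, "untagged")  # catch untagged usage
        totals[owner] += rec["dbus"] * dbu_price
    return dict(totals)

by_team = attribute_costs(usage_records, "team")
print(by_team)
```

The same aggregation keyed on `"project"` instead of `"team"` yields per-project chargebacks, which is exactly the flexibility automatic tagging is meant to provide.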

Learn more here

Batch LLM Inference with Mosaic AI Model Serving

Databricks is introducing batch inference for large language models (LLMs) through Mosaic AI Model Serving, providing a new, scalable way for enterprises to run LLM inference workloads. This approach is designed to support large-scale AI tasks, such as extracting information across thousands of documents, generating bulk content, and transforming data in a unified manner.

A key aspect of this new batch LLM inference capability is its cost efficiency, as it allows auto-scaling of resources according to the workload. This means enterprises can optimize the use of computational resources and save on costs while achieving high throughput for batch tasks. The capability is exposed through a simple SQL interface, so teams can run inference without deep knowledge of specific deployment frameworks. Additionally, this feature reduces data movement, allowing data to remain within the governance boundaries of the Databricks ecosystem.
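The core pattern behind batch inference (chunk a large document set, send each chunk to the model, collect results in order) can be sketched locally. In Databricks the real invocation happens from SQL against a Mosaic AI Model Serving endpoint; here `fake_llm` is a hypothetical stand-in so the batching logic itself is runnable.

```python
# Conceptual sketch: fake_llm() replaces a real call to a model serving
# endpoint; batch_inference() shows the chunk-and-collect pattern that
# batch LLM inference applies at scale.
from typing import Callable, Iterable, List

def fake_llm(prompts: List[str]) -> List[str]:
    """Stand-in for one batched call to a model serving endpoint."""
    return [f"summary of: {p[:20]}" for p in prompts]

def batch_inference(docs: Iterable[str], call: Callable, batch_size: int = 2):
    """Chunk documents into batches and collect results in input order."""
    docs = list(docs)
    results = []
    for i in range(0, len(docs), batch_size):
        results.extend(call(docs[i : i + batch_size]))
    return results

documents = [f"document {n} body text ..." for n in range(5)]
outputs = batch_inference(documents, fake_llm, batch_size=2)
print(len(outputs), "summaries generated")
```

On the platform, the auto-scaling described above means the effective `batch_size` and parallelism are managed for you, which is where the throughput and cost benefits come from.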


Learn more here

Conclusion

With all these new features, Databricks continues to empower teams to do more with their data, from improved SQL workflows to advanced AI integrations and cost transparency tools. Whether it’s enhancing productivity, enabling real-time collaboration, or making AI more accessible, these updates underscore Databricks’ commitment to staying ahead of the curve.

If you’re ready to take advantage of these capabilities and see how they can transform your data strategies, our team is here to help. As a Databricks Consulting Partner, we specialize in guiding organizations through every step of their Databricks journey—from migration to optimization. Let us help you unlock the full potential of your data. Get in touch today to start your journey towards a more efficient, insightful, and data-driven future.