Platform
- The new unified schema browser lets you view all of the data in the Unity Catalog metastore without leaving a notebook or the SQL editor. For more information, see https://docs.databricks.com/notebooks/notebooks-code.html#browse-data
- Databricks Runtime 13.1 and 13.1 ML are now GA. For more information, see https://docs.databricks.com/release-notes/runtime/13.1.html
- Databricks Connect V2 is GA for Python. For more information, see https://docs.databricks.com/dev-tools/databricks-connect.html
- Home folders are now restored when users are re-added to a workspace. For more information, see https://docs.databricks.com/workspace/workspace-objects.html
- You can now grant Databricks users, service principals, and groups permission to use a service principal. This allows users to run jobs as the service principal instead of as the job owner’s identity; a sketch follows this list. For more information, see https://docs.databricks.com/workflows/jobs/create-run-jobs.html#run-as-sp
- The full-page workspace browser now includes Repos.
- You can now test your SSO configuration before enabling it for all users. For more information, see https://docs.databricks.com/administration-guide/account-settings-e2/single-sign-on/index.html
- Unified login is now GA for new accounts. Unified login allows you to manage one single sign-on (SSO) configuration in your account that is used for the account and every Databricks workspace. When SSO is enabled on your account, all workspaces, new and existing, use the account-level SSO configuration, and all users, including account and workspace admins, must sign in to Databricks using SSO.
- Account-level SCIM provisioning now deactivates users when they are deactivated in your identity provider. Previously, when a user was deactivated in an identity provider, account-level SCIM provisioning deleted them from the Databricks account.
- The Databricks command-line interface (Databricks CLI) has undergone a major overhaul. The new CLI covers all Databricks REST API operations and supports all Databricks authentication types. macOS and Linux users can install the new CLI with Homebrew; Windows is also supported. For more information, see https://docs.databricks.com/dev-tools/cli/databricks-cli.html
- The Databricks SDK for Python is now available as a Beta. The Databricks SDK for Python enables you to automate Databricks accounts, workspaces, and resources by running Python code; a sketch follows this list. For more information, see https://docs.databricks.com/dev-tools/sdk-python.html
- The Databricks SDK for Go is now available as a Beta. The Databricks SDK for Go enables you to automate Databricks accounts, workspaces, and resources by running Go code. For more information, see https://docs.databricks.com/dev-tools/sdk-go.html
- Databricks Runtime 13.2 and Databricks Runtime 13.2 ML are now available as Beta releases.
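For the run-as change above, here is a minimal sketch of pointing an existing job at a service principal through the Jobs 2.1 REST API. The job ID and service principal application ID are hypothetical placeholders, and the caller must already have been granted permission to use that service principal:

```python
# Update a job so scheduled runs execute as a service principal rather than
# as the job owner. Assumes DATABRICKS_HOST and DATABRICKS_TOKEN are set.
import os
import requests

resp = requests.post(
    f"{os.environ['DATABRICKS_HOST']}/api/2.1/jobs/update",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    json={
        "job_id": 123,  # placeholder job ID
        "new_settings": {
            # The service principal's application ID (placeholder value).
            "run_as": {"service_principal_name": "6cff135b-0000-0000-0000-aaaaaaaaaaaa"}
        },
    },
)
resp.raise_for_status()
```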
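And for the Python SDK Beta, a short sketch that lists the clusters in a workspace. It assumes `pip install databricks-sdk` and default environment-variable authentication; as a Beta, the API surface may still change:

```python
# List clusters using the (Beta) Databricks SDK for Python.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up DATABRICKS_HOST / DATABRICKS_TOKEN
for cluster in w.clusters.list():
    print(cluster.cluster_name, cluster.state)
```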
Delta Lake (DBR 13.2 required)
- Delta Lake Universal Format (UniForm) allows you to read Delta tables with Iceberg clients; a sketch follows this list. For more information, see https://docs.databricks.com/delta/uniform.html
- Delta Lake liquid clustering replaces table partitioning and ZORDER to simplify data layout decisions and optimize query performance; a sketch follows this list. For more information, see https://docs.databricks.com/delta/clustering.html
- Archival support for Delta Lake lets you use cloud lifecycle policies on the object storage that holds your Delta tables to move files to archival storage tiers; a sketch follows this list. For more information, see https://docs.databricks.com/optimizations/archive-delta.html
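A minimal UniForm sketch, run from a notebook on DBR 13.2+. The table name is a placeholder, and any additional prerequisites (such as column mapping) are covered in the linked docs:

```python
# Create a Delta table that also writes Iceberg metadata,
# so Iceberg clients can read it.
spark.sql("""
    CREATE TABLE main.default.uniform_demo (id BIGINT, name STRING)
    TBLPROPERTIES ('delta.universalFormat.enabledFormats' = 'iceberg')
""")
```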
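A liquid clustering sketch, again with placeholder names. CLUSTER BY takes the place of PARTITIONED BY and ZORDER, and OPTIMIZE incrementally clusters newly written data:

```python
# Create a table with liquid clustering instead of partitioning/ZORDER.
spark.sql("""
    CREATE TABLE main.default.events (event_id BIGINT, ts TIMESTAMP, country STRING)
    CLUSTER BY (country, ts)
""")

# Trigger clustering of newly written data.
spark.sql("OPTIMIZE main.default.events")
```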
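And an archival-support sketch, assuming the table property described in the linked docs. The interval must match the lifecycle policy you configured on the storage container, and the table name is a placeholder:

```python
# Declare that files older than 180 days have been moved to an archive tier
# by a cloud lifecycle policy, so Delta can avoid reading them eagerly.
spark.sql("""
    ALTER TABLE main.default.events
    SET TBLPROPERTIES ('delta.timeUntilArchived' = '180 days')
""")
```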
Governance
- You can now upgrade MLflow Model Registry workflows to govern models through Unity Catalog; a sketch follows this list. Unity Catalog provides centralized access control, auditing, lineage, model sharing across workspaces, and better MLOps deployment workflows. Databricks recommends using Models in Unity Catalog instead of the existing workspace model registry, which will be deprecated in the future.
- System tables are a Databricks-hosted analytical store of your account’s operational data, giving you easily accessed, account-wide observability data; a query sketch follows this list. For more information, see https://docs.databricks.com/administration-guide/system-tables/index.html
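For Models in Unity Catalog, a minimal registration sketch; the run ID and the three-level model name are placeholders:

```python
# Point the MLflow client at Unity Catalog, then register a logged model
# under a <catalog>.<schema>.<model> name.
import mlflow

mlflow.set_registry_uri("databricks-uc")
mlflow.register_model(
    model_uri="runs:/<run_id>/model",  # placeholder run ID
    name="main.default.my_model",      # placeholder three-level name
)
```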
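And a system-tables sketch. It assumes the billing schema has been enabled for your account and that system.billing.usage exposes the usage columns described in the linked docs:

```python
# Summarize DBU usage by day and SKU from the billing system table.
df = spark.sql("""
    SELECT usage_date, sku_name, SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    GROUP BY usage_date, sku_name
    ORDER BY usage_date
""")
display(df)
```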
Data Management
- You can now see data quality metrics for pipelines running in continuous mode when you view dataset details in the Delta Live Tables UI. Previously, data quality metrics were displayed only for triggered pipelines. For more information, see https://docs.databricks.com/delta-live-tables/observability.html#dataset-details
- Pipelines that use Unity Catalog can now write to catalogs that have a custom storage location. When a location is specified, data is persisted in the catalog storage location; otherwise, it is persisted in the metastore root location. A sketch follows this list. For more information, see https://docs.databricks.com/delta-live-tables/unity-catalog.html
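A sketch of the catalog side of this: create a catalog with a custom managed location (the path and catalog name are placeholders), then set it as a Unity Catalog pipeline’s target catalog so pipeline data lands under that path:

```python
# Catalogs created with MANAGED LOCATION store their managed data under
# that path instead of the metastore root location.
spark.sql("""
    CREATE CATALOG IF NOT EXISTS pipelines_catalog
    MANAGED LOCATION 's3://my-bucket/pipelines'
""")
```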
Marketplace
- New providers have published listings in Databricks Marketplace: IQVIA, Morningstar, Vaisala, Sucali, Veritas, Collectors Data Store, Tego Cyber, and Ribbon Health.
- Data providers can now create and update their own profiles once their provider application is approved. For more information, see https://docs.databricks.com/marketplace/get-started-provider.html#profile
- Consumers can now uninstall data products using the UI. For more information, see https://docs.databricks.com/marketplace/manage-requests-consumer.html#delete
- Private exchanges enable data providers to share data products with a select group of invited customers. For more information, see https://docs.databricks.com/marketplace/private-exchange.html
Databricks SQL
- SQL tasks in Workflows are now generally available. You can orchestrate Queries, Dashboards, and Alerts from the Workflows page.
- A new schema browser is now in Public Preview, featuring an updated UX, a For You tab, and improved filters. The schema browser is available in Databricks SQL, Data Explorer, and notebooks.
- New SQL built-in functions are available, such as array_prepend(array, elem), try_aes_decrypt(expr, key [, mode [, padding]]), and sql_keywords(); examples follow this list.
- You can now use shallow clone to create new Unity Catalog managed tables from existing Unity Catalog managed tables; a sketch follows this list.
- You can now use CLONE and CONVERT TO DELTA with Iceberg tables that have partitions defined on truncated columns of types int, long, and string. Truncated columns of type decimal are not supported.
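A few of the new built-ins in action; try_aes_decrypt returns NULL instead of raising an error when decryption fails, and sql_keywords() is a table-valued function:

```python
# array_prepend adds an element to the front of an array: [0, 1, 2, 3]
spark.sql("SELECT array_prepend(array(1, 2, 3), 0)").show()

# try_aes_decrypt yields NULL for undecryptable input (16-byte key shown).
spark.sql("SELECT try_aes_decrypt(X'00112233', 'abcdefghijklmnop')").show()

# sql_keywords() lists SQL keywords and whether each is reserved.
spark.sql("SELECT * FROM sql_keywords() LIMIT 5").show()
```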
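And a shallow clone sketch with placeholder table names; the clone references the source table’s data files rather than copying them:

```python
# Create a new Unity Catalog managed table as a shallow clone of another.
spark.sql("""
    CREATE TABLE main.default.sales_dev
    SHALLOW CLONE main.default.sales
""")
```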