

Enable Hive metastore federation for a legacy workspace Hive metastore

Important

This feature is in Public Preview.

This article shows how to federate your legacy Azure Databricks Hive metastore so that your organization can work with your Hive metastore tables using Unity Catalog.

For an overview of Hive metastore federation, see Hive metastore federation: enable Unity Catalog to govern tables registered in a Hive metastore.

Before you begin

Review the list of services and features supported by Hive metastore federation: Requirements, supported features, and limitations.

Specific requirements are listed for each step below.

Step 1: Connect Unity Catalog to your Hive metastore

In this step, you create a connection, a Unity Catalog securable object that specifies a path and credentials for accessing a database system, in this case your Azure Databricks workspace’s legacy Hive metastore.

Requirements

  • The CREATE CONNECTION privilege on the Unity Catalog metastore.

Create the connection

A connection specifies a path and credentials for accessing an external database system. To create a connection, you can use Catalog Explorer or the CREATE CONNECTION SQL command in an Azure Databricks notebook or the Databricks SQL query editor.

Note

You can also use the Databricks REST API or the Databricks CLI to create a connection. See POST /api/2.1/unity-catalog/connections and Unity Catalog commands.
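If you script this step, the REST request body is small. The following Python sketch builds a payload for `POST /api/2.1/unity-catalog/connections` that mirrors the SQL example below (`TYPE hive_metastore`, `OPTIONS (builtin true)`). The connection name and comment are hypothetical placeholders, and sending the request (with a token in the `Authorization` header) is left to your HTTP client:

```python
import json

# Request body for POST /api/2.1/unity-catalog/connections.
# Field values mirror the SQL example: TYPE hive_metastore, OPTIONS (builtin true).
payload = {
    "name": "legacy-hms-connection",  # hypothetical connection name
    "connection_type": "HIVE_METASTORE",
    "options": {"builtin": "true"},   # internal (workspace-local) Hive metastore
    "comment": "Federation connection for the legacy workspace Hive metastore",
}

# Serialize for use with any HTTP client; authentication is assumed, not shown.
body = json.dumps(payload)
print(body)
```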

Permissions required: Metastore admin or user with the CREATE CONNECTION privilege.

Catalog Explorer

  1. In your Azure Databricks workspace, click Catalog icon Catalog.
  2. On the Quick access page, click Add data > Add a connection.
  3. Enter a user-friendly Connection name.
  4. Select a Connection type of Hive Metastore and Metastore type of Internal.
  5. (Optional) Add a comment.
  6. Click Create.

SQL

Run the following command in a notebook or the Databricks SQL query editor.

CREATE CONNECTION IF NOT EXISTS <connection-name> TYPE hive_metastore
  OPTIONS (builtin true);

Step 2: Create external locations for data in your internal legacy Hive metastore

In this step, you configure an external location in Unity Catalog to govern access to the cloud storage locations that hold the data registered in your legacy internal Hive metastore.

External locations are Unity Catalog securable objects that associate storage credentials with cloud storage container paths. See External locations and storage credentials.

Requirements

For cloud storage and Azure Databricks permission requirements, see “Before you begin” in Create an external location to connect cloud storage to Azure Databricks.

If you are creating an external location for your workspace’s legacy DBFS root, see instead “Before you begin” in Create an external location for data in DBFS root.

Options for creating the external location

The process that Databricks recommends for creating an external location in Unity Catalog depends on your situation:

Enable fallback mode on external locations

As soon as you create an external location in Unity Catalog, access to the path represented by that external location is enforced by Unity Catalog permissions when you run queries on Unity Catalog-enabled compute (single user, shared, or SQL warehouse). This can interrupt existing workloads that don’t have the correct Unity Catalog permissions to access the path.

When an external location is in fallback mode, the system first checks the querying principal’s Unity Catalog permissions on the location, and if that doesn’t succeed, falls back to using existing cluster- or notebook-scoped credentials, such as instance profiles or Apache Spark configuration properties, so that your existing workloads continue to run uninterrupted.

Fallback mode is convenient when you are in the process of migrating your legacy workload. Once you’ve updated your workloads to run successfully using Unity Catalog permissions, you should disable fallback mode to prevent legacy cluster-scoped credentials from being used to bypass Unity Catalog data governance.
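The decision described above can be modeled roughly as follows. This is an illustrative sketch only; the function and variable names are invented and do not reflect the actual platform implementation:

```python
def resolve_access(uc_permission_granted: bool,
                   fallback_enabled: bool,
                   legacy_credentials_work: bool) -> str:
    """Illustrative model of access resolution for a path under an external location."""
    # Unity Catalog permissions on the external location are always checked first.
    if uc_permission_granted:
        return "allowed-via-unity-catalog"
    # Only when fallback mode is on does the request fall through to
    # cluster- or notebook-scoped credentials (e.g. Spark config properties).
    if fallback_enabled and legacy_credentials_work:
        return "allowed-via-legacy-credentials"
    return "denied"

# A workload without UC grants keeps running while fallback mode is on...
print(resolve_access(False, True, True))   # allowed-via-legacy-credentials
# ...and is blocked once fallback mode is disabled after migration.
print(resolve_access(False, False, True))  # denied
```

This also shows why disabling fallback mode after migration matters: as long as it is on, legacy cluster-scoped credentials can satisfy requests that Unity Catalog would deny.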

You can enable fallback mode using Catalog Explorer or the Unity Catalog external locations REST API.

Permissions required: Owner of the external location.

Catalog Explorer

  1. In your Azure Databricks workspace, click Catalog icon Catalog.
  2. On the Quick access page, click External data >.
  3. Select the external location you want to update.
  4. Turn on the Fallback mode toggle and click Enable to confirm.

API

The following curl examples show how to enable fallback mode when you create an external location and when you update an existing external location.

Creating a new external location:

curl -X POST \
-H 'Authorization: Bearer <token>' \
-H 'Content-Type: application/json' \
https://<workspace-URL>/api/2.1/unity-catalog/external-locations \
--data '{
  "name": "fallback_mode_enabled_external_location",
  "url": "abfss://container-name@storage-account.dfs.core.windows.net/external_location_container/url",
  "credential_name": "external_location_credential",
  "fallback": true,
  "skip_validation": true
}'

Updating an external location:

curl -X PATCH \
-H 'Authorization: Bearer <token>' \
-H 'Content-Type: application/json' \
https://<workspace-URL>/api/2.1/unity-catalog/external-locations/<external-location-name> \
--data '{
  "comment": "fallback mode enabled",
  "fallback": true
}'

Step 3: Create a federated catalog

In this step, you use the connection that you created in Step 1 to create a federated catalog in Unity Catalog that points to the external location you created in Step 2. A federated catalog is a type of foreign catalog: a securable object in Unity Catalog that mirrors a database or catalog in an external data system, enabling you to query that data in your Azure Databricks workspace with access managed by Unity Catalog. In this case, the mirrored catalog represents the data registered in your Hive metastore.

Any time a user or workflow interacts with the federated catalog, metadata is synced from the Hive metastore.

Requirements

Permission requirements:

To create the foreign catalog:

  • The CREATE CATALOG privilege on your Unity Catalog metastore.
  • Either ownership of the connection or the CREATE FOREIGN CATALOG privilege on the connection.
  • To enter authorized paths for the foreign catalog, you must have the CREATE FOREIGN SECURABLE privilege on an external location that covers those paths. The owner of the external location has this privilege by default.

To work with the foreign catalog:

  • Ownership of the catalog or the USE CATALOG privilege on the catalog.

Compute requirements:

  • To create the catalog using Catalog Explorer: no compute required.
  • To create the catalog using SQL: Databricks Runtime 13.3 LTS or above.
  • To work with the catalog: a shared cluster on Databricks Runtime 13.3 LTS, 14.3 LTS, 15.1 or above.

Create the foreign catalog

To create a foreign catalog, you can use Catalog Explorer or the CREATE FOREIGN CATALOG SQL command in an Azure Databricks notebook or the SQL query editor.

Note

You can also use the Unity Catalog API. See Create a catalog in the Databricks REST API reference.

  1. In your Azure Databricks workspace, click Catalog icon Catalog to open Catalog Explorer.

  2. On the Quick access page, click the Add data button and select Add a catalog.

  3. Enter a Catalog name and select a catalog Type of Foreign.

  4. Select the Connection that you created in Step 1 from the drop-down.

  5. In the Authorized paths field, enter paths to the cloud storage locations that you defined as external locations in Step 2. For example, abfss://container@storageaccount.dfs.core.windows.net/demo,abfss://container@storageaccount.dfs.core.windows.net/depts/finance.

    Authorized paths are an added layer of security for federated catalogs only. See What are authorized paths?.

  6. Click Create.

  7. (Optional) Click Configure to open a wizard that walks you through granting permissions on the catalog and adding tags. You can also perform these steps later.

    See Manage privileges in Unity Catalog and Apply tags to Unity Catalog securable objects.

  8. (Optional) Bind the catalog to specific workspaces.

    By default, catalogs can be accessed from any workspace attached to the Unity Catalog metastore (restricted by user privileges). If you want to allow access only from specific workspaces, go to the Workspaces tab and assign workspaces. See Limit catalog access to specific workspaces.

  9. Populate the federated catalog with the Hive metastore metadata.

    Any time a user or workflow interacts with the federated catalog, metadata is synced from the Hive metastore. The first interaction populates the catalog in Unity Catalog and makes its contents visible in the Catalog Explorer UI. You can populate the catalog by selecting and starting a supported compute resource in Catalog Explorer. You must be the catalog owner (which you are by virtue of creating the catalog) or a user with the USE CATALOG privilege.

    Run the following SQL command in a notebook or the SQL query editor. Items in brackets are optional. Replace the placeholder values:

    • <catalog-name>: Name for the catalog in Azure Databricks.
    • <connection-name>: The name of the connection object that you created in Step 1.
    • <path1>,<path2>: Paths to the cloud storage locations that you defined as external locations in Step 2. For example, abfss://container@storageaccount.dfs.core.windows.net/demo,abfss://container@storageaccount.dfs.core.windows.net/depts/finance. Authorized paths are an added layer of security for federated catalogs only. See What are authorized paths?.
    CREATE FOREIGN CATALOG [IF NOT EXISTS] <catalog-name> USING CONNECTION <connection-name>
    OPTIONS (authorized_paths '<path1>,<path2>');
    

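Authorized paths act as an allowlist of storage prefixes for the federated catalog. A rough model of the check, assuming simple URI-prefix semantics (this is an illustration, not the actual enforcement code):

```python
def is_authorized(table_path: str, authorized_paths: list[str]) -> bool:
    """Return True if table_path falls under one of the authorized path prefixes."""
    return any(
        table_path == p or table_path.startswith(p.rstrip("/") + "/")
        for p in authorized_paths
    )

# The two example paths from the CREATE FOREIGN CATALOG step.
authorized = [
    "abfss://container@storageaccount.dfs.core.windows.net/demo",
    "abfss://container@storageaccount.dfs.core.windows.net/depts/finance",
]

# A table under an authorized prefix is reachable through the catalog...
print(is_authorized(
    "abfss://container@storageaccount.dfs.core.windows.net/depts/finance/q1",
    authorized))  # True
# ...while a sibling directory outside the allowlist is not.
print(is_authorized(
    "abfss://container@storageaccount.dfs.core.windows.net/depts/hr",
    authorized))  # False
```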

See also Manage and work with foreign catalogs.