Use Unity Catalog service credentials to connect to external cloud services

Important

This feature is in Public Preview.

This article describes how to use a service credential in Unity Catalog to connect to external cloud services. A service credential object in Unity Catalog encapsulates a long-term cloud credential that provides access to an external cloud service that users need to connect to from Azure Databricks.

Before you begin

Before you can use a service credential to connect to an external cloud service, you must have:

  • An Azure Databricks workspace that is enabled for Unity Catalog.
  • A compute resource running Databricks Runtime 15.4 LTS or above. SQL warehouses are not supported.
  • A service credential created in your Unity Catalog metastore that gives access to the cloud service.
  • The ACCESS privilege on the service credential or ownership of the service credential (see the grant sketch following this list).
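
If you need to grant the ACCESS privilege, the credential owner can do so in SQL. The following is a minimal sketch, assuming a service credential named your-service-credential and a hypothetical group data-engineers; replace both with your own names:

# Run from a notebook cell; the GRANT statement is Unity Catalog SQL
spark.sql(
    "GRANT ACCESS ON SERVICE CREDENTIAL `your-service-credential` "
    "TO `data-engineers`"
)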

Use a service credential in your code

This section provides an example of using a service credential in a notebook. Only Python is supported during the Public Preview. Replace the placeholder values with your own.

Example: configure an Azure SDK client to use a specific service credential

from azure.keyvault.secrets import SecretClient # example Azure SDK client

# Returns an Azure SDK-compatible credential object backed by the named
# Unity Catalog service credential
credential = dbutils.credentials.getServiceCredentialsProvider('your-service-credential')
vault_url = "https://your-keyvault-name.vault.azure.net/"
client = SecretClient(vault_url=vault_url, credential=credential)
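
Once the client is configured, it behaves like any other Azure SDK client. For example, the following continues the snippet above to read a secret; the secret name example-secret is a hypothetical placeholder:

# Continues the example above; 'example-secret' must exist in your key vault
retrieved_secret = client.get_secret("example-secret")
print(retrieved_secret.name)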

Specify a default service credential for a compute resource

You can optionally specify a default service credential for an all-purpose or jobs compute cluster by setting an environment variable. If your code provides no explicit authentication, the Azure SDK falls back to that service credential. Users still require the ACCESS privilege on the service credential to connect to the external cloud service. Databricks does not recommend this approach, because it makes your code less portable than naming the service credential directly in your code.

Note

Serverless compute and SQL warehouses don’t support environment variables, and therefore they don’t support default service credentials.

  1. Open the edit page for the cluster.

    See Manage compute.

  2. Click Advanced options at the bottom of the page and go to the Spark tab.

  3. Add the following entry in Environment variables, replacing <your-service-credential>:

    DATABRICKS_DEFAULT_SERVICE_CREDENTIAL_NAME=<your-service-credential>
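
After the cluster restarts, you can optionally confirm that the variable is visible to your notebook. This check is a suggestion, not part of the required configuration:

import os

# Prints the default service credential name, or None if the variable is unset
print(os.environ.get("DATABRICKS_DEFAULT_SERVICE_CREDENTIAL_NAME"))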

The following code sample does not specify a service credential. Instead, it uses the service credential specified in the DATABRICKS_DEFAULT_SERVICE_CREDENTIAL_NAME environment variable:

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

vault_url = "https://your-keyvault-name.vault.azure.net/"
client = SecretClient(vault_url=vault_url, credential=DefaultAzureCredential())

Compare this to Example: configure an Azure SDK client to use a specific service credential, which does not import DefaultAzureCredential and instead names the service credential explicitly:

credential = dbutils.credentials.getServiceCredentialsProvider('your-service-credential')