Tutorial: Create and use a Databricks secret

In this tutorial, you use Databricks secrets to set up JDBC credentials for connecting to an Azure Data Lake Storage account.

Step 1: Create a secret scope

Create a secret scope called jdbc.

databricks secrets create-scope jdbc

To create an Azure Key Vault-backed secret scope, follow the instructions in Manage secret scopes.

Step 2: Add secrets to the secret scope

Add the secrets username and password. Run the following commands. For each command, enter the secret value in the editor that opens.

databricks secrets put-secret jdbc username
databricks secrets put-secret jdbc password

Step 3: Use the secrets in a notebook

Use the dbutils.secrets utility to access secrets in notebooks.

The following example reads the secrets that are stored in the secret scope jdbc to configure a JDBC read operation:

Python

username = dbutils.secrets.get(scope = "jdbc", key = "username")
password = dbutils.secrets.get(scope = "jdbc", key = "password")

df = (spark.read
  .format("jdbc")
  .option("url", "<jdbc-url>")
  .option("dbtable", "<table-name>")
  .option("user", username)
  .option("password", password)
  .load()
)

Scala

val username = dbutils.secrets.get(scope = "jdbc", key = "username")
val password = dbutils.secrets.get(scope = "jdbc", key = "password")

val df = spark.read
  .format("jdbc")
  .option("url", "<jdbc-url>")
  .option("dbtable", "<table-name>")
  .option("user", username)
  .option("password", password)
  .load()
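
The <jdbc-url> placeholder depends on your database and JDBC driver. As an illustrative sketch only (assuming a SQL Server-style endpoint; the host and database names below are hypothetical, and the URL format varies by driver), the URL can be assembled from its parts:

Python

# Illustrative only: build a SQL Server-style JDBC URL.
# The host and database values are hypothetical placeholders.
def build_jdbc_url(host: str, database: str, port: int = 1433) -> str:
    """Assemble a jdbc:sqlserver URL; consult your driver's docs for the exact format."""
    return f"jdbc:sqlserver://{host}:{port};database={database}"

jdbc_url = build_jdbc_url("myserver.database.windows.net", "mydb")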

The values fetched from the scope are redacted from the notebook output. See Secret redaction.
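
Conceptually, redaction replaces any literal occurrence of a secret value in command output with a placeholder. A minimal toy sketch of the idea (this is not Databricks' actual implementation):

Python

# Toy sketch: replace each secret value appearing in output with [REDACTED].
def redact(output: str, secret_values: list[str]) -> str:
    for value in secret_values:
        output = output.replace(value, "[REDACTED]")
    return output

redact("connecting as hunter2", ["hunter2"])
# returns "connecting as [REDACTED]"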

Step 4: Grant a group permissions on the secret scope

Note

This step requires the Premium plan.

After verifying that the credentials were configured correctly, you can grant permissions on the secret scope to other users and groups in your workspace.

Grant the datascience group the READ permission on the secret scope:

databricks secrets put-acl jdbc datascience READ
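
Secret ACL permission levels form a hierarchy: MANAGE allows reading, writing, and managing ACLs; WRITE allows reading and writing; READ allows reading only. A small sketch of that relationship (illustrative, not an API):

Python

# Permission levels for secret ACLs, ordered from least to most privileged.
LEVELS = ["READ", "WRITE", "MANAGE"]

def allows(granted: str, required: str) -> bool:
    """A granted level covers every level at or below it in the hierarchy."""
    return LEVELS.index(granted) >= LEVELS.index(required)

allows("MANAGE", "READ")   # a MANAGE grant also permits reading
allows("READ", "WRITE")    # a READ grant does not permit writing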

For more information about secret access control, see Secret ACLs.