Configuration profiles for the Databricks CLI
Note
This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview.
Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.
This article describes how to use the Databricks CLI with configuration profiles. It assumes that you have already installed the Databricks CLI and created a Databricks configuration profiles file. See Install or update the Databricks CLI and Azure Databricks configuration profiles.
Get information about configuration profiles
Adding multiple configuration profiles to the .databrickscfg file enables you to quickly run commands across various workspaces by specifying the target configuration profile’s name in the command’s --profile or -p flag. If no profile is specified, the DEFAULT configuration profile is used.
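For example, either of the following forms runs the spark-versions command (shown later in this article) against a specific profile; <configuration-profile-name> is a placeholder for a profile name from your .databrickscfg file:
databricks clusters spark-versions --profile <configuration-profile-name>
databricks clusters spark-versions -p <configuration-profile-name>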
Tip
You can press Tab after --profile or -p to display a list of existing available configuration profiles from which to choose.
For example, you could have a configuration profile named DEV that references an Azure Databricks workspace that you use for development workloads and a separate configuration profile named PROD that references a different Azure Databricks workspace that you use for production workloads.
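As a sketch only, a .databrickscfg file with those two profiles and personal access token authentication might look like the following; the host URLs and token values are placeholders, not real values:
[DEV]
host  = https://adb-1111111111111111.1.azuredatabricks.net
token = <your-dev-personal-access-token>

[PROD]
host  = https://adb-2222222222222222.2.azuredatabricks.net
token = <your-prod-personal-access-token>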
By default, the Databricks CLI looks for the .databrickscfg file in your ~ (your user home) folder on Unix, Linux, or macOS, or your %USERPROFILE% (your user home) folder on Windows. To change the default path of the .databrickscfg file, set the environment variable DATABRICKS_CONFIG_FILE to a different path. See .databrickscfg-specific environment variables and fields.
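For example, if your configuration file lives at a non-default location (the paths below are illustrations only), you could set the environment variable as follows:
# Unix, Linux, or macOS
export DATABRICKS_CONFIG_FILE=/path/to/.databrickscfg

# Windows Command Prompt (applies to new sessions)
setx DATABRICKS_CONFIG_FILE "C:\path\to\.databrickscfg"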
To get information about an existing configuration profile, run the auth env command:
databricks auth env --profile <configuration-profile-name>
# Or:
databricks auth env --host <account-console-url>
# Or:
databricks auth env --host <workspace-url>
For example, here is the output for a profile that is configured with Azure Databricks personal access token authentication:
{
  "env": {
    "DATABRICKS_AUTH_TYPE": "pat",
    "DATABRICKS_CONFIG_PROFILE": "DEFAULT",
    "DATABRICKS_HOST": "https://dbc-a1b2345c-d6e7.cloud.databricks.com",
    "DATABRICKS_TOKEN": "dapi123..."
  }
}
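The env map corresponds to Databricks client unified authentication environment variables, so one possible use is exporting them into your current shell. A minimal sketch, assuming a Unix shell and that the jq utility is installed (the values are not quoted, so this assumes they contain no spaces):
# Turn the JSON env map into export statements and evaluate them
eval "$(databricks auth env --profile DEFAULT | jq -r '.env | to_entries[] | "export \(.key)=\(.value)"')"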
To get information about all available profiles, run the auth profiles command:
databricks auth profiles
Name         Host                                            Valid
DEFAULT      https://dbc-a1b2345c-d6e7.cloud.databricks.com  YES
Development  https://dbc-a1b2345c-d6e7.cloud.databricks.com  YES
Staging      https://dbc-a1b2345c-d6e7.cloud.databricks.com  YES
Production   https://dbc-a1b2345c-d6e7.cloud.databricks.com  YES
The output of the auth profiles command does not display any access tokens. To display an access token, run the preceding auth env command.
Important
The Databricks CLI does not work with a .netrc file. You can have a .netrc file in your environment for other purposes, but the Databricks CLI will not use that .netrc file.
Test your configuration profiles
To test your configuration profiles and verify that you have set up authentication correctly, run a command that connects to a workspace.
If you don’t specify a profile, the default profile is used. For example, the following command lists the available Databricks Runtime versions for the Azure Databricks workspace that is associated with your DEFAULT profile.
Note
This command assumes that you do not have any environment variables set that take precedence over the settings in your DEFAULT profile. For more information, see Authentication order of evaluation.
databricks clusters spark-versions
To verify a specific configuration profile, provide the profile name using the -p flag.
databricks clusters spark-versions -p PROD