Databricks Driver for SQLTools for Visual Studio Code

Important

This feature is in Public Preview.

The Databricks Driver for SQLTools enables you to use the SQLTools extension for Visual Studio Code to browse SQL objects and to run SQL queries in remote Azure Databricks workspaces.

Before you begin

Before you can use the Databricks Driver for SQLTools, your Azure Databricks workspace and your local development machine must meet the following requirements.

Workspace requirements

You must have at least one Azure Databricks workspace available, and the workspace must meet the following requirements:

Local development machine requirements

You must have the following on your local development machine:

  • Visual Studio Code version 1.70 or higher. To view your installed version, click Code > About Visual Studio Code on the main menu on macOS, or Help > About on Linux and Windows. To download, install, and configure Visual Studio Code, see Setting up Visual Studio Code.
  • The SQLTools extension for Visual Studio Code.
  • The Databricks Driver for SQLTools extension for Visual Studio Code.

To install the SQLTools extension, go to SQLTools and then click Install, or:

  1. In Visual Studio Code, click View > Extensions on the main menu.

  2. In the Search Extensions in Marketplace box, enter SQLTools.

  3. Click the SQLTools entry from Matheus Teixeira.

    Note

    There might be multiple SQLTools entries listed. Be sure to click the entry from Matheus Teixeira.

  4. Click Install.

To install the Databricks Driver for SQLTools extension, go to Databricks Driver for SQLTools and then click Install, or:

  1. In Visual Studio Code, click View > Extensions on the main menu.
  2. In the Search Extensions in Marketplace box, enter Databricks Driver for SQLTools.
  3. Click the Databricks Driver for SQLTools entry.
  4. Click Install.
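If you prefer the command line, the `code` CLI can install both extensions. The extension IDs below are assumptions based on the Marketplace listings; verify them on each extension's Marketplace page before relying on them:

```shell
# Install the SQLTools extension (publisher: Matheus Teixeira)
code --install-extension mtxr.sqltools

# Install the Databricks Driver for SQLTools extension
code --install-extension databricks.sqltools-databricks-driver
```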

Authentication

You must set up authentication for the Databricks Driver for SQLTools as follows.

The Databricks Driver for SQLTools supports the following Azure Databricks authentication types:

Note

The Databricks Driver for SQLTools does not support Microsoft Entra ID tokens.

Azure Databricks personal access token authentication

To use the Databricks Driver for SQLTools with Azure Databricks personal access token authentication, you must have an Azure Databricks personal access token. To create a personal access token, follow the steps in Azure Databricks personal access tokens for workspace users.

Azure Databricks OAuth machine-to-machine (M2M) authentication

You can use Azure Databricks OAuth machine-to-machine (M2M) authentication to authenticate with the Databricks Driver for SQLTools, as follows:

Note

Azure Databricks OAuth M2M authentication is available in Databricks Driver for SQLTools versions 0.4.2 and above.

  1. Complete the configuration steps for OAuth M2M authentication. See OAuth machine-to-machine (M2M) authentication.
  2. Create an Azure Databricks configuration profile with your OAuth M2M authentication configuration settings. See the “Config” section of OAuth machine-to-machine (M2M) authentication.
  3. Install and open the Databricks extension for Visual Studio Code on your local development machine.
  4. In the Databricks extension for Visual Studio Code, click the Configure button in the Configuration pane. If the Configure button is not displayed, click the gear (Configure workspace) icon instead.
  5. In the Command Palette, for Databricks Host, enter your Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net, and then press Enter.
  6. Select the configuration profile entry that matches the one that you created in step 2.
  7. Complete the on-screen instructions in your web browser to finish authenticating with your Azure Databricks account.
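The configuration profile from step 2 lives in your `.databrickscfg` file (by default in your home directory). A minimal sketch, assuming you have already created a service principal with an application ID and an OAuth secret (the profile name and placeholder values are hypothetical):

```ini
[sqltools-m2m]
host          = https://adb-1234567890123456.7.azuredatabricks.net
client_id     = <service-principal-application-id>
client_secret = <service-principal-oauth-secret>
```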

Azure Databricks OAuth user-to-machine (U2M) authentication

You can use Azure Databricks OAuth user-to-machine (U2M) authentication to authenticate with the Databricks Driver for SQLTools, as follows:

Note

Azure Databricks OAuth U2M authentication is available in Databricks Driver for SQLTools versions 0.4.2 and above.

  1. Install and open the Databricks extension for Visual Studio Code on your local development machine.
  2. In the Databricks extension for Visual Studio Code, click the Configure button in the Configuration pane. If the Configure button is not displayed, click the gear (Configure workspace) icon instead.
  3. In the Command Palette, for Databricks Host, enter your Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net. Then press Enter.
  4. Select OAuth (user to machine).
  5. Complete the on-screen instructions in your web browser to finish authenticating with your Azure Databricks account. If prompted, allow all-apis access.

Azure CLI authentication

You can use the Azure CLI to authenticate with the Databricks Driver for SQLTools, as follows:

Note

Authenticating with the Azure CLI is in an Experimental feature state. This feature is available in Databricks Driver for SQLTools versions 0.4.2 and above.

  1. Install the Azure CLI on your local development machine, if you have not done so already.
  2. Install and open the Databricks extension for Visual Studio Code on your local development machine.
  3. In the Databricks extension for Visual Studio Code, click the Configure button in the Configuration pane. If the Configure button is not displayed, click the gear (Configure workspace) icon instead.
  4. In the Command Palette, for Databricks Host, enter your Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net. Then press Enter.
  5. Select Azure CLI.
  6. Follow the on-screen prompts to finish authenticating with the Azure CLI.
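Before selecting Azure CLI in step 5, make sure you are signed in with the Azure CLI. A quick sketch:

```shell
# Sign in to Azure (opens a browser window)
az login

# Confirm which account and subscription are active
az account show
```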

Connect to a schema

  1. In Visual Studio Code, on the sidebar, click the SQLTools icon.
  2. In the SQLTools view, if this is your first time using the SQLTools extension, click Add New Connection within the Connections pane. Otherwise, click the Add New Connection icon in the pane’s title bar.
  3. On the SQLTools Settings tab, for the Select a database driver step, click the Databricks icon.
  4. For the Connection Settings step, enter the following information about your warehouse, catalog, and schema:
    1. For Connection name, enter some unique name for this connection.

    2. (Optional) For Connection group, enter the name of an existing connection group to add the new connection to that group. Or, enter a unique name to create a new connection group with the new connection. Connection groups make it easier to find connections in the extension.

    3. For Connect using, select one of the following:

      • To use an Azure Databricks personal access token for authentication, select Hostname and Token.
      • For Databricks Driver for SQLTools versions 0.4.2 and above, to use OAuth U2M or M2M or Azure CLI authentication, select VS Code extension (beta).
    4. If you selected Hostname and Token for Connect using, then for Host, enter the warehouse’s Server hostname setting. To get a warehouse’s Server hostname setting, see Get connection details for an Azure Databricks compute resource.

    5. For Path, enter the warehouse’s or cluster’s HTTP path setting. To get a warehouse’s HTTP path setting, see Get connection details for an Azure Databricks compute resource.

    6. If you selected Hostname and Token for Connect using, enter your Azure Databricks personal access token value in Token.

    7. For Catalog, enter the name of your catalog.

      Note

      For workspaces that are not enabled for Unity Catalog, you can leave Catalog blank to use the default value of hive_metastore.

    8. For Schema, enter the name of your schema.

    9. (Optional) For Show records default limit, leave the default of 50 to show only up to the first 50 rows for each query, or enter a different limit.

  5. Click Test Connection.
  6. If the connection test succeeds, click Save Connection.
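SQLTools stores saved connections in your VS Code settings under `sqltools.connections`. The following sketch shows roughly what a saved connection using Hostname and Token might look like; the exact field names depend on the driver version, so treat every key and value here as an assumption and prefer the Connection Settings form described above:

```json
{
  "sqltools.connections": [
    {
      "name": "my-warehouse-connection",
      "driver": "Databricks",
      "host": "adb-1234567890123456.7.azuredatabricks.net",
      "path": "/sql/1.0/warehouses/1234567890abcdef",
      "token": "<personal-access-token>",
      "catalog": "main",
      "schema": "default",
      "previewLimit": 50
    }
  ]
}
```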

Change a connection’s settings

This procedure assumes that you have successfully connected to at least one warehouse.

  1. If the SQLTools view is not visible, then in Visual Studio Code, on the sidebar, click the SQLTools icon.
  2. In the Connections pane, expand the connection group, if one exists for your target connection.
  3. Right-click the connection, and click Edit Connection.
  4. Change the target settings.
  5. Click Test Connection.
  6. If the connection test succeeds, click Save Connection.

Browse a schema’s objects

  1. In the Connections pane, expand the connection group, if one exists for your target connection.
  2. Double-click or expand the target connection for your warehouse.
  3. Expand the target database (schema), if one exists for your connection.
  4. Expand Tables or Views, if one or more tables or views exist for your database (schema).
  5. Expand any target table or view to view the table’s or view’s columns.

View the rows or schema for a table or view

With Tables or Views expanded in the Connections pane, do one of the following:

  • To show the table’s or view’s rows, right-click the table or view, and click Show Table Records or Show View Records.
  • To show the table’s or view’s schema, right-click the table or view, and click Describe Table or Describe View.
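These commands roughly correspond to the following SQL, using hypothetical `my_catalog.my_schema.my_table` names (the row limit for Show Table Records comes from the connection's Show records default limit setting):

```sql
-- Show Table Records (first 50 rows by default)
SELECT * FROM my_catalog.my_schema.my_table LIMIT 50;

-- Describe Table (column names, types, and comments)
DESCRIBE TABLE my_catalog.my_schema.my_table;
```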

Generate an insert query for a table

  1. Place your cursor in an existing editor at the location where you want the insert query to be added.
  2. With Tables expanded in the Connections pane, right-click the table, and click Generate Insert Query. The insert query’s definition is added at the cursor’s insertion point.

Create and run a query

This procedure assumes that you have successfully connected to at least one warehouse.

  1. In the Connections pane, expand the connection group, if one exists for your target connection.
  2. Double-click or expand the target connection for your warehouse.
  3. With the connection selected, click New SQL File in the Connections pane’s title bar. A new editor tab appears.
  4. Enter your SQL query in the new editor.
  5. To run the SQL query, click Run on active connection in the editor. The query’s results display in a new editor tab.
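For example, you could enter a simple aggregation such as the following, which assumes the `samples.nyctaxi.trips` dataset that is available in many workspaces (substitute a table from your own catalog and schema if it is not):

```sql
-- Average fare by rounded trip distance
SELECT ROUND(trip_distance) AS distance_miles,
       AVG(fare_amount)     AS avg_fare
FROM samples.nyctaxi.trips
GROUP BY ROUND(trip_distance)
ORDER BY distance_miles
LIMIT 10;
```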

Run an existing query

This procedure assumes that you have successfully connected to at least one warehouse.

  1. In the Connections pane, expand the connection group, if one exists for your target connection.
  2. Double-click or expand the target connection for your warehouse.
  3. With the connection selected, open any file with the file extension of .sql, or select any group of contiguous SQL statements in any editor that was previously opened.
  4. To run the SQL query from an open .sql file, with your .sql file’s contents displayed in the editor, click Run on active connection in the editor. The query’s results display in a new editor tab.
  5. To run a selected group of contiguous SQL statements in an editor that was previously opened, right-click your selection, and then click Run Selected Query. The query’s results display in a new editor tab.

Send usage logs to Databricks

If you encounter issues while using the Databricks Driver for SQLTools, you can send usage logs and related information to Databricks Support by doing the following:

  1. Install the Databricks extension for Visual Studio Code on your local development machine.
  2. Turn on logging by checking the Logs: Enabled setting, or setting databricks.logs.enabled to true, as described in Settings for the Databricks extension for Visual Studio Code. Be sure to restart Visual Studio Code after you turn on logging.
  3. Attempt to reproduce your issue.
  4. From the Command Palette (View > Command Palette from the main menu), run the Databricks: Open full logs command.
  5. Send the Databricks Logs.log, databricks-cli-logs.json, and sdk-and-extension-logs.json files that appear to Databricks Support.
  6. Also copy the contents of the Terminal (View > Terminal) in the context of the issue, and send this content to Databricks Support.

The Output view (View > Output, Databricks Logs) shows truncated information if Logs: Enabled is checked or databricks.logs.enabled is set to true. To show more information, change the following settings, as described in Settings for the Databricks extension for Visual Studio Code:

  • Logs: Max Array Length or databricks.logs.maxArrayLength
  • Logs: Max Field Length or databricks.logs.maxFieldLength
  • Logs: Truncation Depth or databricks.logs.truncationDepth
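In settings.json form, the logging settings above look like the following sketch (the values shown are examples only; pick limits that suit how much detail you need):

```json
{
  "databricks.logs.enabled": true,
  "databricks.logs.maxArrayLength": 8,
  "databricks.logs.maxFieldLength": 120,
  "databricks.logs.truncationDepth": 4
}
```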

Additional resources