You can use the Databricks CLI to export queries (saved as notebooks) from one workspace and import them into another, which lets you automate the process without calling the REST APIs directly.
# Install Databricks CLI
pip install databricks-cli
# Configure the CLI with your current workspace
databricks configure --token
# Export queries to notebooks
databricks workspace export_dir /Queries /path/to/local/directory
# Re-configure the CLI to point at the new workspace (enter its URL and token when prompted)
databricks configure --token
# Import notebooks to the new workspace
databricks workspace import_dir /path/to/local/directory /Queries
You can also back up your entire Databricks workspace, including queries, by exporting the whole workspace tree to a backup location (for example, a cloud storage mount) and then importing it into the new workspace.
# Export the entire workspace to a backup location
# (export_dir writes to a local path; /dbfs/mnt/backup works when DBFS is mounted
#  locally, otherwise export to a local directory and copy it to cloud storage)
databricks workspace export_dir / /dbfs/mnt/backup
# Restore the workspace from the backup into the new workspace
databricks workspace import_dir /dbfs/mnt/backup /
If security constraints allow, you can use the Databricks REST API to programmatically export and import queries. This approach is the easiest to automate at scale, but it requires API access to both workspaces.
import requests

# Source and target workspace API endpoints and tokens
src_api = "https://<databricks-instance>/api/2.0/workspace"
dst_api = "https://<new-databricks-instance>/api/2.0/workspace"
src_headers = {"Authorization": "Bearer <your-token>"}
dst_headers = {"Authorization": "Bearer <new-workspace-token>"}

# List the objects stored under /Queries in the source workspace
objects = requests.get(f"{src_api}/list", headers=src_headers,
                       params={"path": "/Queries"}).json().get("objects", [])

for obj in objects:
    if obj.get("object_type") == "DIRECTORY":
        continue  # only export individual items, not sub-folders
    # Export each object as base64-encoded source
    exported = requests.get(f"{src_api}/export", headers=src_headers,
                            params={"path": obj["path"], "format": "SOURCE"}).json()
    # Import it into the new workspace at the same path
    requests.post(f"{dst_api}/import", headers=dst_headers,
                  json={"path": obj["path"], "format": "SOURCE",
                        "language": obj.get("language", "SQL"),
                        "content": exported["content"], "overwrite": True})
If your queries are tied to Delta tables, you can use Delta Sharing to share the underlying data between workspaces. This addresses data sharing rather than query migration, but it is useful when your queries depend on specific datasets.
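For example, once the source workspace has created a share and granted a recipient (for instance with CREATE SHARE and CREATE RECIPIENT in Databricks SQL), the consuming side can read the shared tables with the open-source delta-sharing connector. The sketch below assumes a downloaded profile file at /path/to/config.share and a hypothetical table my_share.my_schema.my_table:
import delta_sharing

# Profile file downloaded from the share provider (hypothetical path)
profile = "/path/to/config.share"

# List every table exposed through the share
client = delta_sharing.SharingClient(profile)
print(client.list_all_tables())

# Load one shared table into a pandas DataFrame (hypothetical share/schema/table)
df = delta_sharing.load_as_pandas(profile + "#my_share.my_schema.my_table")
print(df.head())
Install the connector with pip install delta-sharing; the same URL format also works with delta_sharing.load_as_spark when running inside a Databricks notebook.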