Is there any way to export/download queries in order to move them from one subscription to another?

Kohdai Kurihara 20 Reputation points
2025-03-03T07:53:14.71+00:00

I have been using Azure Databricks on one subscription and have queries in a workspace. Now I want to move them to another Databricks workspace in a different subscription. Is there any way for me to export or download the queries?

My current solution is to copy my queries into a notebook and paste them into the new workspace. I am looking for a more efficient method that doesn't use APIs (for security reasons).

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

Accepted answer
  1. Amira Bedhiafi 29,481 Reputation points
    2025-03-03T12:47:54.51+00:00

You can use the Databricks CLI to export queries to notebooks, which lets you automate the process without calling the APIs directly:

    
    # Install the Databricks CLI
    pip install databricks-cli

    # Configure the CLI with your current workspace
    databricks configure --token

    # Export queries to notebooks
    databricks workspace export_dir /Queries /path/to/local/directory

    # Configure the CLI with the new workspace
    databricks configure --token

    # Import notebooks to the new workspace
    databricks workspace import_dir /path/to/local/directory /Queries
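
    If you would rather not re-run databricks configure between the export and import steps, the legacy CLI also supports named connection profiles. Here is a minimal sketch; the profile names source and target are placeholders:

    # Configure one profile per workspace (one-time setup)
    databricks configure --token --profile source
    databricks configure --token --profile target

    # Export from the source and import into the target without reconfiguring
    databricks workspace export_dir /Queries /path/to/local/directory --profile source
    databricks workspace import_dir /path/to/local/directory /Queries --profile target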

    You can also back up your entire Databricks workspace, including queries, by exporting the workspace to a cloud storage location and then importing it into the new workspace:

    
    # Back up the entire workspace to a DBFS mount backed by cloud storage
    # (the target path must be visible to the machine running the CLI)
    databricks workspace export_dir / /dbfs/mnt/backup

    # Restore the workspace from the backup into the new workspace
    databricks workspace import_dir /dbfs/mnt/backup /

    If your security constraints allow it, you can use the Databricks REST API to export and import queries programmatically. This is the most flexible method, but it does require API access.

    
    import requests

    # Set up the export endpoint and token for the source workspace.
    # Note: a workspace directory can only be exported as a DBC archive,
    # not in SOURCE format.
    url = "https://<databricks-instance>/api/2.0/workspace/export"
    headers = {"Authorization": "Bearer <your-token>"}
    params = {"path": "/Queries", "format": "DBC"}

    # Export the /Queries folder
    response = requests.get(url, headers=headers, params=params)
    response.raise_for_status()
    content = response.json()["content"]  # base64-encoded DBC archive

    # Import the archive into the new workspace
    url = "https://<new-databricks-instance>/api/2.0/workspace/import"
    new_headers = {"Authorization": "Bearer <new-workspace-token>"}
    payload = {"path": "/Queries", "format": "DBC", "content": content}
    requests.post(url, headers=new_headers, json=payload).raise_for_status()

    If your queries are tied to Delta tables, you can use Delta Sharing to share data between workspaces. This addresses the data rather than the queries themselves, but it can be useful when your queries depend on specific datasets.
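
    For a concrete starting point, here is a minimal Delta Sharing sketch, run from a Databricks notebook (where spark is predefined). It assumes Unity Catalog with Delta Sharing enabled, and every name below (share, table, recipient, metastore ID) is a placeholder:

    # In the source (provider) workspace: create a share, add the table your
    # queries depend on, and grant access to the target metastore.
    spark.sql("CREATE SHARE IF NOT EXISTS my_queries_share")
    spark.sql("ALTER SHARE my_queries_share ADD TABLE main.sales.orders")
    spark.sql("CREATE RECIPIENT IF NOT EXISTS target_workspace "
              "USING ID 'azure:<region>:<target-metastore-uuid>'")
    spark.sql("GRANT SELECT ON SHARE my_queries_share TO RECIPIENT target_workspace")

    # In the target (recipient) workspace, mount the share as a catalog:
    # spark.sql("CREATE CATALOG shared_data USING SHARE <provider>.my_queries_share")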

