Spark config through Databricks

Mxxx 0 Reputation points
2024-11-20T06:56:00.0466667+00:00

Hi, I am trying to access a storage account through Databricks (via a notebook) using a storage access key.

I keep encountering an error when I call spark.conf.set:

spark.conf.set(
    f"fs.azure.account.key.{storage_account_name}.blob.core.windows.net",
    access_key
)

I have tried several variations of the spark.conf.set call, but I keep encountering the same error message:

Configuration fs.azure.account.key.{storage_account_name}.blob.core.windows.net is not available. SQLSTATE: 42K0I

May I know what could be the issue?
I am using serverless compute.

Thank you.


1 answer

  1. Deepanshukatara-6769 10,765 Reputation points
    2024-11-20T07:01:10.53+00:00

    Hello, welcome to MS Q&A.

    To access a storage account through Databricks using a storage access key, you can set a Spark property in your notebook or cluster configuration. For example:

    # Retrieve the access key from a Databricks secret scope instead of
    # hard-coding it, then register it for the account's ABFS (dfs) endpoint.
    spark.conf.set(
        "fs.azure.account.key.<storage-account>.dfs.core.windows.net",
        dbutils.secrets.get(scope="<scope>", key="<storage-account-access-key>")
    )

    Once configured, you can interact with resources in the storage account using abfss:// URIs. The abfss driver (the dfs.core.windows.net endpoint used above) is recommended over the older wasbs driver for better security.

    Note that the error you are seeing (SQLSTATE 42K0I, "Configuration ... is not available") usually means the compute does not permit that configuration to be set. Serverless compute restricts most Spark configurations, including per-account access keys, so on serverless the recommended approach is to grant access through Unity Catalog (for example, an external location backed by a storage credential) rather than setting the key in the session.
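    As a quick illustration, here is a minimal sketch of a read once the key is set on a classic (non-serverless) cluster; the container, account, and file names below are hypothetical placeholders, not values from your environment:

    # Minimal sketch: read a CSV file over the ABFS driver.
    # <container>, <storage-account>, and the path are placeholders.
    df = spark.read.format("csv").option("header", "true").load(
        "abfss://<container>@<storage-account>.dfs.core.windows.net/path/to/file.csv"
    )
    df.show(5)

    On serverless compute, the equivalent read would typically go through a Unity Catalog governed path instead (for example, a volume such as /Volumes/<catalog>/<schema>/<volume>/...) that a workspace admin has configured.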


    Kindly accept the answer if it resolves your issue.

    Please let us know if you have any questions.

    Thanks

    Deepanshu

