Hello, welcome to MS Q&A!
To access an Azure storage account from Databricks using a storage account access key, you can set a Spark property in your Databricks notebook or cluster configuration. Here's how to do it:
spark.conf.set(
    "fs.azure.account.key.<storage-account>.dfs.core.windows.net",
    dbutils.secrets.get(scope="<scope>", key="<storage-account-access-key>")
)
Once configured, you can interact with resources in the storage account using URIs. The abfss:// driver (Azure Blob File System Secure) is recommended because it communicates with the storage account over TLS.
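As a minimal sketch, an abfss:// URI follows the pattern below (the container, account, and path names here are placeholders, not values from your environment):

```python
def abfss_uri(container: str, account: str, path: str) -> str:
    """Build an abfss:// URI for a path in an ADLS Gen2 storage account."""
    # abfss:// addresses Azure Data Lake Storage Gen2 over TLS
    return f"abfss://{container}@{account}.dfs.core.windows.net/{path}"

# Example with placeholder names:
uri = abfss_uri("raw", "mystorageacct", "sales/2024/data.csv")
print(uri)  # abfss://raw@mystorageacct.dfs.core.windows.net/sales/2024/data.csv

# Inside a Databricks notebook, you could then read the file, e.g.:
# df = spark.read.csv(uri, header=True)
```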
Kindly accept the answer if it resolves your issue, and please let us know if you have any questions.
Thanks,
Deepanshu