Configure Delta storage credentials
Note
To configure Delta storage credentials, see Configure access to cloud object storage for Azure Databricks. Databricks no longer recommends passing storage credentials through DataFrame options as described in this article.
Azure Databricks stores data for Delta Lake tables in cloud object storage. Configuring access to cloud object storage requires permissions within the cloud account that contains your storage account.
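The linked article covers the recommended, compute-level approaches. As a lighter-weight illustration, you can also set an account key once for the whole Spark session so individual reads and writes do not need per-operation credential options. The following is a minimal sketch; the secret scope name (storage-scope) and key name (account-key) are hypothetical placeholders, and dbutils.secrets is available in Databricks notebooks.
Python
# Sketch: configure the account key once per Spark session instead of per DataFrame.
# "storage-scope" and "account-key" are hypothetical secret names.
spark.conf.set(
  "fs.azure.account.key.<storage-account-name>.dfs.core.windows.net",
  dbutils.secrets.get(scope="storage-scope", key="account-key")
)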
Pass storage credentials as DataFrame options
Delta Lake supports specifying storage credentials as options for DataFrameReader and DataFrameWriter. You might use this approach if you need to interact with data in several storage accounts governed by different access keys.
Note
This feature is available in Databricks Runtime 10.4 LTS and above.
For example, you can pass your storage credentials through DataFrame options:
Python
# Read data using two different storage access keys.
df1 = spark.read \
  .option("fs.azure.account.key.<storage-account-name>.dfs.core.windows.net", "<storage-account-access-key-1>") \
  .load("...")

df2 = spark.read \
  .option("fs.azure.account.key.<storage-account-name>.dfs.core.windows.net", "<storage-account-access-key-2>") \
  .load("...")

# Write the combined result using a third access key.
df1.union(df2).write \
  .mode("overwrite") \
  .option("fs.azure.account.key.<storage-account-name>.dfs.core.windows.net", "<storage-account-access-key-3>") \
  .save("...")
Scala
// Read data using two different storage access keys.
val df1 = spark.read
  .option("fs.azure.account.key.<storage-account-name>.dfs.core.windows.net", "<storage-account-access-key-1>")
  .load("...")

val df2 = spark.read
  .option("fs.azure.account.key.<storage-account-name>.dfs.core.windows.net", "<storage-account-access-key-2>")
  .load("...")

// Write the combined result using a third access key.
df1.union(df2).write
  .mode("overwrite")
  .option("fs.azure.account.key.<storage-account-name>.dfs.core.windows.net", "<storage-account-access-key-3>")
  .save("...")