How to disable serverless for Azure Databricks notebooks
I want to restrict serverless compute in notebooks in my dev workspace and disable the feature, but I cannot find the option to disable it.
How to allow Databricks to only read from one storage account and read/write from a different storage account?
RBAC setup: Under the same subscription I have resource group 1 and resource group 2. There is a Databricks instance created in resource group 1 in which a service principal was created. Within the resource group 1 storage account the service principal…
Retrieving data from Azure storage account archive tier through Azure Databricks
Hi Community, we have recently been studying how to implement data retention in Azure. We are using Databricks with a Storage Account as our data lake. We are thinking of using Databricks to extract part of the data from the delta table and write it to the…
How to connect an on-prem Tableau Server to Azure Databricks
Trying to establish a connection between an on-prem Tableau Server and Azure Databricks. Do we have to create an App registration for the Tableau Server?
"Your Role: Global Administrator and 1 other roles" does not work with Databricks
Hi, under Roles and Administrators I can see "Your Role: Global Administrator and 1 other roles". But from Databricks I cannot get to Manage Account from the email menu at the top right, which means I am not being treated as Global Administrator. What is the problem? Seems…
Queries regarding Serverless compute in Azure Databricks
We have received a service health advisory from Microsoft with ID "PTC3-9V8" regarding the Serverless compute transition on Azure Databricks by 24 March 2025. After going through the advisory, we have a few queries that we want to…
How can I prohibit ordinary users from executing notebooks in the Shared folder in Azure Databricks?
Hello, how can I prohibit ordinary users from creating, modifying, and executing notebooks in the Shared folder in Azure Databricks? For security reasons, I want to prevent ordinary users from accidentally sharing the results of SELECTs on sensitive data in the Shared…
Error "Invalid configuration value detected for fs.azure.account.key" when listing files stored in an Azure Storage account using "dbutils.fs.ls"
Hi, I want to get a list of files stored in an Azure Storage account using "dbutils.fs.ls" command in Databricks. But, I get the following error. Failure to initialize configuration for storage account AAAAAA.dfs.core.windows.net: Invalid…
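This error usually means no valid account key (or other auth setting) was configured for that specific storage account before calling `dbutils.fs.ls`. A minimal sketch of the expected config key, with a hypothetical account name and secret scope (the masked account name from the question is left as-is):

```python
# Sketch: build the Spark config key Databricks expects for an ADLS Gen2
# account key. "mystorageacct" and the secret scope/key names are hypothetical.
def account_key_conf(account: str) -> str:
    """Config key for authenticating to one ADLS Gen2 account with its key."""
    return f"fs.azure.account.key.{account}.dfs.core.windows.net"

conf_key = account_key_conf("mystorageacct")
# On a cluster you would then set the real key (ideally from a secret scope):
# spark.conf.set(conf_key, dbutils.secrets.get(scope="kv-scope", key="storage-key"))
# dbutils.fs.ls("abfss://container@mystorageacct.dfs.core.windows.net/")
print(conf_key)
```

If the config key is misspelled, scoped to the wrong account name, or its value is empty, the driver raises exactly this "Failure to initialize configuration" error.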
Azure Databricks Too Many Requests errors
We are getting many errors when loading notebooks, and now also when running jobs on clusters, because Databricks says it has too many requests. For example, we get the below error message: run failed with error message Cluster '0724-103023-f2llqh3p' was…
Unity Catalog is not enabled Please contact your workspace administrator to enable Unity Catalog to use Genie.
Unity Catalog is not enabled. Please contact your workspace administrator to enable Unity Catalog to use Genie. I cannot find how to check this.
Error when creating a table in a Unity Catalog enabled cluster on version 14.3
I am trying to create a table in a schema using dbutils.fs.ls("abfss://****@storageaccount.dfs.core.windows.net/fullpath/") in a Unity Catalog enabled cluster on version 14.3, but I keep getting this error message: "AnalysisException:…
Databricks Simba Spark ODBC .NET8 C# Driver Parameters in SQL Queries
Hello, I'm using Simba ODBC driver v2.8.0 in order to query data from my Azure Databricks SQL warehouse into a .NET 8 ASP.NET API app. The ODBC driver works fine using a plain text query, but I need to parameterize the query. Searching around I found that it…
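The ODBC layer uses positional `?` parameter markers rather than named parameters, and that convention carries over to `OdbcCommand` parameters in .NET. A sketch of the idea in Python/pyodbc terms, with hypothetical table and column names, including a guard that the marker count matches the parameter count:

```python
# Sketch: positional ODBC parameter binding. Table/column names and the
# DSN are hypothetical placeholders.
def check_placeholders(query: str, params: tuple) -> None:
    """Guard: the number of '?' markers must equal the parameter count,
    since ODBC binds parameters strictly by position."""
    if query.count("?") != len(params):
        raise ValueError("placeholder/parameter count mismatch")

query = "SELECT id, amount FROM sales WHERE region = ? AND amount > ?"
params = ("EMEA", 100)
check_placeholders(query, params)
# On a live connection you would then run:
# import pyodbc
# conn = pyodbc.connect("DSN=DatabricksSQL")  # hypothetical DSN
# rows = conn.cursor().execute(query, params).fetchall()
```

In C# the equivalent is adding `OdbcParameter` objects to the command in the same order as the `?` markers appear in the SQL text.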
Basic SKU public IP retirement Databricks query
With the retirement announcement that Basic SKU public IP addresses will be retired on 30 September 2025, is there any change that needs to be made to Azure Databricks workers? Currently we have a number of workers that auto-scale, all of which use Basic SKU.…
Automate Lineage into azure purview from Azure Databricks Unity Catalog
Hi Team, in my last question someone from the Microsoft team suggested creating lineage tables separately and scanning the Databricks Unity Catalog source, after which the lineage would automatically come into Azure Purview. But when I tried this solution, I'm not able to get…
Does Databricks permit an enterprise to maintain multiple Databricks accounts within a single region?
Does Databricks permit an enterprise to maintain multiple Databricks accounts within a single region in Azure? We need two isolated Unity Catalog metastores in the same region.
How to set up secrets in Azure Databricks Cluster configs?
Hi there, I am currently working on setting up Spark configurations within our Databricks cluster to access Azure Data Lake Storage (ADLS) using OAuth and a service principal. Our goal is to configure these settings at the cluster level, so that we can…
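For cluster-level ADLS access with OAuth and a service principal, the standard approach is to put the five `fs.azure.account.*` settings into the cluster's Spark config and reference secrets via the `{{secrets/<scope>/<key>}}` syntax so no credential appears in plain text. A sketch with hypothetical account, scope, and tenant values:

```python
# Sketch of cluster-level Spark configs for ADLS Gen2 OAuth via a service
# principal. ACCOUNT, TENANT, and the secret scope/key names are hypothetical.
ACCOUNT = "mystorageacct"
TENANT = "00000000-0000-0000-0000-000000000000"
suffix = f"{ACCOUNT}.dfs.core.windows.net"

spark_conf = {
    f"fs.azure.account.auth.type.{suffix}": "OAuth",
    f"fs.azure.account.oauth.provider.type.{suffix}":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    f"fs.azure.account.oauth2.client.id.{suffix}":
        "{{secrets/kv-scope/sp-client-id}}",
    f"fs.azure.account.oauth2.client.secret.{suffix}":
        "{{secrets/kv-scope/sp-client-secret}}",
    f"fs.azure.account.oauth2.client.endpoint.{suffix}":
        f"https://login.microsoftonline.com/{TENANT}/oauth2/token",
}

# These key/value pairs go into the cluster's "Spark config" box;
# Databricks resolves the {{secrets/...}} references at cluster start.
for k, v in spark_conf.items():
    print(k, v)
```

Scoping each key with the account suffix lets one cluster authenticate differently to different storage accounts.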
I need to download files from databricks onto my local machine
My attempt at using the URL method returns a 401 error: "credential was not sent or was of an unsupported type for this API". My attempt at using the CLI is also unsuccessful. I do not have the download option available in DBFS; I have the…
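When the UI download option is unavailable, one workaround is the DBFS REST API (`/api/2.0/dbfs/read`), which returns file contents base64-encoded in chunks of up to 1 MB. A sketch assuming a hypothetical workspace URL and a personal access token; the 401 in the question suggests the token was missing or sent in an unsupported form:

```python
import base64

def decode_chunk(api_response: dict) -> bytes:
    """DBFS /api/2.0/dbfs/read responds with {'bytes_read': n, 'data': <base64>};
    decode one chunk back to raw bytes."""
    return base64.b64decode(api_response["data"])

# Against a real workspace you would loop over offsets until bytes_read == 0:
# import requests
# r = requests.get(
#     "https://adb-1234567890123456.7.azuredatabricks.net/api/2.0/dbfs/read",  # hypothetical
#     headers={"Authorization": "Bearer <PAT>"},
#     params={"path": "/FileStore/my_file.csv", "offset": 0, "length": 1_000_000},
# )
# chunk = decode_chunk(r.json())

# Offline demonstration with a fabricated response payload:
sample = {"bytes_read": 5, "data": base64.b64encode(b"hello").decode()}
print(decode_chunk(sample))
```

The `Authorization: Bearer <token>` header is required; omitting it is exactly what produces the "credential was not sent or was of an unsupported type" 401.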
Unexpected Azure Databricks VM Pricing – Matches RI Cost Without RI Purchase
I have an Azure Databricks customer with both Dev and Prod environments. The concern is regarding the VM cost associated with Databricks. The VM type is D4ads_v5 Linux, located in Southeast Asia. According to the Azure Pricing Calculator, the expected…
I've been using the following code for years to determine the latest table version via Databricks' Time Travel feature, without any issues. However, after recently adding a new row to the table, I encountered the following error:
Cannot time travel Delta table to version 1. Available versions: [3, 23]. This behavior is unexpected, as the code has worked reliably until now. Here is the code I’m using: from delta.tables import DeltaTable import pyspark.sql.functions dt =…
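The error means the requested version has been removed from the table's retained history (e.g. by VACUUM or log retention), so hard-coding a version number stops working. A safer pattern is to read the newest version actually listed by `DESCRIBE HISTORY` / `DeltaTable.history()`; a sketch with a hypothetical table path:

```python
# Sketch: pick the newest version number reported by the Delta table history
# instead of assuming a fixed version. The table path below is hypothetical.
def latest_version(history_versions) -> int:
    """Given version numbers from the table history, return the newest one."""
    versions = list(history_versions)
    if not versions:
        raise ValueError("table has no history")
    return max(versions)

# On a cluster:
# from delta.tables import DeltaTable
# dt = DeltaTable.forPath(spark, "/mnt/delta/my_table")  # hypothetical path
# versions = [row["version"] for row in dt.history().select("version").collect()]
# df = (spark.read.format("delta")
#       .option("versionAsOf", latest_version(versions))
#       .load("/mnt/delta/my_table"))

print(latest_version([3, 23]))
```

With the available versions from the error message, `[3, 23]`, this selects version 23 rather than failing on a vacuumed-away version 1.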