This question seems to be more about Azure Databricks. You may want to post it in the dedicated forum for Azure Databricks -
Use cases/scenarios for the feature "Secure access to Azure Data Lake Storage using Azure Active Directory credential passthrough"
What is the exact use case or motivation for using "Azure AD credential passthrough to ADLS Gen1/Gen2"?
I am trying to understand it and see whether I can use it in some of my scenarios.
For example:
Case 1: I would like to access the data lake directly and securely using this feature (a minimal read sketch follows this list), but I cannot because:
1.1 On a high-concurrency cluster, I cannot use the Scala programming language, which is what I use in my production environment.
1.2 A standard cluster supports passthrough for only a single user. Does this mean we would need one standard cluster per user if we want this feature?
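For context, my understanding is that once credential passthrough is enabled on the cluster and the signed-in user has an appropriate role (e.g. Storage Blob Data Reader) on the storage account, a read is just a direct abfss:// path with no credentials in the notebook. A minimal sketch, where the account, container and path names are placeholders:

```scala
// Minimal sketch of a passthrough read, assuming a Databricks notebook on a
// cluster with Azure AD credential passthrough enabled; no keys or service
// principals appear in the code. All names below are placeholders.
val df = spark.read
  .format("parquet")
  .load("abfss://mycontainer@mystorageaccount.dfs.core.windows.net/path/to/data")

df.show(5)
```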
Case 2: I have scheduled daily jobs, and if I want them to access the data lake directly:
2.1 Jobs are simply not supported, as mentioned in the documentation.
2.2 The REST API is not supported either; since jobs can be run through the REST API, I consider both points technically related. (A sketch of the usual service-principal alternative follows this list.)
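For completeness, the workaround I am considering for scheduled jobs is the usual service-principal (OAuth) configuration for the ABFS driver instead of passthrough. A rough sketch, where the storage account name, secret scope and key names are placeholders:

```scala
// Sketch of the service-principal (OAuth) alternative for jobs, where
// passthrough is not available. Assumes a Databricks notebook/job context
// (spark and dbutils are predefined); all names below are placeholders.
val storageAccount = "mystorageaccount"
val clientId     = dbutils.secrets.get("my-scope", "sp-client-id")
val clientSecret = dbutils.secrets.get("my-scope", "sp-client-secret")
val tenantId     = dbutils.secrets.get("my-scope", "tenant-id")

spark.conf.set(s"fs.azure.account.auth.type.$storageAccount.dfs.core.windows.net", "OAuth")
spark.conf.set(s"fs.azure.account.oauth.provider.type.$storageAccount.dfs.core.windows.net",
  "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(s"fs.azure.account.oauth2.client.id.$storageAccount.dfs.core.windows.net", clientId)
spark.conf.set(s"fs.azure.account.oauth2.client.secret.$storageAccount.dfs.core.windows.net", clientSecret)
spark.conf.set(s"fs.azure.account.oauth2.client.endpoint.$storageAccount.dfs.core.windows.net",
  s"https://login.microsoftonline.com/$tenantId/oauth2/token")

// With the OAuth config in place, reads use the same abfss:// path style.
val df = spark.read.parquet(
  s"abfss://mycontainer@$storageAccount.dfs.core.windows.net/daily/input")
```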
Is my understanding correct?
I am also interested to know whether there are other use cases where this feature can be applied.