I want to restrict Azure ADLS SFTP access to the directory level.
I want to create SFTP access for 5 users and maintain all the SFTP folders in one container.
Azure ADLS Gen2
Hi all, I need to know whether it is possible to replicate data from an on-premises SQL Server 2022 instance to ADLS Gen2 storage directly. If possible, please provide some links or guidance. Not to use: CDC, triggers, or third-party ETL tools. Note: I need…
Change Data Storage Region in Fabric
We have our Fabric instance hosted in North Europe; however, the data storage is in South Africa North. I would like to change the storage region from South Africa North to North Europe. This will enable us to create external volumes in Snowflake so that we can…
How to refresh files from a storage account mounted using path mapping
I have mounted Azure Data Lake Storage containers in a Python Azure Web App as path mappings. When reading the files, they are not up to date. Files on the mounted path are visible immediately after initial creation, but subsequent changes are…
How to control access to a folder in an ADLS Gen2 container while storage-account-level IAM assignments are in effect
Hi, I have a Synapse pipeline that saves an output file in a folder (e.g., salary) in an ADLS container (e.g., employee). Now Mr. X wants the data saved in the folder to be accessible only to him, but storage-account-level IAM assignments have already given access to…
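A common approach to the question above is to scope down the broad container-level RBAC assignment and instead grant the individual user a POSIX ACL entry on the specific folder (note that "Storage Blob Data …" RBAC roles are evaluated before ACLs, so ACLs alone cannot subtract access an IAM role already grants). As a minimal sketch, assuming a hypothetical Azure AD object ID, this builds the ACL spec string that the ADLS Gen2 access-control APIs accept:

```python
# Sketch: build the POSIX ACL spec string ADLS Gen2 accepts for a folder.
# The object ID below is a hypothetical placeholder; in practice it would
# be Mr. X's Azure AD object ID.
def build_folder_acl(object_id: str, perms: str = "r-x") -> str:
    """Grant one AAD principal access to a folder, keep owner access,
    and deny everyone else (other::---)."""
    if len(perms) != 3:
        raise ValueError("perms must be a 3-char rwx triplet, e.g. 'r-x'")
    return ",".join([
        "user::rwx",                  # owning user keeps full access
        f"user:{object_id}:{perms}",  # named AAD principal (Mr. X)
        "group::---",
        f"mask::{perms}",             # effective-permission mask
        "other::---",                 # no access for anyone else
    ])

acl = build_folder_acl("11111111-2222-3333-4444-555555555555")
print(acl)
```

With the `azure-storage-file-datalake` SDK, a string like this would be passed to `DataLakeDirectoryClient.set_access_control(acl=...)` on the `salary` directory; the user also needs execute (`--x`) ACLs on the parent folders to traverse down to it.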
How to know which capacity is right for our company when using Power BI Embedded?
Hello there, we are trying to set up Power BI Embedded in our application. We're using Data Factory (getting data from MongoDB) to get the data; when running it, we get: "This is a free trial version, to remove this label a capacity must be…
I need to verify my storage account's total capacity limit and check if I am reaching my quota. How can I confirm the maximum storage allocation and expand it if necessary?
Logic app to retrieve the latest file from a blob folder
How can I create a Logic App that retrieves the latest file from a blob folder when an HTTP request is received, where there are multiple files, and sends it as an attachment? Are there any specific steps or configurations required for this process?
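The core of such a flow is typically a "List blobs" action followed by picking the entry with the greatest LastModified timestamp. As a sketch of just that selection step (blob names here are hypothetical; in a Logic App the same comparison would be done with a Filter array/expression over the List blobs output):

```python
from datetime import datetime, timezone

# Sketch of the "pick the latest file" step: given blob metadata as a
# List Blobs action would return it (name + LastModified), select the
# newest entry.
def latest_blob(blobs: list[dict]) -> dict:
    if not blobs:
        raise ValueError("folder contains no blobs")
    return max(blobs, key=lambda b: b["last_modified"])

blobs = [
    {"name": "report_old.csv",
     "last_modified": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"name": "report_new.csv",
     "last_modified": datetime(2024, 3, 1, tzinfo=timezone.utc)},
]
print(latest_blob(blobs)["name"])  # report_new.csv
```

The selected blob's content can then be fed to a "Get blob content" action and attached to the outgoing email.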
How can I create a linked service in ADF for SharePoint Online?
I want to extract files from SharePoint to ADLS using only ADF. I followed a few steps. Step 1: Azure Active Directory -> registered a new app -> created a new client secret. I have the Tenant ID, Client ID (App ID), and secret key. Step 2: SharePoint Online ->…
Does Data Share support ADLS Gen2 with a private endpoint?
Does Azure Data Share support an ADLS Gen2 account that is deployed with a private endpoint?
PySpark DataFrame is taking too long to save to ADLS from Databricks.
I'm running a notebook on Azure Databricks using a multi-node cluster with 1 driver and 1-8 workers (each with 16 cores and 56 GB RAM). I'm reading the source data, which has 30K records, from Azure ADLS. The notebook consists of a few transformation steps, also…
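With only 30K records, slow writes are often a partitioning problem rather than a data-volume problem: the default shuffle partition count (200) produces hundreds of tiny files on ADLS. A common rule of thumb is to size output partitions at roughly 128 MB each; this sketch computes that target (the 128 MB figure and the estimated sizes are assumptions, not measurements from the question):

```python
import math

# Rule-of-thumb sketch: target ~128 MB per output partition so the ADLS
# write produces neither one huge task nor hundreds of tiny files.
def target_partitions(total_size_mb: float, target_mb: int = 128) -> int:
    return max(1, math.ceil(total_size_mb / target_mb))

print(target_partitions(30))    # 30K small records -> a single output file
print(target_partitions(2048))  # ~2 GB -> 16 partitions
# In the notebook (requires a cluster):
# df.repartition(target_partitions(est_size_mb)) \
#   .write.mode("overwrite").parquet(adls_path)
```

Checking the Spark UI for skewed or long-tail tasks on the write stage would confirm whether partition count is actually the bottleneck.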
How to use a linked service in a Notebook with PySpark
I have a PySpark script in a Notebook to read and write data in ADLS Gen2. Below is a sample of the PySpark script. But in Synapse I only have a linked service, created with a service principal, that can connect to ADLS Gen2, so I need to specify in…
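Synapse notebooks can route ADLS Gen2 access through a linked service by setting two Spark configs before reading; the sketch below builds those settings as a dict so the shape is visible (the linked service name "MyAdlsLinkedService" and the abfss path are placeholders, and the exact conf keys should be verified against the current Synapse documentation):

```python
# Hedged sketch: Spark confs that point ADLS Gen2 auth at a Synapse
# linked service (service-principal-backed) instead of account keys.
def linked_service_conf(linked_service_name: str) -> dict:
    return {
        "spark.storage.synapse.linkedServiceName": linked_service_name,
        "fs.azure.account.oauth.provider.type":
            "com.microsoft.azure.synapse.tokenlibrary."
            "LinkedServiceBasedTokenProvider",
    }

conf = linked_service_conf("MyAdlsLinkedService")
for key, value in conf.items():
    print(key, "=", value)
# In the notebook:
#   for k, v in conf.items(): spark.conf.set(k, v)
#   df = spark.read.parquet("abfss://container@account.dfs.core.windows.net/path")
```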
How to specify a custom catalog name for Azure Databricks Delta Lake Dataset in ADF
Hello, I am creating an Azure Databricks Delta Lake dataset in ADF, and I am only able to choose database names that link to Databricks's hive_metastore. How can I specify a custom catalog name that I created in Databricks instead of…
Medallion architecture in ADLS
I am trying to find the most suitable storage architecture for the following use case: I have several clients, and I need isolated storage so data cannot be mixed up. I work with 3 different environments for each client: dev, pre, pro. I need to…
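One common layout for these requirements puts the isolation boundary at the container (or storage account) level per client and environment, with the medallion layers (bronze/silver/gold) as top-level folders inside each container. A minimal sketch of the resulting path convention, with hypothetical account and client names:

```python
# Sketch: one container per client+environment pair, medallion layers
# as folders inside it. Account/container naming is hypothetical.
LAYERS = ("bronze", "silver", "gold")

def abfss_path(account: str, client: str, env: str, layer: str) -> str:
    if env not in ("dev", "pre", "pro"):
        raise ValueError(f"unknown environment: {env}")
    if layer not in LAYERS:
        raise ValueError(f"unknown layer: {layer}")
    container = f"{client}-{env}"  # isolation boundary per client + env
    return f"abfss://{container}@{account}.dfs.core.windows.net/{layer}/"

print(abfss_path("mydatalake", "clienta", "dev", "bronze"))
# abfss://clienta-dev@mydatalake.dfs.core.windows.net/bronze/
```

Using separate containers (rather than folders) per client keeps access grants, SAS scopes, and lifecycle policies cleanly separated; separate storage accounts per client give even stronger isolation at the cost of more management overhead.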
Connecting to Azure Data Lake Storage Gen2 from Tableau Desktop throws an error
Error received in Tableau Desktop: "Tableau received an OAuth error from your request. Please see the error message for more information: User authorization failed (invalid_client)." When I checked the auth URL, it seems the URL is like the one below -…
Unable to create a directory named with a single space character using the ADLS Gen2 REST API
I am unable to create a directory in "Azure Data Lake Storage Gen2" that is named as just a single space character, despite reviewing the documentation and finding no indication that this is a disallowed name. My primary access to "Azure…
Generate SAS tokens from service principal credentials
I am working on creating a Java client that generates SAS tokens from the given service principal credentials. I am taking a reference from…
Need to extract zreports from SAP HANA
Hi, I have a use case where I want to extract data from SAP HANA. The use case is as follows: I have an SAP HANA deployment from which I need to extract data. The data is stored in Z-reports, which are extracted using T-codes. Now, I want to extract the data…
We want to know details about Azure Local: can we use Databricks, ADLS Gen2, Data Factory, and Machine Learning on Azure Local?
ADF Copy Data JSON Source dynamic schema mapping
Hi, I am working on an ADF Copy Data activity. An HTTP dataset is returning JSON with the following sample output: { "totalRowCount": 1, "data": [ { "ProductCode": "P - 1", …
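When rows are nested under a "data" array like this, the copy activity's mapping needs its collection reference set to `$.data` so each array element becomes one output row. This sketch shows the flattening that mapping performs, using the sample shape from the question (fields beyond `ProductCode` are hypothetical placeholders, since the original excerpt is truncated):

```python
import json

# Sample payload shape from the question; "ProductName" is a hypothetical
# placeholder field, as the original excerpt is truncated.
payload = """{
  "totalRowCount": 1,
  "data": [
    { "ProductCode": "P - 1", "ProductName": "Widget" }
  ]
}"""

# Equivalent of setting the collection reference to $.data in the ADF
# mapping: each element of "data" becomes one output row.
doc = json.loads(payload)
rows = [dict(item) for item in doc["data"]]
print(rows[0]["ProductCode"])  # P - 1
```

For a dynamic schema, the mapping can be parameterized and supplied at runtime via the `translator` property instead of being hard-coded per column.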