Issue with copying data from Azure Synapse Link for Dataverse using an ADF Data Flow
Hi there, We use the Export to Data Lake option to copy data from D365 and want to migrate to Azure Synapse Link for Dataverse because the Export to Data Lake option is being deprecated on 1st Nov 2024. We have configured Azure Synapse Link for Dataverse in…
ADLS storage cost for premium & standard tiers
Currently we store about 400 GB of data (expected to grow to 2 TB in production) in ADLS on the standard hot tier. Our monthly bill shows that most of the storage cost (96%) is for data transactions (iterative read, regular read/write,…
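The pattern described here, transaction charges dominating the at-rest storage cost, can be sanity-checked with back-of-the-envelope arithmetic. All rates and operation counts below are hypothetical placeholders, not current Azure pricing; the sketch only shows how a read/write-heavy workload makes per-operation charges swamp the per-GB charge.

```python
# Illustrative only: every rate and count below is a made-up placeholder,
# not real Azure pricing. The point is the shape of the bill, not the numbers.

STORAGE_GB = 400                  # data at rest (from the question)
RATE_PER_GB_MONTH = 0.02          # hypothetical hot-tier $/GB/month
READ_OPS = 500_000_000            # hypothetical monthly read transactions
WRITE_OPS = 50_000_000            # hypothetical monthly write transactions
RATE_PER_10K_READS = 0.004        # hypothetical $ per 10,000 reads
RATE_PER_10K_WRITES = 0.05        # hypothetical $ per 10,000 writes

at_rest = STORAGE_GB * RATE_PER_GB_MONTH
transactions = (READ_OPS / 10_000) * RATE_PER_10K_READS \
             + (WRITE_OPS / 10_000) * RATE_PER_10K_WRITES
total = at_rest + transactions

print(f"at rest:      ${at_rest:,.2f}")
print(f"transactions: ${transactions:,.2f} ({transactions / total:.0%} of total)")
```

With these placeholder numbers the transaction line comes out near 98% of the bill, so a 96% transaction share is entirely plausible for an iterative-read workload even though the at-rest footprint is small.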
How to connect Business Central to Azure Data Factory
I posted this question on Re: [MicrosoftDocs/azure-docs] Business Central Connector (Issue #88574), but that didn't help. Is it actually possible to connect Business Central to ADF?
Ingress and egress costs in Azure
I have gone through the thread below: https://learn.microsoft.com/en-us/answers/questions/892660/will-the-ingress-egress-costs-of-a-azure-storage-a but I am still unclear about the ingress and egress costs of ADLS, and what parameters it takes to…
Why is the Data Read size in the ADF copy activity much bigger than the original size of the source data?
I have initiated a data transfer operation from Google Cloud Storage to Azure Data Lake Storage within Azure Data Factory. The objective was to transfer approximately 3,000 files in snappy.parquet format, with a combined size of approximately 30 GB. Upon…
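One plausible contributor, offered as an assumption about how the metric is counted rather than confirmed ADF behavior: a "data read" figure can be measured after decompression, and snappy-compressed Parquet expands considerably when read. A stdlib sketch using zlib as a stand-in for snappy (which is not in the standard library) illustrates the gap between on-disk size and bytes actually processed:

```python
import zlib

# zlib here is only a stand-in for snappy: compress a repetitive payload,
# then compare the on-disk (compressed) size with the number of bytes a
# reader actually handles after decompression.
original = b"2023-05-05,orders,some repeated column value\n" * 50_000
compressed = zlib.compress(original)

print(f"on disk (compressed):      {len(compressed):>9,} bytes")
print(f"bytes read (decompressed): {len(original):>9,} bytes")
print(f"expansion factor:          {len(original) / len(compressed):.1f}x")
```

Repetitive columnar data routinely compresses several-fold, so 30 GB on disk reading as a much larger "data read" would be consistent with decompressed accounting; retries and re-reads are another possible contributor.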
Error trying to validate storage account name while creating a new synapse workspace
I am unable to create a new Data Lake Storage (Gen2) account name in the Basics tab when creating an Azure Synapse workspace. The error I get is: "There was an error trying to validate storage account name. Please try again." The error message is…
Missing override ARM template parameter for the secret name used to connect to ADLS storage
Hi, I have a linked service connecting to the development environment of an Azure Data Lake Storage account. The connection to the storage account uses Azure Key Vault. While deploying from DEV to UAT, the ARM template has the URL string…
Unable to rename a blob object without adding my public IP
I have an ADLS Gen2 storage account for my Synapse workspace. Its public access is disabled, and it is accessible only from selected virtual networks and IP addresses. I have configured a private endpoint. Over VPN I'm able to access all blobs, but I'm…
Blob to storage account
Hi, I have to copy 550 GB of data from Blob Storage to ADLS Gen2. I have three options: CLI, AzCopy, and ADF. But copying over 500 GB of data with ADF has performance issues. Please share your suggestions.
SAS token generation by Databricks to access CSV files from ADLS container folder
Hi Team, There are some zipped CSV files inside the ADLS container folder. These zip files need to be downloaded for data correction. Downloading a file requires a SAS token embedded with the zip file path. Databricks has been used to generate the token and…
ADF pipeline to read the data from UC table to adls gen2 account
Hello Team, We have a requirement to create an Azure Data Factory pipeline to read data from a UC table (access to the table has been granted to the Azure Data Factory managed identity) and copy the data into ADLS Gen2. Is there a way, or an article, to implement this?…
Inquiry Regarding the Future of Azure Data Lake Gen2
Hello, For nearly a year, we have been utilizing Azure Data Lake Gen2 to extract data into SQL using Python code. I would like to inquire about the future of Azure Data Lake Storage Gen2: Will Azure Data Lake Storage Gen2 be retired in the near…
DataSourceError from PowerBI when trying to access Synapse Serverless View on ADLS2
We have an issue with a particular user connecting to our serverless SQL pool via Power BI. I created an Azure Synapse workspace, which contains the built-in serverless SQL pool with some views set up to pull data from some Delta tables. As the…
How to fix the problem: 'EndpointUnsupportedAccountFeatures'. Message: 'This endpoint does not support BlobStorageEvents or SoftDelete'
I am getting this error while using the sink option in Data Factory. The folder I am using in the path is from a storage account.
HttpRequestFailedWithUnauthorizedError output after the Copy Data activity when extracting data from SharePoint to ADLS using Data Factory
I followed the tutorial from "https://www.youtube.com/watch?v=FFfNu3cI-uw&pp=ygUjZXh0cmFjdCBkYXRhIGZyb20gc2hhcmVwb2ludCB0byBhZGw%3D" but this error occurred. "errors": [ { "Code": 22757, "Message":…
Connecting PowerApps to Data Lake Storage tables
Is it possible to connect PowerApps to Data Lake Storage tables in order to create, edit, and delete data within the tables? If so, what are the prerequisites and how can it be achieved? Any guidance or resources would be greatly appreciated.
Creating items in Data Lake creates an additional 0 KB file with the folder name or filename
We have created two Gen2 storage accounts. Creating an item in one data lake container creates a 0 KB file with the same name as the file or folder. Attached is a screenshot of the container items with the additional file that is created and…
Azure datalake creation methods
Hi friends, we need to create a data lake (ADLS) with bronze, silver, and gold layers. One of our colleagues suggested it is better to create the ADLS resources programmatically, but we plan to create them manually. Are there any advantages to each approach? …
ADLS Gen2 Query Acceleration
Hello, As a follow up to this thread: Query acceleration using parquet does not work with double fields - Microsoft Q&A I would like to know if and when Microsoft plans to enable the query acceleration feature for Parquet files as well for ADLS Gen2…
Getting the size of parquet files from azure blob storage
I have a blob container abcd. The folder structure is like below: abcd/Folder1/Folder a, Folder b ….. Folder z. Inside a particular folder: Folder a/v1/full/20230505/part12344.parquet; similarly, Folder b/v1/full/20230505/part9385795.parquet. The scenario is I need to get…
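The usual pattern for this scenario is to enumerate files recursively and accumulate sizes per top-level folder. Below is a minimal local-filesystem sketch using pathlib against a throwaway tree mimicking the layout in the question; for a real account you would swap the directory walk for a recursive listing from the Azure Storage SDK (enumerating paths with their content lengths), which this sketch deliberately does not attempt.

```python
from pathlib import Path
import tempfile

def parquet_sizes_by_folder(root: Path) -> dict[str, int]:
    """Sum the sizes (in bytes) of *.parquet files under each top-level folder."""
    totals: dict[str, int] = {}
    for f in root.rglob("*.parquet"):
        top = f.relative_to(root).parts[0]          # e.g. "Folder a"
        totals[top] = totals.get(top, 0) + f.stat().st_size
    return totals

# Demo against a throwaway local tree mimicking the layout in the question.
with tempfile.TemporaryDirectory() as d:
    root = Path(d)
    a = root / "Folder a" / "v1" / "full" / "20230505"
    b = root / "Folder b" / "v1" / "full" / "20230505"
    a.mkdir(parents=True)
    b.mkdir(parents=True)
    (a / "part12344.parquet").write_bytes(b"x" * 1024)
    (b / "part9385795.parquet").write_bytes(b"y" * 2048)
    sizes = parquet_sizes_by_folder(root)
    print(dict(sorted(sizes.items())))   # {'Folder a': 1024, 'Folder b': 2048}
```

The same accumulate-by-first-path-segment logic carries over unchanged once the iterator yields blob paths and content lengths instead of local files.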