ADLS Gen2 operation failed for: Operation returned an invalid status code 'Conflict'. Account: 'abc'. ErrorCode: 'EndpointUnsupportedAccountFeatures'. Message: 'This endpoint does not support BlobStorageEvents or SoftDelete.
I am new to ADF. While creating a linked service for Data Lake Storage Gen2, I am getting an error: ADLS Gen2 operation failed. ADLS Gen2 operation failed for: Operation returned an invalid status code 'Conflict'. Account: 'abc'. ErrorCode:…
Issues in Synapse pipelines that access the checkpoint key with input parameters
Hello team, I am getting Synapse Analytics pipeline failures with an error about the checkpoint mismatching the checkpoint in the input parameters. I tried to check the input, as the respective data flow fetches data from Cosmos DB using a CDC approach…
Troubleshooting Power BI and Azure Storage Explorer Connection Issues to Data Lake Gen2 from Azure VM
I rented a virtual machine (VM) and had no issues using it until I needed to integrate it with the Data Lake to optimize my processes. I subscribed to the Data Lake, but I cannot connect to it through Power BI when using the VM. My account is a Gmail…
When generating a SAS token, specifying the path causes an authorization failure.
I use the following code to generate a SAS token; when the path is null or an empty string, it works. String sasToken = new DataLakeSasImplUtil(signatureValues, container, path, true) .generateUserDelegationSas(userDelegationKey, accountName,…
How to monitor when a data lake SAS token expires?
Hi there, I store the SAS token value in Key Vault, but I can't tell when the token expires, so I only find out after it has already expired. Is there any way to monitor when the token expires? Thanks, zmsoft
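One option, sketched below, is to read the token's `se` (signed expiry) query parameter, which is part of the standard SAS token format, and compare it to the current time. This assumes the value stored in Key Vault is the full SAS query string; the function names and the sample token are hypothetical.

```python
from datetime import datetime, timezone
from urllib.parse import parse_qs

def sas_expiry(sas_token: str) -> datetime:
    """Extract the expiry time from a SAS token's 'se' parameter."""
    params = parse_qs(sas_token.lstrip("?"))
    # 'se' is the signed-expiry field, e.g. 2024-01-31T00:00:00Z
    expiry = params["se"][0]
    return datetime.fromisoformat(expiry.replace("Z", "+00:00"))

def is_expiring(sas_token: str, within_days: int = 7) -> bool:
    """True if the token expires within the given number of days."""
    remaining = sas_expiry(sas_token) - datetime.now(timezone.utc)
    return remaining.days < within_days

# Hypothetical token value pulled from Key Vault.
token = "sv=2022-11-02&ss=b&srt=co&sp=rl&se=2024-01-31T00:00:00Z&sig=abc123"
print(sas_expiry(token))  # 2024-01-31 00:00:00+00:00
```

A scheduled job (for example an Azure Function on a timer) could run such a check against the secret and raise an alert while there is still time to rotate the token.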
How can I restrict someone from downloading any data from a Fabric Lakehouse or Fabric Warehouse
As a data admin, I want to control data access for a user in my Microsoft Fabric Warehouse. The goal is to allow this user, who has the Contributor role, to view data directly in the workspace without being able to download it as a file. This scenario…
In MS Fabric, users should be able to view data but not download it
In MS Fabric, I want users to be able to view data in the workspace but not be able to download it. Please provide clear steps, along with links to verify the authenticity of the solution provided.
Append vs. update in Azure Data Lake Storage Gen2 for a CSV file
I have a CSV file in Data Lake Storage Gen2. I am referring to the ACL permissions documentation below: https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-access-control#common-scenarios-related-to-acl-permissions The table gives different ACL…
Trying to connect to the Fabric Lakehouse from an SSIS package; however, it fails when executing the package
I have a Fabric Lakehouse as a source. I am able to connect to the Fabric Lakehouse with "ActiveDirectoryPassword" authentication from SSMS without any issue. I am creating the SSIS package and created the ADO.NET Connection Manager and…
Azure Data Lake Storage NFSv3 protocol support for nconnect on Red Hat
Hi, I have an ADLS storage account configured for HNS and NFSv3 protocol support. When attempting to mount the ADLS container on an Azure Red Hat 9.5 NFS client instance with the nconnect mount option set, it fails with: $ sudo mount /mntfs/adls01c01…
Possible ways to extract data from SAP Reports (Transaction Data) to ADLS Gen2
Hello, I want to extract SAP ECC transaction data into ADLS Gen2. I have multiple transaction codes (T-codes) for generating reports, such as ME5A and ME2N, and I want to fetch the data from these reports into ADLS. However, it seems that Azure Data…
How to read the last ingested date-partitioned file in ADLS
Hi all, how do I read the last ingested file in ADLS? The ADLS folder structure is XXX/year=yyyy/month=MM/day=dd. Within day=dd I will have two different files on a given day. I want to read the last ingested file, based on the timestamp, as a source in my…
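One pattern is to list the files under the day partition and take the one with the greatest last-modified timestamp. The sketch below shows only the selection logic; the `(path, last_modified)` pairs and file names are hypothetical stand-ins for whatever listing API is used (for example the ADLS SDK's path listing).

```python
from datetime import datetime

def latest_file(paths: list[tuple[str, datetime]]) -> str:
    """Given (path, last_modified) pairs for files under day=dd,
    return the path of the most recently ingested file."""
    return max(paths, key=lambda p: p[1])[0]

# Hypothetical listing of the two files landed on one day.
day_files = [
    ("XXX/year=2024/month=05/day=14/load_0900.csv", datetime(2024, 5, 14, 9, 0)),
    ("XXX/year=2024/month=05/day=14/load_1730.csv", datetime(2024, 5, 14, 17, 30)),
]
print(latest_file(day_files))  # XXX/year=2024/month=05/day=14/load_1730.csv
```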
How to read multiple files from different folders in an Azure data flow?
I have multiple folders from different source systems in my landing area of ADLS Gen2. Say I have folders named X, Y, and Z. Inside X, I again have subfolders based on the tables from which the data was extracted. For example, X has 3 subfolders…
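A wildcard path is the usual way to pull files from several subfolders in one source. The sketch below illustrates the matching idea with `fnmatch` on hypothetical file names; in a data flow the equivalent would be a wildcard path on the source, not Python code.

```python
from fnmatch import fnmatch

# Hypothetical file listing in the ADLS landing area.
files = [
    "landing/X/customers/part-000.csv",
    "landing/X/orders/part-000.csv",
    "landing/X/products/part-000.csv",
    "landing/Y/invoices/part-000.csv",
]

# A pattern such as landing/X/*/*.csv selects every table
# subfolder under X in one go; only the X files match here.
pattern = "landing/X/*/*.csv"
matched = [f for f in files if fnmatch(f, pattern)]
print(matched)
```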
Data lake name sequence
Hi friends, when creating an Azure Data Lake, the containers are sorted in alphabetical order, which looks a bit incorrect: bronze, landing zone, gold, silver. I want them in the order landing, bronze, silver, gold. How can I do that? Is there a…
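As far as I know, container listings are always alphabetical and there is no ordering setting. A common workaround is to prefix container names with a sequence number so that alphabetical order matches the intended medallion order; the names below are hypothetical.

```python
# Encode the desired order into the container names so the
# alphabetical listing comes out in medallion order.
desired = ["landing", "bronze", "silver", "gold"]
prefixed = [f"{i:02d}-{name}" for i, name in enumerate(desired, start=1)]
print(sorted(prefixed))  # ['01-landing', '02-bronze', '03-silver', '04-gold']
```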
How to fix this error: "The request content was invalid and could not be deserialized: Required property 'name' not found in 'sku'"
Getting this error during Synapse workspace deployment.
ADF pipeline to read data from a UC table into an ADLS Gen2 account
Hello team, we have a requirement to create an Azure Data Factory pipeline to read data from a UC table (access to the table has been granted to the Azure Data Factory managed identity) and copy the data into ADLS Gen2. Is there a way or an article to implement this?…
Unable to use Synapse workspace features in Synapse Link for Dataverse when Synapse network is in private mode
Hi team, we are using Synapse Link for Dataverse to sync data from D365 to ADLS Gen2, which can then be used for analytics. As our Synapse Analytics environment has public network access disabled and is using…
How to fix ParquetJavaInvocationException when writing a Parquet file to ADLS using ADF
Hi, I want to copy tables from an on-premises SQL Server to my Azure Data Lake and write the files in Parquet format. I have installed the JRE on the machine that hosts the self-hosted IR. When I create a dataset with Parquet as the format and select schema…
Copy files from a SharePoint Online site to Azure Data Lake Storage
Hello, we are trying to set up a flow that will copy files from a SharePoint Online site to Azure Data Lake Storage. As I understand it, there are 2 options: using ADF to pull the files as mentioned in the link below…