ADF Pipeline Fails with Error ‘StatusCode: DFExecutorUserError, Message: Job Failed Due to Reason: Failed to Create a Child Event Loop’ or Succeeds but Does Not Write Any Data at Target
Hi team, my ADF pipeline recently started failing with the following error. The pipeline takes data from Cosmos DB, transforms it, and writes it out as a CSV file to a storage account. Error message: Operation on target Data_Transfer failed:…
Error loading CSV into Oracle using Copy Data Activity
I am trying to load a CSV file into Oracle. I am able to import the schema and map the source to the target, but when I run the pipeline I get the following error: UserErrorOdbcOperationFailed. Failure happened on 'Sink' side.…
How do I dynamically read multiple file names?
I am trying to read the names of 20+ CSV files and ingest their data dynamically. Below is a diagram of how I ingest data from each CSV file individually; as a result, I have at least 20 pipelines, one per CSV file. Below is what each CSV file's name looks like: If I…
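A minimal sketch of one way to avoid 20+ near-identical pipelines: keep a single pipeline with a file-name parameter and trigger one run per file from outside ADF (natively, a Get Metadata activity feeding a ForEach achieves the same). The pipeline name and its `fileName` parameter below are hypothetical; all other names are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.storage.blob import ContainerClient

# List the CSV files in the container, then trigger one parameterized run per file.
container = ContainerClient.from_connection_string("<storage-conn-string>", "<container>")
adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

for blob in container.list_blobs():
    if blob.name.endswith(".csv"):
        adf.pipelines.create_run(
            "<resource-group>", "<factory-name>", "IngestCsvPipeline",
            parameters={"fileName": blob.name},  # hypothetical pipeline parameter
        )
```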
How to use build-preview
Hi, please see the documentation here: https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-delivery-improvements. To use build-preview, it is necessary to add build-preview to package.json, but the document gives no clue as to how to…
How can I update the data source (Azure Storage) for Azure Analysis Services without using the SQLServer module or the Invoke-ASCmd method?
Can I update the data source (Azure Storage) for Azure Analysis Services (AAS) without relying on the SQLServer PowerShell module, i.e., without using the Invoke-ASCmd method to make the changes? Are there alternative approaches or tools available to update the…
CDC support for Azure PostgreSQL flexible server
Hi, ADF currently does not support Azure PostgreSQL Flexible Server under Change Data Capture. Is this going to be supported in the future? If so, when can it be expected?
How to Pass a File Name from a Copy Data Activity to the Next Transform Activity?
Hi, in a Copy Data activity I am generating a file by extracting data from a database source and sinking it to Azure Blob Storage as a newly generated file with the naming convention…
New ServiceNow Connector in ADF (Unable to Pass a SQL Query on the Source Side)
Hi team, how can I pass a SQL query in a ServiceNow dataset when using ServiceNow as the source in an ADF Copy activity? Per the Microsoft docs, the new ServiceNow connector does not support SQL queries. The legacy connector did support them, so I passed the SQL query…
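Until the new connector supports query pushdown, one workaround is to bypass it and call the ServiceNow Table API directly, expressing the filter as an encoded query instead of SQL. A minimal sketch; the instance name, table, filter, and credentials are placeholders.

```python
import requests

# ServiceNow Table API: sysparm_query takes an encoded query in place of a SQL WHERE clause.
resp = requests.get(
    "https://<instance>.service-now.com/api/now/table/incident",
    params={
        "sysparm_query": "active=true^priority=1",
        "sysparm_fields": "number,short_description,sys_updated_on",
        "sysparm_limit": 100,
    },
    auth=("<user>", "<password>"),
    headers={"Accept": "application/json"},
)
resp.raise_for_status()
records = resp.json()["result"]
```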
Unable to flatten a complex JSON object using the mapping data flow Flatten transformation
Hi all, I have a JSON file with this…
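Since the JSON sample above is truncated, here is a generic illustration of what flattening does, using pandas on a stand-in nested shape; the Flatten transformation in a mapping data flow performs the equivalent unroll on whichever array you select.

```python
import pandas as pd

# Stand-in nested document: an 'items' array under a parent object.
doc = {"order": {"id": 1}, "items": [{"sku": "A", "qty": 2}, {"sku": "B", "qty": 1}]}

# Unroll the array into one row per element, carrying the parent key onto each row.
flat = pd.json_normalize(doc, record_path="items", meta=[["order", "id"]])
print(flat)  # columns: sku, qty, order.id
```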
Get data from Azure SQL Database Managed Instance (linked service) using a Synapse notebook Spark pool
I have a linked service for an Azure SQL Database Managed Instance, and I want to get the data via a Synapse Spark pool. In simple words, I want to connect to the Azure SQL Database Managed Instance (linked service) and get the data via a Synapse notebook Spark…
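A Synapse Spark notebook cannot consume a SQL MI linked service directly, but the database is reachable over JDBC. A minimal PySpark sketch, assuming the managed instance's public endpoint (port 3342) and placeholder credentials:

```python
# Inside a Synapse notebook, `spark` is the pre-provided SparkSession.
jdbc_url = (
    "jdbc:sqlserver://<managed-instance>.public.<dns-zone>.database.windows.net:3342;"
    "database=<database>;encrypt=true;trustServerCertificate=false;"
)

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.MyTable")   # or a "(SELECT ...) AS q" subquery
    .option("user", "<sql-user>")
    .option("password", "<sql-password>")
    .load()
)
df.show(5)
```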
How can I update the data source (Azure Storage) for Azure Analysis Services without using the SQLServer module or the Invoke-ASCmd method, preferably using Python?
I am looking for a way to achieve this using Python. Please suggest any tools, libraries, or approaches that can facilitate this, such as using REST APIs or handling XMLA directly. A step-by-step explanation or references would be greatly appreciated.
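One route is to build the change as a TMSL `alter` command and execute it against the model's XMLA endpoint; acquiring the AAD token is straightforward with msal. A sketch under those assumptions (service-principal auth, all names are placeholders; executing the TMSL still requires an XMLA-capable client, which is left out here):

```python
import json
import msal

# Acquire an AAD token for Azure Analysis Services with a service principal.
app = msal.ConfidentialClientApplication(
    "<client-id>",
    authority="https://login.microsoftonline.com/<tenant-id>",
    client_credential="<client-secret>",
)
token = app.acquire_token_for_client(scopes=["https://*.asazure.windows.net/.default"])["access_token"]

# TMSL 'alter' command that repoints the model's data source at a new storage account.
tmsl = {
    "alter": {
        "object": {"database": "<model-database>", "dataSource": "<datasource-name>"},
        "dataSource": {
            "name": "<datasource-name>",
            "connectionString": "https://<new-account>.blob.core.windows.net;SharedAccessSignature=<sas>",
        },
    }
}
# Send `tmsl` over the model's XMLA endpoint, authenticating with `token`.
print(json.dumps(tmsl, indent=2))
```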
How to store JSON data as CSV from an API authenticated using grant_type = password in ADF
We have a requirement to generate a token using the password grant type and use that token to call an API to get data, which we then need to store as a CSV file in ADLS. We are able to get the response, but are unable to store it in proper JSON…
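For comparison, a minimal Python sketch of the whole chain outside ADF: fetch the token with the password grant, call the API, flatten the JSON array to CSV, and upload to ADLS Gen2. All endpoints and credentials are placeholders, and the response is assumed to be a list of flat objects.

```python
import csv
import io
import requests
from azure.storage.filedatalake import DataLakeServiceClient

# 1. Get a bearer token with the password grant.
token = requests.post(
    "https://<auth-server>/oauth/token",
    data={
        "grant_type": "password",
        "username": "<user>",
        "password": "<password>",
        "client_id": "<client-id>",
        "client_secret": "<client-secret>",
    },
).json()["access_token"]

# 2. Call the data API with the token.
rows = requests.get(
    "https://<api-host>/v1/data",
    headers={"Authorization": f"Bearer {token}"},
).json()

# 3. Serialize the JSON array as CSV in memory.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
writer.writeheader()
writer.writerows(rows)

# 4. Upload to ADLS Gen2.
dls = DataLakeServiceClient("https://<account>.dfs.core.windows.net", credential="<account-key>")
dls.get_file_system_client("<filesystem>").get_file_client("output/data.csv").upload_data(
    buf.getvalue(), overwrite=True
)
```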
Cannot connect to Azure SQL source from mapping data flow
Hello Microsoft Support, I'm experiencing an issue with Azure Data Factory when executing a Mapping Data Flow that involves a Calendar_Sql source. The Spark job fails with the following error message: Spark job failed: { "text/plain":…
Error: 401 'Microsoft.IdentityModel.Tokens.AudienceUriValidationFailedException' was thrown.
I have used all the correct values here, and I'm receiving an access token from the code below. I have granted these API permissions (for testing purposes, of course), but when I access it through the endpoint below: …
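An AudienceUriValidationFailedException usually means the token's `aud` claim does not match the resource the endpoint expects, which is easy to verify by decoding the token payload (inspection only, no signature check):

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Decode the middle (claims) segment of a JWT for inspection only."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore base64url padding
    return json.loads(base64.urlsafe_b64decode(payload))

claims = decode_jwt_payload("<access-token>")
print("aud:", claims.get("aud"), "| iss:", claims.get("iss"), "| scp:", claims.get("scp"))
```

If `aud` shows a different resource than the API being called, request the token with that API's scope instead.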
Copy a storage account to a different resource group
Hello there, I plan to copy my storage account to a different resource group using the azcopy command. The document below recommends appending the SAS key to the source or destination URL in the command if the authentication method is not Microsoft Entra…
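For reference, the documented pattern boils down to appending a SAS token with sufficient permissions to each container URL; a minimal sketch, here wrapped in Python, with placeholder accounts and tokens:

```python
import subprocess

# Service-to-service copy: SAS tokens are appended directly to the container URLs.
src = "https://<src-account>.blob.core.windows.net/<container>?<src-sas>"
dst = "https://<dst-account>.blob.core.windows.net/<container>?<dst-sas>"

subprocess.run(["azcopy", "copy", src, dst, "--recursive"], check=True)
```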
A pipeline in Azure Data Factory loses certain fields, yet a newly created pipeline with the same setup does not.
In Azure Data Factory, a pipeline consists of two activities: one is a Web Activity and the other is a Copy Data Activity. The pipeline first acquires a token through the Web Activity and then uses the token to retrieve data from the Synapse database.…
Unable to Apply Certain Filter Conditions (e.g., TriggerName, TriggerBy, Status) in ADF monitor URL Query Parameters
I need a link that navigates to the ADF Monitor page and automatically applies filter conditions, such as triggername, from another site. Currently, I can use a URL like the following to pre-fill the pipelinename filter field on the ADF Monitor…
Unable to copy payload from the Snowflake SQL REST API to Azure Blob
Hi, I am using a Web activity to get an OAuth token, then passing a subscription key and the token to a GET-method source dataset in a Copy activity. Postman gives the correct result, but I am getting the error below: "Code": 22755, "Message":…
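As a cross-check outside ADF, the same call can be reproduced against the Snowflake SQL REST API with the OAuth token and the payload written straight to Blob Storage. A minimal sketch; the account, token, statement, and storage names are placeholders.

```python
import json
import requests
from azure.storage.blob import BlobClient

# Submit the statement to the Snowflake SQL API using the OAuth token.
resp = requests.post(
    "https://<account>.snowflakecomputing.com/api/v2/statements",
    headers={
        "Authorization": "Bearer <oauth-token>",
        "Content-Type": "application/json",
        "Accept": "application/json",
    },
    json={"statement": "SELECT * FROM <db>.<schema>.<table>", "warehouse": "<warehouse>"},
)
resp.raise_for_status()

# Persist the raw response payload to Azure Blob Storage.
blob = BlobClient.from_connection_string(
    "<storage-conn-string>", "<container>", "snowflake/result.json"
)
blob.upload_blob(json.dumps(resp.json()), overwrite=True)
```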
SMB3 Implementation in .NET Core
There doesn't appear to be any library available to implement network file sharing over the SMB3 protocol in .NET Core; the existing SMBLibrary only supports SMB1 and SMB2. Which library can be used for an SMB3 implementation?
Copy files from a SharePoint Online site to Azure Data Lake Storage
Hello, we are trying to set up a flow that will copy files from a SharePoint Online site to Azure Data Lake Storage. As I understand it, there are two options: using ADF to pull the files as mentioned in the link below…
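Besides the ADF HTTP-connector route, the same transfer can be scripted against Microsoft Graph. A minimal sketch assuming an app registration with Sites.Read.All; the site ID, file path, and all credentials are placeholders.

```python
import msal
import requests
from azure.storage.filedatalake import DataLakeServiceClient

# 1. Acquire a Microsoft Graph token with a service principal.
app = msal.ConfidentialClientApplication(
    "<client-id>",
    authority="https://login.microsoftonline.com/<tenant-id>",
    client_credential="<client-secret>",
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])["access_token"]

# 2. Download a file from the site's default document library via Graph.
content = requests.get(
    "https://graph.microsoft.com/v1.0/sites/<site-id>/drive/root:/<folder>/<file>:/content",
    headers={"Authorization": f"Bearer {token}"},
).content

# 3. Land the file in ADLS Gen2.
dls = DataLakeServiceClient("https://<account>.dfs.core.windows.net", credential="<account-key>")
dls.get_file_system_client("<filesystem>").get_file_client("landing/<file>").upload_data(
    content, overwrite=True
)
```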