Incomplete Files from Copy Data Command in Azure Data Factory pipeline when uploading data from Snowflake
I am experiencing an issue where the file-sink of the Copy Data command (SnowflakeExportCopyCommand) is producing incomplete files when uploading data from Snowflake to Azure Blob Storage in our Azure Data Factory pipeline. Observations: The number of…
Azure Data Factory: can an additional column in the Copy Data activity reuse column values in a function?
I am struggling with an Azure Data Factory pipeline activity, a Copy Data step. I have an input CSV file which I need to copy over to blob storage as a CSV file, but one of the columns' values must be URL-encoded. The problem I currently have is that…
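For reference, ADF's pipeline expression language does include a built-in encodeUriComponent() function, and the Copy activity source accepts an additionalColumns list. The catch is that additional-column expressions are evaluated once per activity run, not per source row (you can duplicate a source column with $$COLUMN:<name>, but not run a function over it per row), so per-row encoding of an existing column usually ends up in a Data Flow instead. A minimal sketch, with a hypothetical encodedValue column fed from a hypothetical rawValue pipeline parameter:

    "additionalColumns": [
        {
            "name": "encodedValue",
            "value": "@{encodeUriComponent(pipeline().parameters.rawValue)}"
        }
    ]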
How to express wildcard character(s) in Azure Logic Apps
I have the following file name in Azure File Storage: I was trying to use the following expression to grab today's file in Azure Data Factory: But somehow I am getting an error saying it does not recognize the file: I am wondering how to express wildcard…
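One common pattern, sketched here under the assumption that the file name carries a yyyyMMdd date stamp (the SalesExtract_ prefix is made up): build the fixed part of the name with pipeline expressions and leave a * wildcard for the rest, for example in a Copy source's wildcardFileName setting:

    "wildcardFileName": "@{concat('SalesExtract_', formatDateTime(utcNow(), 'yyyyMMdd'), '*.csv')}"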
What Azure service to use to connect ZOHO CRM to the Azure AI backend (SaaS product) we are building
We are looking to connect to ZOHO CRM and fetch user data in order to respond to customer queries via a chat interface, similar to a Copilot-type service. What is the best way to connect to the ZOHO CRM API via an Azure AI service?
ErrorCode=InvalidParameter,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The value of the property 'columns' is invalid: 'Value cannot be null. Parameter name: columns'.,Source=,''Type=System.ArgumentNullException,Message=Value…
Hi, I am new to PostgreSQL and am trying to fetch the column list and the respective data types for a PostgreSQL table using a Lookup activity within a ForEach container in ADF. The query works fine and fetches the expected output when executed in Azure…
Trigger in ADF
I am working on an ADF pipeline that pulls data from an external SQL database. Instead of running the pipeline on a fixed schedule, I want it to be triggered automatically whenever new or updated data is available in a specific table within the SQL…
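Worth noting: ADF has no native trigger that fires on row changes in a SQL table. The usual workarounds are a storage event trigger (when new data arrives as files) or a schedule/tumbling-window trigger combined with a watermark Lookup that polls the table for changes. A tumbling-window trigger can hand its window bounds to the pipeline like this (windowStart and windowEnd are hypothetical pipeline parameter names):

    "parameters": {
        "windowStart": "@trigger().outputs.windowStartTime",
        "windowEnd": "@trigger().outputs.windowEndTime"
    }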
Get file names from different folders and combine them into one variable in ADF
I have 100 folders, f1-f100, and each folder has many different CSV files. How can I get all the CSV file names together in a list, collected in one variable? Thanks a lot. The source is a blob storage container like below: f1: test1.csv, test2.csv …
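A common approach, sketched with hypothetical activity names: a Get Metadata activity ('GetFolders', field list childItems) lists the folders, a ForEach iterates over them with a second Get Metadata per folder, and an Append Variable activity collects each file name into an array variable. Since ForEach activities cannot be nested directly, the inner loop typically lives in a child pipeline invoked via Execute Pipeline. The key expressions:

    Outer ForEach items:                @activity('GetFolders').output.childItems
    Append Variable value (inner loop): @item().name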
Azure Data Factory ForEach/If - Force Next Iteration
Our organization receives and loads files into an on-prem SQL Server using SSIS packages written in Visual Studio. As part of the file loads, we run a series of validations on the files to ensure that certain things are correct - file name, is the file…
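Since ADF's ForEach has no "continue" statement, the usual workaround is to wrap the per-file work in an If Condition, so that a file failing validation simply takes the empty False branch and the loop proceeds to the next item. A sketch of the If Condition expression, assuming a hypothetical Lookup named 'ValidateFile' that returns an IsValid flag:

    @equals(activity('ValidateFile').output.firstRow.IsValid, true)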
ADF Google Ads Linked Service using Service Authentication as the Authentication Type - ERROR pkcs8PrivateKey
We are trying to access Google Ads data using a Google Ads service account and the ADF Google Ads linked service, with the linked service's "Authentication type" set to "Service authentication". We generated a private key for this Service…
Netezza - Linked service has an error: Payload [Unrecognized payload received from server]
Good day, a pipeline that uses a Netezza connector stopped working. [ERROR [HY000] [Microsoft][Netezza] (2000) Unrecognized payload received from server. Opcode: 100 ERROR [HY000] [Microsoft][Netezza] (2000) Unrecognized payload received from…
ADF Pipeline Issue from Snowflake to Azure SQL data copy
Hi team, I'm trying to create a function in Azure SQL using an ADF pipeline. Sometimes my pipeline succeeds and sometimes it fails with the error below. Please provide a solution for this issue. Operation on target ITERATE_TABLES_LOAD failed:…
How to parameterize the referenceName for the Linked Service Reference in the Datasets part of arm-template-parameters-definition.json
I would like to parameterize, in arm-template-parameters-definition.json, the "referenceName" for the linked service in the datasets part of the file. You can have a look in the below image at the part I want to change so that the production…
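Going by the documented custom-parameter syntax, where "=" marks a property to be turned into an ARM template parameter that keeps its current value as the default, a minimal sketch of the datasets section would look like this:

    "Microsoft.DataFactory/factories/datasets": {
        "properties": {
            "linkedServiceName": {
                "referenceName": "="
            }
        }
    }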
Microsoft Fabric API Authentication & Deployment Issue | Azure DevOps Pipeline
Dear All, We have configured Microsoft Fabric in our environment, with dedicated workspaces for Dev, UAT, and PROD. Instead of using Deployment Pipelines, we have integrated our workspaces with an Azure DevOps Git repository branch for version…
How to delete a dev branch?
How to delete my dev branch?
ADF Dataflow issue with source API query
I am trying to set up an Azure Data Factory Data Flow. The source is a REST API from Monday.com which uses GraphQL. The API requires the POST method with 3 headers: Content-Type "application/json", API-Version "2025-01", and Authorization:…
adf_publish and Main branch in ADF
Hi, I would like to understand the adf_publish branch. What is this branch used for? What code does it contain? The main branch is the final, production-ready branch which contains the latest working copy of all pipelines, data flows, and datasets. Why do we…
iif condition in the Expression Builder in ADF Alter Row
Hi team, in my source file I have a country column containing a list of countries. In my source data the country name Spain has now changed to "Espana"; for this I want to write an update condition in the Alter Row transformation. I tried the following…
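For reference, Alter Row conditions in a Mapping Data Flow are boolean expressions that pick the row action, while the actual value rewrite belongs in a Derived Column. A sketch in data flow expression language, assuming the column is named country as in the question:

    Update if (Alter Row):          country == 'Spain'
    country (Derived Column value): iif(country == 'Spain', 'Espana', country)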
Failure happened on 'Sink' side. ErrorCode=UserErrorKeyConflict,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Key Content-Type conflicts in AuthHeaders/AdditionalHeaders,Source=Microsoft.DataTransfer.ClientLibrary,'
I am trying to post a file using the REST connector, but it does not allow adding an additional header for Content-Type, as it applies the content-type application/json by default. What if I need to use a different content-type? I believe it should be…
Error while unzipping .tar.gz files using Copy activity - failing for a CSV file with 8 GB of data
Hi, I am trying to uncompress .tar.gz files using the Copy activity, and it was working fine until yesterday. Today we received a file with more data, and the copy failed on one file with 8 GB of data with the following error…
Data Factory - Unable to read zip files from Amazon S3. Error: Central Directory corrupt. Unable to read beyond the end of the stream.
I am using Azure Data Factory's Copy activity to read CSVs contained in zip files in Amazon S3. Since yesterday I have been experiencing problems reading zip files specifically from Amazon S3; other types of files are read fine (or at least pure CSV files)…