Azure Data Factory ForEach/If - Force Next Iteration
Our organization receives and loads files into an on-prem SQL Server using SSIS packages written in Visual Studio. As part of the file loads, we run a series of validations on the files to ensure that certain things are correct - file name, is the file…
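For illustration only, a minimal Python sketch of the kind of sequential file checks described above (file-name pattern, expected header, and rules are hypothetical, not taken from the question):

```python
import csv
import re
from pathlib import Path

# Hypothetical validation rules; adjust the pattern and header to the real spec.
NAME_PATTERN = re.compile(r"^SALES_\d{8}\.csv$")
EXPECTED_HEADER = ["id", "amount", "created_at"]

def validate_file(path: Path) -> list[str]:
    """Run the sequential checks and return a list of failure messages."""
    errors = []
    if not NAME_PATTERN.match(path.name):
        errors.append(f"unexpected file name: {path.name}")
    if path.stat().st_size == 0:
        errors.append("file is empty")
        return errors  # no point reading an empty file
    with path.open(newline="") as f:
        header = next(csv.reader(f), [])
        if header != EXPECTED_HEADER:
            errors.append(f"unexpected header: {header}")
    return errors
```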
How to call an Oracle database package from Azure Data Factory and receive the output of that package
Team, I need to call an Oracle database package from Azure Data Factory and use its output in other activities, but I am not able to call a package. Inside the package there are many stored procedures, so I need to call a specific stored procedure…
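For context, outside ADF (for example in an Azure Function that a pipeline invokes), calling a specific procedure inside an Oracle package and reading its OUT parameter might look like this sketch with python-oracledb; the package, procedure, and connection details are hypothetical:

```python
import oracledb

# Hypothetical package procedure my_pkg.load_orders(p_date IN, p_rows OUT NUMBER).
conn = oracledb.connect(user="app_user", password="***", dsn="dbhost/orclpdb1")
with conn.cursor() as cur:
    result = cur.var(oracledb.NUMBER)                 # holder for the OUT parameter
    cur.callproc("my_pkg.load_orders", ["2024-01-31", result])
    print("package returned:", result.getvalue())
conn.commit()
```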
How to fetch the child items within a folder located in Azure Blob Storage
I am trying to fetch the child items of a folder for recursive deletion, so my expectation is to delete the folders within the folder - for example, if folder A has subfolder B with file b, and another subfolder AB has subfolder D, then deleting all items from…
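As an illustration of the recursive case, a small azure-storage-blob sketch that lists and deletes everything under a "folder" (in Blob Storage a folder is just a name prefix); the connection string, container, and prefix are hypothetical:

```python
from azure.storage.blob import ContainerClient

# Deletes every blob whose name starts with the prefix, e.g. folderA/B/b,
# folderA/AB/D/..., which is what a recursive delete of folder A amounts to.
container = ContainerClient.from_connection_string(
    conn_str="<connection-string>", container_name="landing"
)
prefix = "folderA/"
for blob in container.list_blobs(name_starts_with=prefix):
    container.delete_blob(blob.name)
```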
I have a case where we need to update some records in Oracle RDS in AWS using ADF
Hi MS Team, I am working on a project where we need to perform updates from Synapse into Oracle RDS using ADF. I have an ODBC linked service which I have been using to perform inserts, but no option is available to update in the ADF pipeline. Has anyone…
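For reference, the statement the pipeline ultimately needs to issue over ODBC is an ordinary parameterized UPDATE; a minimal pyodbc sketch, with a hypothetical DSN, table, and columns:

```python
import pyodbc

# Batch of updates against Oracle RDS over the same kind of ODBC connection
# the linked service uses; values are sample data for illustration.
conn = pyodbc.connect("DSN=OracleRDS;UID=app_user;PWD=***")
cur = conn.cursor()
cur.executemany(
    "UPDATE customers SET status = ? WHERE customer_id = ?",
    [("ACTIVE", 101), ("CLOSED", 102)],
)
conn.commit()
```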
ADF Data Flow Derived Column transformation toTimestamp works for a constant timezone but not for one from a column
I am finding that the Azure Data Factory Data Flow Derived Column transformation's toTimestamp works for a constant timezone but not for a timezone taken from a column. I want each column to output the date with the time adjusted based on the timezone UTC offset. Any suggestions…
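To make the expected per-row behaviour concrete, here is a plain Python sketch of the logic being asked for: the same timestamp shifted by a UTC offset that comes from another column (the sample rows and column names are hypothetical):

```python
from datetime import datetime, timedelta, timezone

rows = [
    {"event_time": "2024-05-01 12:00:00", "utc_offset_hours": -5},
    {"event_time": "2024-05-01 12:00:00", "utc_offset_hours": 2},
]
for row in rows:
    # Parse the naive timestamp, treat it as UTC, then shift it by the
    # per-row offset - the behaviour expected from the Derived Column.
    naive = datetime.strptime(row["event_time"], "%Y-%m-%d %H:%M:%S")
    tz = timezone(timedelta(hours=row["utc_offset_hours"]))
    print(naive.replace(tzinfo=timezone.utc).astimezone(tz))
```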
I don't get a redirect after typing 'az login'
I think the problem is in the first line of the error message. The redirect link is login.microsoftonline.DE instead of login.microsoft.COM. Does anyone have an idea how to fix that? I'd be grateful for anything that could help! PS H:\> az login The…

ADF Google Ads linked service using Service Authentication as the authentication type: ERROR pkcs8PrivateKey
Trying to access Google Ads data using a Google Ads service account and the ADF Google Ads linked service. The linked service's "Authentication type" is set to "Service authentication". We generated a private key for this Service…
Copy Data Activity with Upsert Operation to MongoDB Not Updating Records
I am trying to use Azure Data Factory's Copy Data Activity to copy data from a CSV file into a MongoDB collection using the upsert operation. However, I am facing an issue where the pipeline run indicates that 4 rows were written, but no records are…
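For comparison, this is what an upsert means at the MongoDB level: each document needs a filter on a key field so an existing record can be matched and replaced, otherwise the write counts but nothing visible changes where expected. A pymongo sketch with a hypothetical connection string, database, and key field:

```python
from pymongo import MongoClient, UpdateOne

client = MongoClient("mongodb://localhost:27017")
coll = client["salesdb"]["orders"]
docs = [{"order_id": 1, "amount": 10.5}, {"order_id": 2, "amount": 7.0}]

# upsert=True inserts when no document matches the filter, updates otherwise.
coll.bulk_write(
    [UpdateOne({"order_id": d["order_id"]}, {"$set": d}, upsert=True) for d in docs]
)
```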
Azure Data Factory and Databricks end-to-end project
My goal is to copy data from ADLS using ADF (Copy activity), connect that data to Databricks to process and analyse it, and after completion save the processed data back to the storage account. I have my dataset on my local system, so…
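A rough sketch of the Databricks half of that flow: read what ADF copied into ADLS, transform it, and write the result back to the storage account. The paths, column names, and aggregation are hypothetical, and `spark` is the session a Databricks notebook provides:

```python
# Hypothetical container/folder layout on the storage account.
raw_path = "abfss://raw@mystorageacct.dfs.core.windows.net/sales/"
curated_path = "abfss://curated@mystorageacct.dfs.core.windows.net/sales_summary/"

df = spark.read.option("header", True).csv(raw_path)   # data landed by ADF
summary = df.groupBy("region").count()                  # sample transformation
summary.write.mode("overwrite").parquet(curated_path)   # processed data back to storage
```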
How to dynamically switch database names in a MongoDB linked service?
How do I call databases dynamically from a MongoDB server in Azure Data Factory? When trying to create a linked service for MongoDB, it asks me to hardcode the database name rather than accept dynamic content. But we have more than 90 databases…
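For context, outside ADF the database name is just a lookup on a single client connection, which is why a parameterised approach is attractive when there are 90+ databases; a pymongo sketch with a hypothetical connection string and collection name:

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
for db_name in client.list_database_names():
    if db_name in ("admin", "local", "config"):
        continue  # skip system databases
    # Same client, different database per iteration - the dynamic behaviour wanted.
    count = client[db_name]["events"].estimated_document_count()
    print(db_name, count)
```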
Azure Data Factory: how do I get metadata from CSV files used in a Data Flow so that only the latest file created is picked up?
I am using the "Filter by last modified" criteria in the Copy Data task in Azure Data Factory to determine the latest data file (one file created per day). The challenge I'm facing is that the file timestamps in Azure are calculated in UTC time but the files…
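As an illustration of the underlying metadata, a small azure-storage-blob sketch that picks the most recently modified blob and shows its UTC last-modified time converted to a local timezone; the connection string, container, and timezone are hypothetical:

```python
from zoneinfo import ZoneInfo
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    conn_str="<connection-string>", container_name="daily-files"
)
# Blob last_modified is a timezone-aware UTC datetime; pick the newest file
# and view it in a local timezone for comparison.
latest = max(container.list_blobs(), key=lambda b: b.last_modified)
print(latest.name, latest.last_modified.astimezone(ZoneInfo("America/Chicago")))
```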
Azure Data Factory HubSpot linked service
Dear team, we use Azure Data Factory for moving HubSpot data, using the built-in HubSpot linked service in Azure Data Factory. HubSpot announced that on 24 March 2025 it will retire the Owners API v2. Link to HubSpot:…
Unable to create on-demand HDInsight Hadoop clusters in Data Factory
I am trying to create a linked service for HDInsight Hadoop clusters, with a cluster size of 4, a time to live of 1 hour, a service principal ID, an Azure Key Vault secret key, an Azure Storage account, and a SQL Server for the HCatalog. But I am unable to move…
ADF Switch Expression Adding Unexpected Characters to FileName
In this pipeline a file lands in an SFTP blob container and is decrypted by a .NET application. After the decryption and movement to a designated folder, a pipeline is triggered to load the file into a SQL table. A Get Metadata activity…
Migrating SSIS File validation logic to ADF
I work in an environment where inbound files are loaded into an on-prem SQL Server. We currently use SSIS to loop through these files, performing multiple validations to ensure they are correctly formatted. Our validation process follows a sequential…

Complex transformations, fault tolerance, and scalability
I have the following situation while preparing for an interview, and I want to get the opinion of professionals: you are working on a data integration project where you need to ingest data from multiple on-premises SQL Server databases into Azure Synapse…
Data Factory - Unable to read zip files from Amazon S3. Error: Central Directory corrupt. Unable to read beyond the end of the stream.
I am using Azure Data Factory's Copy activity to read CSVs contained in zip files in Amazon S3. Since yesterday I have been experiencing problems reading only the zip files from Amazon S3; other types of files were read fine (or at least plain CSV files).…
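One way to check whether the archive itself is damaged (rather than the Copy activity), shown here as a hedged sketch with boto3 and the standard zipfile module; the bucket and key are hypothetical:

```python
import io
import zipfile
import boto3

s3 = boto3.client("s3")
body = s3.get_object(Bucket="my-bucket", Key="inbound/data.zip")["Body"].read()
with zipfile.ZipFile(io.BytesIO(body)) as zf:
    bad = zf.testzip()  # returns the first corrupt member name, or None
    print("corrupt member:" if bad else "archive looks intact", bad or "")
```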
How to send 15000 JSON records to an API one by one in an ADF pipeline without using Data Flow and the Lookup activity?
I cannot use the Lookup activity as it returns only 5000 records at a time, so what are the possible ways we can send 15000 JSON records to a web API without using Data Flow and the Lookup activity?
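If the looping is moved out of the pipeline entirely (for example into an Azure Function or a custom activity that ADF calls), sending the records one by one becomes a plain loop; a minimal requests sketch, with a hypothetical file name, endpoint, and payload shape:

```python
import json
import requests

with open("records.json") as f:
    records = json.load(f)  # assumed: a list of ~15000 JSON objects

for record in records:
    # POST each record individually; raise if the API rejects one.
    resp = requests.post("https://api.example.com/items", json=record, timeout=30)
    resp.raise_for_status()
```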
Reduce cluster startup time in ADF Data Flow
I have created an ADF pipeline which merges data between source and target in parallel for 18 tables. The table names, database names, etc. are dynamic and assigned at runtime. The pipeline takes more than 7 minutes to load the data, and most of the time is taken by…
How to copy data from external volume to Oracle
I am trying to copy Parquet files from an external volume to an Oracle DB, but I am facing this issue: Failure happened on 'Sink' side. ErrorCode=UserErrorOdbcOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ERROR…
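As a fallback sketch if the ODBC sink keeps failing: read the Parquet file directly and insert the rows with python-oracledb. The file path, table, columns, and connection details are hypothetical:

```python
import pandas as pd
import oracledb

df = pd.read_parquet("exported/part-0001.parquet")   # requires pyarrow
conn = oracledb.connect(user="app_user", password="***", dsn="dbhost/orclpdb1")
with conn.cursor() as cur:
    # Bulk insert the selected columns as plain tuples.
    cur.executemany(
        "INSERT INTO sales (id, amount) VALUES (:1, :2)",
        list(df[["id", "amount"]].itertuples(index=False, name=None)),
    )
conn.commit()
```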

