It looks like you're aiming to update existing records in the Account table of your Azure SQL Database using Azure Data Factory (ADF). Since ADF's Copy Activity is primarily designed for inserts, using it for updates can be challenging and requires a different approach. Below are two possible solutions:
Approach 1 - Using Stored Procedure (For Small to Medium Data Volumes)
Fix the Source Query - Ensure your query correctly filters the records to update -
SELECT Customer_Identifier__c, N__r.Name, V__r.Name, Pg_Team_Tg_Name__c
FROM Account
WHERE Customer_Identifier__c IS NOT NULL AND D__c = '1234'
Create a Stored Procedure in Azure SQL Database: Instead of using an update query inside the Copy Activity, use a Stored Procedure for better efficiency:
CREATE PROCEDURE UpdateAccountDetails
    @Customer_Identifier NVARCHAR(255),
    @N NVARCHAR(255),
    @V NVARCHAR(255),
    @Pg_Team_Tg_Name NVARCHAR(255)
AS
BEGIN
    UPDATE account
    SET N = @N,
        V = @V,
        Pg_Team_Tg_Name = @Pg_Team_Tg_Name
    WHERE customer_identifier = @Customer_Identifier;
END
Configure ADF Pipeline -
- Lookup Activity - Fetch records from Salesforce.
- ForEach Activity - Iterate over each record.
- Stored Procedure Activity - Inside ForEach, call the UpdateAccountDetails procedure for each record.
This method works well for a small dataset but may be inefficient for large updates.
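For illustration, each ForEach iteration would issue a call like the one below. The parameter values shown are hypothetical; in ADF you would map each stored procedure parameter to the corresponding field of the current Lookup item.

```sql
-- Hypothetical call issued by the Stored Procedure Activity per record;
-- in ADF, each parameter maps to an expression such as @{item().Customer_Identifier__c}.
EXEC UpdateAccountDetails
    @Customer_Identifier = 'CUST-001',   -- from item().Customer_Identifier__c
    @N = 'Example N Name',               -- from item().N__r.Name
    @V = 'Example V Name',               -- from item().V__r.Name
    @Pg_Team_Tg_Name = 'Example Team';   -- from item().Pg_Team_Tg_Name__c
```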
Approach 2 - Using Mapping Data Flow (For Large Data Volumes)
For larger datasets, a more efficient way is to use a staging table and perform a bulk update via MERGE.
Load Data into a Staging Table First - Instead of updating directly, copy the Salesforce data into a staging table (e.g., Account_Staging).
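The staging table can simply mirror the Salesforce fields used in the update. A minimal sketch (column names and sizes are assumptions; adjust them to match your actual schema):

```sql
-- Minimal staging table mirroring the fields used in the update;
-- column names/types here are assumptions, not your actual schema.
CREATE TABLE Account_Staging (
    Customer_Identifier NVARCHAR(255) NOT NULL,
    N NVARCHAR(255) NULL,
    V NVARCHAR(255) NULL,
    Pg_Team_Tg_Name NVARCHAR(255) NULL
);
```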
Use a MERGE Statement for Bulk Updates.
MERGE INTO account AS tgt
USING Account_Staging AS src
    ON tgt.customer_identifier = src.Customer_Identifier
WHEN MATCHED THEN
    UPDATE SET
        N = src.N,
        V = src.V,
        Pg_Team_Tg_Name = src.Pg_Team_Tg_Name;
Configure ADF Pipeline -
- Copy Activity → Load Salesforce data into Account_Staging.
- Stored Procedure Activity → Execute the MERGE statement to update records in bulk.
- (Optional) Delete Staging Table Data → Truncate Account_Staging after processing.
This approach is more efficient than row-by-row updates, especially for large datasets.
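Since the Stored Procedure Activity needs a procedure to call, you can wrap the MERGE (and the optional truncate) in one. MergeAccountStaging is a hypothetical name for this sketch:

```sql
-- Hypothetical wrapper procedure so the ADF Stored Procedure Activity
-- can run the bulk update in a single call.
CREATE PROCEDURE MergeAccountStaging
AS
BEGIN
    SET NOCOUNT ON;

    MERGE INTO account AS tgt
    USING Account_Staging AS src
        ON tgt.customer_identifier = src.Customer_Identifier
    WHEN MATCHED THEN
        UPDATE SET
            N = src.N,
            V = src.V,
            Pg_Team_Tg_Name = src.Pg_Team_Tg_Name;

    -- Optional: clear the staging table after processing.
    TRUNCATE TABLE Account_Staging;
END
```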
Alternative - Mapping Data Flow (No Stored Procedure)
If you prefer a no-code approach, use Mapping Data Flow -
- Source → Pull data from Salesforce.
- Lookup → Fetch existing data from Azure SQL.
- Alter Row Transformation → Mark records as "Update" based on customer_identifier.
- Sink → Use "Allow Upserts" to update efficiently.
Recommended Approach
- For small updates, use a Stored Procedure with ForEach.
- For large updates, use a Staging Table + MERGE or a Mapping Data Flow for better performance.
I hope this information helps. If this answers your query, do click Accept Answer and Yes for "Was this answer helpful". If you have any further queries, please do let us know.