Hello Naresh Arvapalli,
Welcome to the Microsoft Q&A and thank you for posting your questions here.
I understand that you would like to optimize nested loop processing in an ADF pipeline for a dynamic JSON transformation.
To solve the problem dynamically and optimize performance:
Step 1: Extract cra5a_columnname into an array
Use a Lookup activity to return the column metadata. Because the ADF expression language does not support JSONPath-style projection (for example, value[*].cra5a_columnname), iterate over the Lookup output with a ForEach activity and use an Append Variable activity inside it to collect only the cra5a_columnname values into an array variable such as ColumnNames, using the expression:
@item().cra5a_columnname
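A minimal sketch of that extraction loop, assuming the pipeline declares an Array variable named ColumnNames and the first Lookup activity is named LookupActivity1 (these names are placeholders for your own):

```json
{
  "name": "ExtractColumnNames",
  "type": "ForEach",
  "typeProperties": {
    "items": {
      "value": "@activity('LookupActivity1').output.value",
      "type": "Expression"
    },
    "isSequential": true,
    "activities": [
      {
        "name": "AppendColumnName",
        "type": "AppendVariable",
        "typeProperties": {
          "variableName": "ColumnNames",
          "value": {
            "value": "@item().cra5a_columnname",
            "type": "Expression"
          }
        }
      }
    ]
  }
}
```

Keeping the loop sequential avoids concurrent appends to the same variable.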
Step 2: Single loop for dynamic transformation
Use a single ForEach loop to iterate over the Lookup Activity 2 response. Within the loop, use an If Condition activity to check whether the key exists in the extracted array, and apply the transformation logic dynamically. An example pipeline configuration (see the sketch after this list):
- ForEach activity: iterate over the Lookup Activity 2 response.
  @activity('LookupActivity2').output.value
- If Condition activity: check whether the key exists in the column names array.
  @contains(variables('ColumnNames'), item().key)
- Execute Pipeline activity: transform the value dynamically by passing item().key and item().value as parameters to a child pipeline for processing.
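A minimal JSON sketch of that loop, assuming the second Lookup activity is named LookupActivity2 and the child pipeline is named TransformValuePipeline with parameters KeyName and KeyValue (all placeholder names):

```json
{
  "name": "TransformLoop",
  "type": "ForEach",
  "typeProperties": {
    "items": {
      "value": "@activity('LookupActivity2').output.value",
      "type": "Expression"
    },
    "activities": [
      {
        "name": "CheckColumnExists",
        "type": "IfCondition",
        "typeProperties": {
          "expression": {
            "value": "@contains(variables('ColumnNames'), item().key)",
            "type": "Expression"
          },
          "ifTrueActivities": [
            {
              "name": "TransformValue",
              "type": "ExecutePipeline",
              "typeProperties": {
                "pipeline": {
                  "referenceName": "TransformValuePipeline",
                  "type": "PipelineReference"
                },
                "waitOnCompletion": true,
                "parameters": {
                  "KeyName": { "value": "@item().key", "type": "Expression" },
                  "KeyValue": { "value": "@item().value", "type": "Expression" }
                }
              }
            }
          ]
        }
      }
    ]
  }
}
```

Keys that are not in ColumnNames simply fall through the If Condition untouched.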
Step 3: Alternative Data Flow solution
To optimize performance for large datasets, use a Mapping Data Flow in ADF:
- Source: read the Lookup Activity 2 response.
- Derived Column: add a new column with the transformation logic. Use an exists() check to see whether the key matches any cra5a_columnname, and apply the transformation only if the key exists.
- Sink: write the transformed data to the desired location.
An example of a dynamic mapping expression:
if(exists(cra5a_columnname_array, currentColumnName), toInteger(currentColumnValue), currentColumnValue)
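As a rough alternative sketch of the Derived Column step using a column pattern, assuming the column names collected in Step 1 are passed into the data flow as an array parameter named columnNames (the parameter name is a placeholder), the pattern's matching condition and the value expression for matched columns could look like this:

```
Matching condition:              in($columnNames, name)
Expression for matched columns:  toInteger(toString($$))
```

Here the matching condition selects only the columns whose names appear in the parameter, and $$ refers to the value of each matched column, so no separate existence check is needed inside the value expression.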
I hope this is helpful! Do not hesitate to let me know if you have any other questions.
Please don't forget to close out the thread by upvoting and accepting this as the answer if it was helpful.