I am integrating a data flow activity that removes duplicates from CSV files into my existing pipeline. I need help configuring the parameters.

Lakshmi Moulya Nerella 0 Reputation points
2025-01-24T08:02:14.5133333+00:00

I created a data flow activity that removes duplicates from the source files (CSV-formatted files). I have now integrated this data flow activity into my existing pipeline, but I am not able to parameterize the source files.


It would be helpful to get some options for solving this issue.

Without attaching it to the pipeline, the data flow works completely fine and I get the correct count without duplicates. But after integrating the data flow activity, I am not able to set the source options and select the particular files I want.

Tags: Azure SQL Database, Azure Data Lake Storage, Azure Storage Accounts, Azure Blob Storage, Azure Data Factory

1 answer

  1. phemanth 13,150 Reputation points Microsoft Vendor
    2025-01-24T10:29:06.3033333+00:00

    @Lakshmi Moulya Nerella

    Thanks for using the Microsoft Q&A forum and posting your query.

    You're on the right track with your data flow activity! To parameterize the source files in your existing pipeline, I reproduced the scenario on my side. Please follow the steps below and let me know if they work for you.

    Create Parameters in Your Data Flow:

    • Open your data flow and click a blank area of the canvas to access the general properties.
    • In the settings pane, go to the Parameters tab and click New to create a new parameter.
    • Assign a name, select a type (e.g., string), and optionally set a default value.
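    Behind the canvas, the parameter you create lands in the data flow script. A minimal sketch of the declaration, assuming an illustrative parameter named sourceFilePath with a sample default value:

    ```
    parameters{
        sourceFilePath as string ("container/input/sample.csv")
    }
    ```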

    Use Parameters in Your Data Flow:

    • You can reference these parameters in any data flow expression. Parameters start with a $ sign.
    • For example, if you have a parameter named sourceFilePath, you can use it in your source transformation settings.
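    For instance, in the source transformation's file name or wildcard path setting you could reference the parameter directly, or build a path with the data flow concat function. Both lines below are illustrative, and $sourceFolder / $sourceFileName are hypothetical parameter names:

    ```
    $sourceFilePath
    concat($sourceFolder, '/', $sourceFileName)
    ```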

    Assign Parameter Values from the Pipeline:

    • In your pipeline, add an Execute Data Flow activity.
    • In the activity's Parameters tab, you will see the parameters you created in your data flow.


    • Assign values to these parameters using either the pipeline expression language or the data flow expression language.
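    In the pipeline definition, the assignment lands on the Execute Data Flow activity. A minimal JSON sketch, assuming an illustrative data flow named RemoveDuplicatesDataFlow, a data flow parameter sourceFilePath, and a pipeline parameter inputFile; note the extra single quotes, since data flow string literals are quoted (verify the exact shape against the JSON your pipeline generates):

    ```json
    {
        "name": "DedupeCsvFiles",
        "type": "ExecuteDataFlow",
        "typeProperties": {
            "dataflow": {
                "referenceName": "RemoveDuplicatesDataFlow",
                "type": "DataFlowReference",
                "parameters": {
                    "sourceFilePath": {
                        "value": "'@{pipeline().parameters.inputFile}'",
                        "type": "Expression"
                    }
                }
            }
        }
    }
    ```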

    For more detailed guidance, refer to the official documentation on parameterizing mapping data flows.

    I hope the above steps resolve the issue; please let us know if it persists. Thank you.


    If this answers your query, do click Accept Answer and Yes for was this answer helpful. And, if you have any further query do let us know.

