How to copy all tables of a Schema to .csv file format in respective folders

Zaeemul Haque 45 Reputation points
2023-12-22T21:18:17.5066667+00:00

Hi,

I am trying to copy all tables of the SALES schema from SQL Server to .csv files using ADF, and the files for each table must be copied into their respective folders.

for example.

source (SQL SERVER Tables)

    SALES.Customer
    SALES.Address
    SALES.SalesPerson
    SALES.YearSales
    etc.

Target in ADLS Gen2 via ADF (.csv format); the output should be as below:

    /Maindirectory/SALES/Customer/customer.csv
    /Maindirectory/SALES/Address/address.csv
    /Maindirectory/SALES/SalesPerson/SalesPerson.csv
    /Maindirectory/SALES/YearSales/YearSales.csv
    etc.

I believe it can be done dynamically, but I have no idea how because I am new to Azure. Please let me know any link, document, or steps that I can use.

Thanks for your help.

Tags: Azure SQL Database, Azure Functions, Azure Synapse Analytics, Azure Data Factory

Accepted answer
  1. Aswin 372 Reputation points
    2023-12-23T09:06:23.0966667+00:00

    To copy all tables of the SALES schema from SQL Server to Data Lake storage dynamically, use a Lookup activity to retrieve the required table names, then a ForEach activity that iterates over the Lookup output, and inside the ForEach a Copy activity that copies each listed table. The detailed approach is below.

    • Take a Lookup activity with a SQL Server dataset. Under Use query, select Query and enter:

        SELECT table_name AS TableName
        FROM information_schema.tables
        WHERE table_type = 'BASE TABLE' AND table_schema = 'SALES'

    This query returns the names of all base tables in the SALES schema. The column is aliased as TableName so that later expressions can reference @item().TableName consistently.
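    For reference, the Lookup output that feeds the next step looks roughly like this (a sketch; the real payload also carries extra metadata such as the effective integration runtime):

        {
            "count": 4,
            "value": [
                { "TableName": "Customer" },
                { "TableName": "Address" },
                { "TableName": "SalesPerson" },
                { "TableName": "YearSales" }
            ]
        }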

    • Then take a ForEach activity and connect it to the Lookup activity on success. In the ForEach settings, set Items to the expression @activity('Lookup1').output.value, which returns the value property of the Lookup output as an array of table names.
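    If you prefer to work in the pipeline JSON directly, the ForEach portion looks roughly like the sketch below (the activity name ForEachTable is a placeholder, and Lookup1 is assumed to be the name of the Lookup activity; the Copy activity from the next step goes inside the activities array):

        {
            "name": "ForEachTable",
            "type": "ForEach",
            "dependsOn": [
                { "activity": "Lookup1", "dependencyConditions": [ "Succeeded" ] }
            ],
            "typeProperties": {
                "items": {
                    "value": "@activity('Lookup1').output.value",
                    "type": "Expression"
                },
                "activities": [ ]
            }
        }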

    • Then, inside the ForEach activity, add a Copy activity. In the source settings of the Copy activity, use a SQL Server dataset parameterized on the table name, and pass @item().TableName as dynamic content for that parameter, with the schema fixed to SALES.
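    For example, if the source dataset exposes a string parameter named TableName (a name chosen here for illustration), the wiring could look like this:

        In the source dataset (table settings):
            Schema: SALES
            Table:  @dataset().TableName

        In the Copy activity source (dataset properties):
            TableName: @item().TableName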

    Similarly, in the sink settings, use an ADLS Gen2 delimited text (CSV) dataset. In the sink dataset connection, build the folder path and file name from @item().TableName as dynamic content, so that each table's file lands in its own folder.
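    A minimal sketch of the sink dataset file path, assuming the sink dataset also exposes a TableName parameter (set to @item().TableName in the Copy activity) and that Maindirectory from the question is the container:

        Container: Maindirectory
        Directory: @concat('SALES/', dataset().TableName)
        File name: @concat(dataset().TableName, '.csv')

    This produces paths such as /Maindirectory/SALES/Customer/Customer.csv; wrap the file-name expression in toLower() if you want lower-case file names.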

    This way, all the tables are copied dynamically to the destination container, each into its own folder.

    1 person found this answer helpful.

1 additional answer

  1. Zaeemul Haque 45 Reputation points
    2023-12-29T03:07:06.57+00:00

    Thank you very much, Aswin; your screenshot was very helpful.

    I have already completed copying the tables from SQL Server to ADLS Gen2 as .csv files using the process described above.

    I am just wondering how to do an incremental (delta) copy every day using the same process; please let me know your suggestion.

