D365 Outbound Marketing data sync to ADLS Gen2 can only target a container, not a nested folder structure.

Rabin Chaudhary 1 Reputation point
2025-01-13T03:18:27.1333333+00:00

I want to copy Outbound Marketing data from CRM, such as emails sent, delivered, clicked, and opened for a marketing campaign, to ADLS Gen 2 using Data Factory. It seems I cannot use ADF for this because these values are calculated live. I have also heard that copying these items is a heavy task for the API.

I also found another approach: synchronization from CRM directly to ADLS Gen 2, set up with a token from the storage account. But when I did that, I could only save to ADLS Gen 2 at the container level, not in a nested folder.

Q1. Can we use the D365 connector in ADF to copy these types of data? If not, why?
Q2. Can we save the data in a nested folder structure rather than at the container root? Having to create an extra container besides hot, cold, and archive breaks our current Lakehouse setup.


1 answer

  1. Vinod Kumar Reddy Chilupuri 2,310 Reputation points Microsoft Vendor
    2025-01-13T08:44:58.2233333+00:00

    Hi @Rabin Chaudhary

    Welcome to Microsoft Q&A, thanks for posting your query.

    The D365 connector in Azure Data Factory (ADF) is primarily designed to work with entities that are stored in a structured format within the D365 environment. However, the specific data you mentioned (like email sent, delivered, clicked, opened, etc.) is often calculated or aggregated in real-time, which can pose challenges for direct extraction using ADF.

    • The data you are trying to extract may not be stored as static records in D365 but rather calculated on-the-fly based on user interactions. This means that the D365 connector may not be able to access this data directly as it does not exist in a traditional entity format.
    • If the data is accessible via the D365 API, it may involve heavy API calls, which can lead to performance issues or rate limiting. This is especially true if you are trying to extract large volumes of data or if the API is not optimized for bulk data extraction.
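If you do end up pulling interaction data over the API yourself, server-side paging keeps each call small and reduces the risk of throttling. The sketch below is a hypothetical example against the Dataverse Web API; the org URL, table name, and column names are assumptions, and token acquisition (normally done via MSAL/Azure AD) is elided:

```python
import json
import urllib.request

ORG_URL = "https://yourorg.crm.dynamics.com/api/data/v9.2"  # placeholder org


def build_query(table, columns, page_size=500):
    """Return the request URL and headers for a paged OData query."""
    url = f"{ORG_URL}/{table}?$select={','.join(columns)}"
    headers = {
        "Authorization": "Bearer <token>",           # acquire via MSAL in practice
        "Prefer": f"odata.maxpagesize={page_size}",  # ask the server to page results
        "Accept": "application/json",
    }
    return url, headers


def fetch_all(table, columns):
    """Yield records page by page, following @odata.nextLink."""
    url, headers = build_query(table, columns)
    while url:
        req = urllib.request.Request(url, headers=headers)
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        yield from body["value"]
        url = body.get("@odata.nextLink")  # absent on the last page
```

Bounding the page size (rather than requesting everything in one call) is what keeps each request cheap for the service.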

Please refer to the documentation below:

    https://learn.microsoft.com/en-us/azure/data-factory/connector-dynamics-crm-office-365?tabs=data-factory

Yes, it is possible to save the data in a nested folder structure rather than at the container root in ADLS Gen 2. To do this, specify the folder path in the sink dataset of your ADF pipeline.

    • For example, if you want to save the data in a folder called "Marketing" within the "hot" container, you can specify the folder path as "hot/Marketing" in the sink dataset.
    • However, note that a deeply nested structure in ADLS Gen 2 can affect performance: the service is optimized for large-scale data storage and retrieval, and a large number of nested folders can slow down queries and data processing. It's recommended to keep the folder structure as flat as practical.
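For illustration, a delimited-text sink dataset pointing at a "Marketing" folder inside the "hot" container might look like the sketch below; the linked service name and dataset name are placeholders:

```json
{
  "name": "MarketingSink",
  "properties": {
    "linkedServiceName": {
      "referenceName": "AdlsGen2LinkedService",
      "type": "LinkedServiceReference"
    },
    "type": "DelimitedText",
    "typeProperties": {
      "location": {
        "type": "AzureBlobFSLocation",
        "fileSystem": "hot",
        "folderPath": "Marketing"
      },
      "columnDelimiter": ",",
      "firstRowAsHeader": true
    }
  }
}
```

Here `fileSystem` is the container and `folderPath` is the nested path within it, so no extra container beyond hot, cold, and archive is needed.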

    Hope the above answer helps! Please let us know if you have any further queries.


    Please do not forget to "Accept the answer" and "Up-vote" wherever the information provided helps you; this can be beneficial to other community members.

