ADF MDX query pipeline failure at source side

Nyan Lynn Tun 0 Reputation points
2024-10-25T02:42:31.24+00:00

I'm trying to load one month of data for a report from SAP BW using an SAP BW via MDX linked service in ADF.

When I load only two or three dimensions as a test, it works, but when I add more dimensions to the MDX query to replicate the report, the debug run fails with this error.

Operation on target copy data from sap bw to lakehouse failed: Failure happened on 'Source' side. ErrorCode=FailedDbOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=SAP Business Warehouse: RfcInvoke failed(RFC_ABAP_RUNTIME_FAILURE): TSV_TNEW_PAGE_ALLOC_FAILED,Source=Microsoft.DataTransfer.ClientLibrary.MashupSource,'

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. RevelinoB 3,435 Reputation points
    2024-10-25T04:29:10.05+00:00

    Hi Nyan,

    I recently encountered a similar issue with one of our customers while working on an SAP BW integration using MDX in ADF. Initially, the data load worked fine with just a few dimensions, but as more dimensions were added to replicate the report, the process failed with the exact same error: TSV_TNEW_PAGE_ALLOC_FAILED. This short dump occurs when the SAP application server runs out of extended/heap memory while processing a complex or large query.

    Here’s what worked for us:

    Reducing Data Volume:

    We started by limiting the number of dimensions and measures in the MDX query. By testing smaller sets, we found a balance that worked within the system's capacity. Adding filters (like limiting the date range) also helped reduce the data volume significantly.
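
    As a rough illustration only, a trimmed-down query might look like the sketch below. All technical names here ($ZSALES_CUBE, 0AMOUNT, 0MATERIAL, 0CUSTOMER, 0CALMONTH) are placeholders, not taken from your system; the point is to keep only the characteristics the report really needs on ROWS, use NON EMPTY, and push the time restriction into the WHERE slicer rather than returning the full set:

        // Sketch with placeholder InfoCube / characteristic names.
        SELECT
          { [Measures].[0AMOUNT] } ON COLUMNS,
          NON EMPTY
            CROSSJOIN( [0MATERIAL].[LEVEL01].MEMBERS,
                       [0CUSTOMER].[LEVEL01].MEMBERS ) ON ROWS
        FROM [$ZSALES_CUBE]
        WHERE ( [0CALMONTH].[202409] )

    Every additional characteristic added to the CROSSJOIN can multiply the size of the result set BW has to materialize, which is exactly the kind of growth that ends in a TSV_TNEW_PAGE_ALLOC_FAILED dump.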

    Splitting the Query into Chunks:

    Instead of fetching the entire month's data in one go, we broke it into smaller time-based chunks, such as weekly or even daily data, and combined the chunks later in the data lake. This approach eased the memory load on SAP BW.
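
    To sketch the chunking idea (again with placeholder names), each slice is simply the same query with a narrower slicer, for example one calendar week per run; in ADF you can parameterize the date boundaries and run one copy activity per slice:

        // Week 1 slice; the next run would use 20240908 : 20240914, and so on.
        SELECT
          { [Measures].[0AMOUNT] } ON COLUMNS,
          NON EMPTY [0MATERIAL].[LEVEL01].MEMBERS ON ROWS
        FROM [$ZSALES_CUBE]
        WHERE ( { [0CALDAY].[20240901] : [0CALDAY].[20240907] } )

    Each slice lands as its own file in the lakehouse, and the pieces are unioned afterwards.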

    Optimizing the Query in SAP BW:

    We also optimized the query on the BW side. At the customer's site, we implemented aggregates and summarized the data at a higher level before running the query, which reduced the amount of data SAP BW had to process at runtime.

    SAP Memory Tuning (Basis Involvement):

    The SAP Basis team adjusted the system’s memory parameters to allocate more memory for the process. Specifically, they increased settings like abap/heap_area_dia and other relevant parameters. This adjustment resolved the memory allocation issue.

    Checking SAP Short Dumps (ST22):

    We analyzed the short dumps in transaction ST22 to understand where the memory issue originated. It turned out that certain steps in the query were consuming excessive memory, so we optimized those areas specifically.

    Asynchronous Processing in ADF:

    To further alleviate pressure, we shifted to batch processing in ADF, breaking down the data loads and running them asynchronously. This approach gave the SAP BW system more breathing room without overwhelming it with a single large request.

    Adjusting ADF Timeout Settings:

    We also adjusted the timeout settings in ADF to give SAP BW enough time to process and respond, especially with large datasets. This minor change made a big difference when dealing with high data volumes.

    By implementing these strategies, we successfully resolved the issue and enabled smooth data loading from SAP BW to the Lakehouse.

    Let me know if this resonates with what you’re experiencing or if you need any further insights!

    I hope this helps with your query.

