Can the Azure Data Factory Copy data activity handle 500 GB of data?

Pratim Das, Partha C 326 Reputation points
2025-01-31T09:00:29.76+00:00

Hi Team,

I have a use case where I need to copy a single 500 GB file from an SFTP location to an AWS S3 bucket using Azure iPaaS services.

I know ADF doesn't currently support an S3 bucket as a sink, so my plan is to land the large file on an AWS SFTP endpoint (AWS Transfer Family), which drops it into the S3 bucket, and to use the ADF Copy activity for the transfer. For that, I need to know whether ADF can handle 500 GB.

Please help.

I'm open to any other Azure service that fits this purpose.

Regards,

Partha

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. Nandan Hegde 34,021 Reputation points MVP
    2025-01-31T09:57:44.5966667+00:00

    Hey,

    Ideally, yes — it is possible to copy that much data with ADF. Copy throughput depends mainly on how the integration runtime is configured:

    https://learn.microsoft.com/en-us/azure/data-factory/concepts-integration-runtime-performance

    You can also run the copy on a self-hosted integration runtime (SHIR), sized for the CPU and network capacity you need.
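
    As a minimal sketch (not a full pipeline), the payload below is roughly what a Copy activity definition could look like for a single-file binary copy from the source SFTP to the AWS SFTP endpoint. The dataset names ("SftpSourceFile", "AwsTransferSftpFile") and the DIU value are illustrative assumptions, not taken from your environment; it is written as a small Python script that just builds and prints the JSON so the comments can explain each setting.

    ```python
    import json

    # Illustrative Copy activity payload for one large binary file.
    # Dataset names are placeholders: create Binary datasets over the two
    # SFTP linked services first and reference them here.
    copy_activity = {
        "name": "Copy500GbFile",
        "type": "Copy",
        "inputs": [{"referenceName": "SftpSourceFile", "type": "DatasetReference"}],
        "outputs": [{"referenceName": "AwsTransferSftpFile", "type": "DatasetReference"}],
        "typeProperties": {
            # Binary copy avoids parsing/serialization overhead for a large opaque file.
            "source": {
                "type": "BinarySource",
                "storeSettings": {"type": "SftpReadSettings", "recursive": False},
            },
            "sink": {
                "type": "BinarySink",
                "storeSettings": {"type": "SftpWriteSettings"},
            },
            # Data Integration Units apply only when the copy runs on the Azure IR;
            # on a self-hosted IR, throughput is governed by the node's CPU and network.
            "dataIntegrationUnits": 32,
        },
    }

    print(json.dumps(copy_activity, indent=2))
    ```

    If you go the SHIR route instead, point both SFTP linked services at it via a "connectVia" reference ({"referenceName": "<your SHIR name>", "type": "IntegrationRuntimeReference"}) and monitor CPU and network on the SHIR node during the run.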

