Hi Team,
I have a use case where I need to copy a single 500 GB file from an SFTP location to an AWS S3 bucket using Azure iPaaS services.
I know ADF doesn't currently support an S3 bucket as a sink, so I'm planning to land the large file on an AWS SFTP endpoint, which can drop it into the S3 bucket. I plan to use the ADF Copy activity. For that, I need to know whether ADF can handle a 500 GB file.
Please help.
I'm also open to any other Azure service that fits this purpose.
Regards,
Partha
Hey,
Yes, ADF can copy that much data; see the integration runtime performance guidance:
https://learn.microsoft.com/en-us/azure/data-factory/concepts-integration-runtime-performance
You can also use a self-hosted integration runtime (SHIR), sized according to the CPU consumption you need.
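For the Copy activity itself, a minimal sketch of the pipeline activity JSON might look like the following. The dataset names (`SourceSftpBinary`, `AwsSftpBinary`) are hypothetical placeholders; the key idea is to use Binary-format datasets so the 500 GB file is streamed as-is rather than parsed:

```json
{
  "name": "CopyLargeFileToAwsSftp",
  "type": "Copy",
  "inputs": [
    { "referenceName": "SourceSftpBinary", "type": "DatasetReference" }
  ],
  "outputs": [
    { "referenceName": "AwsSftpBinary", "type": "DatasetReference" }
  ],
  "typeProperties": {
    "source": {
      "type": "BinarySource",
      "storeSettings": { "type": "SftpReadSettings" }
    },
    "sink": {
      "type": "BinarySink",
      "storeSettings": { "type": "SftpWriteSettings" }
    }
  }
}
```

With Binary datasets the Copy activity moves the file byte-for-byte, which avoids format-parsing overhead on a file this large; throughput then depends mostly on the integration runtime sizing and the bandwidth of the two SFTP endpoints.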