Azure: Uploading Large Data (Over 50GB) to a Storage Account Container

정 근오 0 Reputation points
2024-11-09T03:35:34.2466667+00:00

Hello,

Thank you for providing such a great service.

While using Azure at my company, I encountered a question I’d like to ask.

I’m trying to upload training image data over 50GB from my laptop (local) to a container in a storage account, but it seems that drag & drop isn’t sufficient.

Could you please advise on the most efficient way to perform this upload?

Thank you very much.

Azure Storage Accounts
Azure Blob Storage

2 answers

  1. Deepanshukatara-6769 10,765 Reputation points
    2024-11-09T06:36:49.81+00:00

    Hello, welcome to Microsoft Q&A.

    I think you will need to use the AzCopy utility, as a single Put Blob call cannot exceed 5 GB. Here are the steps:

    • Download AzCopy from the Azure website.
    • Use the following command to upload your file:
    azcopy copy [source] [destination] [flags]
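    For instance, a concrete invocation might look like the following sketch; the storage account name, container name, and SAS token are placeholders you would generate from the Azure portal or Storage Explorer:

    ```shell
    # Hypothetical example: recursively upload a local folder of training
    # images to a blob container. <storage-account>, <container>, and
    # <SAS-token> are placeholders -- substitute your own values.
    azcopy copy "./training-images" \
      "https://<storage-account>.blob.core.windows.net/<container>?<SAS-token>" \
      --recursive
    ```

    The `--recursive` flag uploads the folder's entire contents, which suits a directory tree of training images.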
    
    
    

    Please go through the docs below for the complete steps.

    References:

    Kindly accept if it helps

    Please let us know if you have further questions.

    Thanks

    Deepanshu


  2. Hari Babu Vattepally 715 Reputation points Microsoft Vendor
    2024-11-11T07:54:50.97+00:00

    Hi @정 근오

    Welcome to Microsoft Q&A Forum. Thanks for posting your query here!

    To upload a large amount of data (over 50 GB) to an Azure Storage container efficiently, you may consider using the AzCopy tool to upload files to Azure Blob Storage from on-premises or from the cloud. This command-line tool copies data to and from Azure Blobs, Azure Files, and Table Storage with optimal performance. AzCopy supports concurrency and parallelism, and can resume copy operations when they are interrupted, so it provides high performance for uploading and downloading large files.
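    As an illustration of the parallelism and resume features mentioned above (a sketch; the job ID is a placeholder that `azcopy jobs list` would report after an interrupted transfer):

    ```shell
    # Hypothetical sketch: raise AzCopy's parallel-connection count, then
    # resume a transfer that was interrupted. <job-id> is a placeholder.
    export AZCOPY_CONCURRENCY_VALUE=32   # number of concurrent requests
    azcopy jobs list                     # shows previous jobs and their status
    azcopy jobs resume "<job-id>"
    ```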

    If you are using drag and drop and encounter issues, consider using bulk copy programs like Robocopy or rsync. These tools are recommended for transferring large files, as they provide additional resiliency and can retry operations in case of intermittent errors.
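    For example, assuming the transfer target (such as an Azure file share) has been mounted locally as drive Z: (a placeholder), a resilient Robocopy transfer might look like this sketch:

    ```
    :: Hypothetical sketch (Windows): copy a local folder to a mounted share
    :: with multithreading (/MT:32), restartable mode (/Z), and up to 5
    :: retries with a 5-second wait (/R:5 /W:5). Paths are placeholders.
    robocopy C:\data\images Z:\images /E /MT:32 /Z /R:5 /W:5
    ```

    Restartable mode and the retry settings are what give Robocopy its resiliency against intermittent network errors.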

    If you want to upload larger files to a file share or blob storage programmatically, there is also the Azure Storage Data Movement Library.

    Choose an Azure solution for data transfer: This article provides an overview of some of the common Azure data transfer solutions. The article also links out to recommended options depending on the network bandwidth in your environment and the size of the data you intend to transfer.

    If network constraints make uploading data over the internet impractical, you can use Azure Data Box devices to transfer large datasets. You can copy your data to these devices and then ship them back to Microsoft for upload into Blob Storage.

    Upload large amounts of random data in parallel to Azure storage

    Hope this helps in resolving your problem. If the issue still persists, please feel free to reach out on the Microsoft Q&A forum. We will be glad to assist you closely.

    Please do consider "Accept the answer" and "up-vote" wherever the information provided helps you, as this can be beneficial to other community members.


