Azure Storage Account Tiering and Automation

MJ-1983 426 Reputation points
2025-02-27T15:56:10.2+00:00

We have an existing Azure storage account with a performance tier of Standard and an account kind of StorageV2 (general purpose v2).

Under Data Storage -> File Shares,

We have an existing SMB file share that contains active and historical data spanning 10 years. Our requirement is to keep data from years 1-5 in the Cool tier and historic data from years 6-10 in the Archive tier. We are unsure of the current tier for our SMB file share. Additionally, how can we automate this data tiering every year? Please note that we might update historical data locally. In this case, how can we ensure the historical data in the Archive tier remains up to date?


Azure Storage Accounts

2 answers

  1. Luciano Valinotti 0 Reputation points
    2025-02-27T16:35:51.9566667+00:00

    To check the current tier of your SMB file share, open the file share in the Azure portal and view its properties. Note that Azure file shares have their own access tiers (transaction optimized, hot, and cool) and do not support an Archive tier; Azure Blob Lifecycle Management policies apply only to blob storage, not to file shares. If you copy the historical data into blob containers in your StorageV2 account, you can use lifecycle management policies to move data between tiers based on age. You can set up rules that keep blobs in the Cool tier for years 1-5 and automatically transition them to the Archive tier after five years.

    If you update historical data locally, the lifecycle policies alone will not refresh the stored copies; you need to re-upload the changed files (for example with AzCopy). Overwritten blobs land back in an online tier, and the lifecycle rules will move them to Archive again based on their last-modified date. Keep in mind that a blob in the Archive tier cannot be read until it is rehydrated to an online tier.
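    As a sketch of such a policy (the account name, resource group, and `historical/` prefix are placeholders, not from the question), the Azure CLI can apply a lifecycle rule that moves blobs to Cool one year after their last modification and to Archive after five years:

    ```shell
    # Illustrative lifecycle policy: Cool after 1 year, Archive after 5 years.
    # Applies only to block blobs under the assumed "historical/" prefix.
    cat > policy.json <<'EOF'
    {
      "rules": [
        {
          "enabled": true,
          "name": "tier-historical-data",
          "type": "Lifecycle",
          "definition": {
            "filters": {
              "blobTypes": [ "blockBlob" ],
              "prefixMatch": [ "historical/" ]
            },
            "actions": {
              "baseBlob": {
                "tierToCool":    { "daysAfterModificationGreaterThan": 365 },
                "tierToArchive": { "daysAfterModificationGreaterThan": 1825 }
              }
            }
          }
        }
      ]
    }
    EOF

    # Apply the policy to the storage account (names are placeholders).
    az storage account management-policy create \
      --account-name <storage-account-name> \
      --resource-group <resource-group> \
      --policy @policy.json
    ```

    Because the rule keys off the last-modified date, re-uploaded files automatically restart the one-year and five-year clocks.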

    Regards


  2. Keshavulu Dasari 3,790 Reputation points Microsoft Vendor
    2025-02-28T11:50:13.11+00:00

    Hi MJ-1983,

    Yes, you are right that SMB file shares can be mapped as network drives on local machines, which isn't directly possible with Blob storage. However, there are several tools and methods you can use to copy or mirror local data to Azure Blob storage:

    AzCopy: AzCopy is a command-line tool designed for copying data to and from Azure Blob storage. It supports high-performance data transfer and can be scheduled using scripts or task schedulers.

    Azure Data Factory: Azure Data Factory is a cloud-based data integration service that allows you to create data pipelines for moving and transforming data. It can be used to copy data from on-premises to Azure Blob storage. ADF provides a graphical interface and supports various data sources and destinations.

    Robocopy with Azure Blob Storage: While Robocopy itself doesn't directly support Azure Blob storage, you can use it in combination with tools like AzCopy. For example, you can use Robocopy to copy data to a local staging area and then use AzCopy to transfer the data to Azure Blob storage.
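    A minimal sketch of that two-step combination (the directories, account, and container names are assumptions):

    ```shell
    # Step 1: mirror the source directory into a local staging area with Robocopy.
    # /MIR mirrors the tree (including deletions); /R and /W limit retries.
    robocopy "D:\data" "D:\staging" /MIR /R:2 /W:5

    # Step 2: upload the staging area to Blob storage with AzCopy.
    azcopy copy "D:\staging" "https://<storage-account-name>.blob.core.windows.net/<container-name>?<SAS-token>" --recursive
    ```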

    Resilio Connect is a real-time, cross-platform file synchronization tool that supports Azure Blob storage. It offers high-speed data transfer and synchronization capabilities, making it a robust alternative to AzCopy.

    Rclone is an open-source command-line program that supports various cloud storage providers, including Azure Blob storage. It offers features like data synchronization, copying, and mounting cloud storage as a local drive.
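    For example, assuming a remote named `azblob` has already been set up with `rclone config` (the remote, path, and container names here are placeholders):

    ```shell
    # Mirror a local directory into a blob container.
    # rclone sync only transfers files that differ between source and destination.
    rclone sync "/local/path" "azblob:container-name/historical" --progress
    ```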

    Here is an example of using AzCopy to copy data from a local directory to Azure Blob storage:

    azcopy copy "C:\local\path" "https://<storage-account-name>.blob.core.windows.net/<container-name>?<SAS-token>" --recursive
    

    For ongoing synchronization, you can set up a scheduled task or cron job to run the AzCopy command at regular intervals.
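    For instance, a nightly cron entry on Linux could use `azcopy sync`, which compares source and destination and only uploads files that changed since the last run (the paths and placeholders below are assumptions):

    ```shell
    # Run every night at 02:00; the SAS token needs write and list permissions.
    # --delete-destination=false keeps blobs even if the local file is removed.
    0 2 * * * /usr/local/bin/azcopy sync "/mnt/data" "https://<storage-account-name>.blob.core.windows.net/<container-name>?<SAS-token>" --recursive --delete-destination=false
    ```

    On Windows, the same `azcopy sync` command can be run from a Task Scheduler job instead of cron.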




    If you have any other questions or are still running into more issues, let me know in the "comments" and I would be glad to assist you.

