Copying Blobs Between Different Azure Tenants with AzCopy
This article explains how to use AzCopy to copy blobs between different Azure tenants, including solutions for public and private storage accounts, custom routes, automation, and security measures.

AzCopy
AzCopy is a command-line utility that you can use to copy blobs or files to or from a storage account. This article provides a detailed guide on how to download AzCopy, connect to your storage account, and transfer data. For more information, see the AzCopy documentation.
Transferring blob files between storage accounts in Azure can be done in various ways. In this article, I will show you how to copy blobs between two different Azure tenants. The following scenarios are considered:
- Publicly accessible storage accounts
- Storage accounts behind a custom FrontDoor route
- Storage accounts behind a custom Application Gateway route
For all examples, we use user delegation keys to generate SAS tokens. This lets us work with temporary, more tightly scoped permissions instead of the storage account's access keys.
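To illustrate, a user delegation SAS for a container can be created with the Azure CLI. The following is only a minimal sketch: the account name, container, permissions, and expiry are placeholders, and the signed-in identity needs an RBAC role such as Storage Blob Data Contributor on the storage account. The combination of --auth-mode login and --as-user makes the CLI sign the token with a user delegation key instead of an account key.
az storage container generate-sas \
--account-name {sourceStorageAccountName} \
--name {containerName} \
--permissions rl \
--expiry {expiryUtc} \
--auth-mode login \
--as-user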
"The SAS token is a string that you generate on the client side, e.g., using one of the Azure Storage client libraries. The SAS token is not tracked by Azure Storage in any way. You can create an unlimited number of SAS tokens on the client side." (Learn more)
Why Not Use the Azure Blob SDK?
Using the Azure Blob SDK is not practical in certain scenarios, especially when the source and target storage accounts are in different tenants or even in separate Azure clouds.
Example: You have a storage account in the "France Central" region and want to replicate data to "China North 3." Since the Chinese Azure infrastructure is physically and logically separate from the global Azure cloud, such scenarios cannot be directly covered by the Azure Blob SDK.
For more information about the differences between the global Azure infrastructure and the Chinese Azure cloud, see Azure China Connectivity.
Compliance requirements may also prevent setting up peering between European and Chinese infrastructures. In such cases, an independent solution like AzCopy is required.
Solution 1: Public Storage Accounts
If your organization allows the use of public storage accounts, this is the simplest option. In the following example, we use AzCopy to copy blob files between the two storage accounts.
For more information, see the AzCopy documentation for copying blobs.
azcopy copy \
--overwrite=ifSourceNewer \
--check-md5=FailIfDifferent \
"https://{sourceStorageAccountName}.blob.core.windows.net/{containerName}/{yourFilePath}?{sasToken}" \
"https://{targetStorageAccountName}.blob.core.chinacloudapi.cn/{containerName}/{yourFilePath}?{sasToken}" \
--from-to BlobBlob \
--recursive
This AzCopy command also works for entire containers or folders in public storage accounts. In that case, you specify container or folder paths instead of individual file paths, and AzCopy handles synchronizing the storage accounts, reporting which files are already up to date along with an overall synchronization status.
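If you prefer a dedicated synchronization command, azcopy sync can be used at container level as well. This is a minimal sketch under the same placeholder conventions as above; sync compares source and destination and only transfers blobs that are new or have changed (support for blob-to-blob sync depends on your AzCopy version):
azcopy sync \
"https://{sourceStorageAccountName}.blob.core.windows.net/{containerName}?{sasToken}" \
"https://{targetStorageAccountName}.blob.core.chinacloudapi.cn/{containerName}?{sasToken}" \
--recursive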
Note: This method is not allowed in many organizations because storage accounts are typically only accessible via private networks or specific access mechanisms such as FrontDoor or Application Gateway.
Solution 2: Storage Accounts Behind Custom FrontDoor or Application Gateway Routes
Another solution for accessing a storage account is to use custom routes for an Azure FrontDoor or an Azure Application Gateway.
Instead of using the default storage account domains ".blob.core.windows.net" or ".blob.core.chinacloudapi.cn", we use a custom domain of our organization. In this example, we go a step further and assign each storage account its own path-based route in FrontDoor and Application Gateway.
azcopy copy \
--overwrite=ifSourceNewer \
--check-md5=FailIfDifferent \
"https://my-custom-domain.de/mysubpath/{containerName}/{yourFilePath}?{sasToken}" \
"https://my-china-domain.cn/mysubpath/{containerName}/{yourFilePath}?{sasToken}" \
--from-to BlobBlob \
--recursive
As you can see, the actual AzCopy command has hardly changed. The only modification is replacing the default storage account domain with our custom route.
In our examples, we have exclusively focused on using SAS tokens for authentication with our storage accounts in Europe and China.
To explore further alternatives, see: AzCopy and Azure Active Directory.
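As a brief sketch, Azure AD authentication with AzCopy looks like this; the tenant ID is a placeholder, and the identity you sign in with needs an RBAC role such as Storage Blob Data Contributor. Note that a single azcopy login targets one tenant, so a cross-tenant copy usually still needs a SAS token on at least one side.
azcopy login --tenant-id {tenantId}

# or, on an Azure VM or App Service, with a managed identity:
azcopy login --identity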
Automation and SAS Token Management
Manually executing the commands can be inefficient. I therefore recommend developing a small REST application that performs the following tasks:
Automatic Generation of SAS Tokens:
The application uses an Azure Managed Identity with the necessary permissions (e.g., Storage Blob Data Contributor) to retrieve a user delegation key and create SAS tokens; a sketch of this step follows at the end of this section.
Scheduling and Monitoring Jobs:
Using the AzCopy command azcopy jobs list, you can monitor ongoing and completed jobs. This information can be utilized in your application to track progress and generate reports.
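For example, a simple status check could look like this; the job ID is a placeholder taken from the output of azcopy jobs list:
azcopy jobs list

# overall status of a single job
azcopy jobs show {jobId}

# list only the failed transfers of that job
azcopy jobs show {jobId} --with-status=Failed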
Security Measures:
- SAS tokens should have a maximum validity period of 15–30 minutes to minimize the risk in case of a leak.
- Use least privilege permissions for user delegation keys to restrict access to specific containers or actions.
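Putting the managed identity and these security recommendations together, the token generation step of such an application could look roughly like the following sketch on a Linux host. The account and container names are placeholders, --permissions r limits the SAS to read access, and the computed expiry keeps it valid for about 30 minutes:
# sign in with the system-assigned managed identity of the VM or App Service
az login --identity

# user delegation SAS: read-only, valid for roughly 30 minutes
expiry=$(date -u -d "+30 minutes" '+%Y-%m-%dT%H:%MZ')
az storage container generate-sas \
--account-name {sourceStorageAccountName} \
--name {containerName} \
--permissions r \
--expiry "$expiry" \
--auth-mode login \
--as-user \
--output tsv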
Conclusion
AzCopy is a powerful tool for efficiently and securely transferring data between storage accounts – even in complex scenarios such as using different Azure tenants or regions. By combining AzCopy with custom routes and automated applications, you can create robust solutions for your data replication needs.
One aspect to keep an eye on going forward is the additional cost that FrontDoor and Application Gateway instances can add when they are used to synchronize storage accounts.