Data Copy Using AzCopy Command

Hi folks,

Hope you are all doing well. Let me bring up a new topic today.

This topic covers data migration from an on-prem server to Azure.

If you are an engineer, think outside the box: why would you upload a large volume of data through the GUI?

It takes a lot of time. I have seen this problem myself – uploading 1 TB of data through the GUI takes far too long.

I don't have exact figures, but it can easily run to two days, and if the browser or the connection breaks in between, you have to start the upload all over again.

Microsoft suggests using the command line instead – the AzCopy tool, which you can run from PowerShell.

I did my research and built a script so that the task runs as an automation.

The expected speed of data transfer into Azure is up to 4 terabytes per hour:

Fast Data Transfer is a tool for fast upload of data into Azure – up to 4 terabytes per hour from a single client machine. It moves data from your premises to Blob Storage, to a clustered file system, or directly to an Azure VM. It can also move data between Azure regions.

If you are a cloud engineer, I recommend you start thinking this way too; otherwise your tasks will be much harder than they need to be.

Now the question is: how can anyone use AzCopy for upload and data-transfer activity?

  • Step 1 – Download AzCopy to the client machine (the location of the high-volume files)
  • Step 2 – Run PowerShell as Administrator
  • Step 3 – Log in with AzCopy (azcopy login) using the tenant ID of the target where you want to upload files (for example, an Azure storage account)
  • Step 4 – Set the PATH environment variable using the PowerShell script below, making the needed changes for your machine


   Add the AzCopy folder to your PATH environment variable.

example -

# Folder where azcopy.exe was downloaded - change this for your machine
$InstallPath = 'C:\Users\Amit\Downloads'

# Split PATH into its entries and add the folder only if it is missing
$path = $env:PATH -split ';'
if ($path -notcontains $InstallPath) {
    $path += $InstallPath
    $env:PATH = ($path -join ';') -replace ';;', ';'
    # Persist the change machine-wide (requires an elevated session)
    [Environment]::SetEnvironmentVariable('Path', $env:PATH, [System.EnvironmentVariableTarget]::Machine)
}

Most important step – run the AzCopy command:

azcopy copy [source] [destination] [flags]

Make the needed changes for your own source and destination.

AzCopy is not limited to on-prem-to-Azure copies: it can also copy from AWS S3 and Google Cloud Storage into Azure.

Use a SAS token to authenticate to the destination storage account – append it to the destination URL.
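To make the syntax concrete, here is a small Python sketch that appends a SAS token to the destination URL and assembles the copy command; the account, container, and token values are made-up placeholders:

```python
def with_sas(dest_url: str, sas_token: str) -> str:
    # Append the SAS token as the query string of the destination URL
    sep = "&" if "?" in dest_url else "?"
    return dest_url + sep + sas_token.lstrip("?")

def build_copy_command(source: str, dest_url: str, sas_token: str) -> list[str]:
    # azcopy copy <source> "<dest>?<sas>" --recursive
    # --recursive copies the whole folder tree, which is what you
    # usually want for a large on-prem data set
    return ["azcopy", "copy", source, with_sas(dest_url, sas_token), "--recursive"]

# Example (placeholder account/container/token, not real values):
# build_copy_command(r"C:\data", "https://myaccount.blob.core.windows.net/backup", "sv=...&sig=...")
```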

AzCopy works over ports 80 and 443, but only 443 (HTTPS) is secure, and it is what AzCopy uses by default.

Do some more R&D on your end, but whenever you get the chance to automate your environment, treat it with P1-incident priority.

You may hit plenty of errors while running AzCopy. If you need my help, please feel free to comment and reach out to me.
