0 votes

I have several Azure accounts. I want to copy a big blob (a 250 GB VHD) from one account to another, without downloading it to a local machine and then uploading it again.

I tried using the Microsoft utility AZCOPY to do this (keys replaced by x's):

azcopy https://accountfrom.blob.core.windows.net/neo4j/neo4j-250gb.db.vhd https://accountto.blob.core.windows.net/neo4j/neo4j-250gb.db.vhd /DestKey: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx /SourceKey:xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

But this gives me the error message: "Error parsing destination location: The remote server returned an error: (403) Forbidden."

I tested the keys and accounts by opening the accounts in CloudBerry. I got the URLs from CloudBerry as well, so I think I got those right too.

What could be the cause of the 403?

Sounds like the key is wrong for the destination. Or maybe that key doesn't have write access? – Peter Ritchie
One thing I would recommend is to trace your request/response through a tool like Fiddler; you'll get more details about the 403 error. Usually a 403 error means an issue with the key, as @PeterRitchie mentioned. – Gaurav Mantri
Tried both of your suggestions, to no avail. I've given up on AzCopy and wrote a solution in PowerShell. – user1147862
Does your destination have sufficient free storage? – Kami

7 Answers

3 votes

I tried Gaurav's suggestion of using Fiddler. This let me see the XML response from Azure Storage, which contained "AuthenticationErrorDetail: Request date header too old". It turned out I was using a virtual machine whose clock was behind. Updating the clock on the VM fixed the authorization problem.
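
If you want to verify the skew before touching the clock, here is a minimal C# sketch (the storage URL is the one from the question; any HTTPS endpoint that returns a Date header would do). Azure Storage rejects Shared Key requests whose date header is more than about 15 minutes off the server's time.

using System;
using System.Net.Http;

class ClockSkewCheck
{
    static void Main()
    {
        using (var client = new HttpClient())
        {
            // Even an unauthorized or malformed request gets a response,
            // and that response still carries a Date header to compare against.
            var response = client.GetAsync("https://accountfrom.blob.core.windows.net/").Result;
            DateTimeOffset? serverDate = response.Headers.Date;
            if (serverDate.HasValue)
            {
                // A large difference here means the local clock needs correcting.
                TimeSpan skew = DateTimeOffset.UtcNow - serverDate.Value;
                Console.WriteLine("Local clock differs from the server by: " + skew);
            }
        }
    }
}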

1 vote

AzCopy's command-line pattern is "azcopy [source] [dest] [file pattern] [options]". The [source] is treated as a folder (when copying from a local folder) or as a virtual directory (when copying from a blob); that is, AzCopy copies all the files under the source folder/virtual directory.

So in your command line, AzCopy tries to find a virtual directory named 'xxxx.core.windows.net/neo4j/neo4j-250gb.db.vhd', but that is a file, not a virtual directory.

To copy a single file from blob storage, try the command below, with a [file pattern] and the /s option:

azcopy https://accountfrom.blob.core.windows.net/neo4j/ https://accountto.blob.core.windows.net/neo4j/ neo4j-250gb.db.vhd /sourcekey:xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx /destkey:xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx /s

Note that the file pattern is treated differently when copying from a blob than when copying from a local folder: when copying from a blob, the file pattern is treated as a prefix; when copying from a local folder, it is an ordinary file-system wildcard pattern.

E.g. you can use the pattern ab* when copying from a local folder to select all files starting with 'ab', but when copying from a blob you can only specify the prefix 'ab'.

For how to use the /s option, see http://blogs.msdn.com/b/windowsazurestorage/archive/2012/12/03/azcopy-uploading-downloading-files-for-windows-azure-blobs.aspx

As for the '403 Forbidden' error: it is not related to the command-line pattern you used; something is wrong with the key or the account.

Last but not least, you can always get the latest AzCopy at aka.ms/azcopy

1 vote

Check the SAS token permissions, not only for the destination blob but for the source blob too.

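For reference, here is a minimal sketch of generating a read-only SAS for the source blob with the .NET storage client library (Microsoft.WindowsAzure.Storage, 2.x or later is assumed); the account, container, and blob names are taken from the question, and the 24-hour expiry is an arbitrary choice:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class SourceSasExample
{
    static void Main()
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=accountfrom;AccountKey=xxxx");
        CloudBlockBlob blob = account.CreateCloudBlobClient()
            .GetContainerReference("neo4j")
            .GetBlockBlobReference("neo4j-250gb.db.vhd");

        // Read permission is what a copy needs on the source;
        // the destination SAS additionally needs Write.
        var policy = new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(24)
        };

        // Appending the SAS token to the blob URI gives a URL AzCopy can use directly.
        Console.WriteLine(blob.Uri + blob.GetSharedAccessSignature(policy));
    }
}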

0 votes

Have you tried the CopyFromBlob method? I'm not sure whether it works across subscriptions, and I would test that if I had multiple subscriptions, but it is very fast and avoids the download/upload round trip.

Something like this:

//required namespaces (storage client library 1.x, where CopyFromBlob is available)
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

//set the azure containers
string sourceContainerName = "mySourceContainer";
string destinationContainerName = "myDestinationContainer";
//azure connection strings
string sourceSettingKey = string.Format("DefaultEndpointsProtocol=https;AccountName={0};AccountKey={1}", "xxxx",
                                            "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx");
string destinationSettingKey = string.Format("DefaultEndpointsProtocol=https;AccountName={0};AccountKey={1}", "xxxx",
                                            "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx");
//set up the account, client and container objects (each account needs its own client)
CloudStorageAccount sourceStorageAccount = CloudStorageAccount.Parse(sourceSettingKey);
CloudStorageAccount destinationStorageAccount = CloudStorageAccount.Parse(destinationSettingKey);
CloudBlobClient sourceClient = sourceStorageAccount.CreateCloudBlobClient();
CloudBlobClient destinationClient = destinationStorageAccount.CreateCloudBlobClient();
CloudBlobContainer sourceContainer = sourceClient.GetContainerReference(sourceContainerName);
CloudBlobContainer destinationContainer = destinationClient.GetContainerReference(destinationContainerName);

// Set permissions on the containers (public read access on blobs).
BlobContainerPermissions permissions = new BlobContainerPermissions();
permissions.PublicAccess = BlobContainerPublicAccessType.Blob;
sourceContainer.SetPermissions(permissions);
destinationContainer.SetPermissions(permissions);

//grab the blobs
CloudBlob sourceBlob = sourceContainer.GetBlobReference("mySourceBlobName");
CloudBlob destinationBlob = destinationContainer.GetBlobReference("myDestinationBlobName");
//copy the source blob to the destination
destinationBlob.CopyFromBlob(sourceBlob);
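
(Side note: CopyFromBlob comes from the 1.x storage client library; in 2.x and later the equivalent call is StartCopyFromBlob, later renamed StartCopy, which kicks off an asynchronous server-side copy whose progress you may need to poll.)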
0 votes

I got a 403 because the first key I used (generated for the storage account) was somehow defective. Simply switching to the second key fixed the problem, and of course I regenerated the first key.

0 votes

AzCopy requires the storage container from which the file is being copied, not the full path to the file.

As such:

AzCopy /Source:https://accountfrom.blob.core.windows.net/neo4j /Dest:https://accountto.blob.core.windows.net/neo4j /SourceKey:key /DestKey:key /Pattern:neo4j-250gb.db.vhd

The /Pattern option refers to the file being copied. By default, blobs copied between Azure Storage accounts are transferred server-side. See https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy#copy-blobs-in-blob-storage for further information.

0 votes

When generating the SAS key for the first time, 'Object' needs to be checked under the allowed 'Resource types' in the SAS generation form. A sensible way to debug is to generate a SAS key with everything checked and, once it works, slim the permissions down.
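
For comparison, here is a minimal sketch of generating such an account SAS with the .NET storage client library (Microsoft.WindowsAzure.Storage 5.x or later is assumed), with the 'Object' and 'Container' resource types enabled; the account name, permission set, and expiry are illustrative assumptions:

using System;
using Microsoft.WindowsAzure.Storage;

class AccountSasExample
{
    static void Main()
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=accountto;AccountKey=xxxx");

        var policy = new SharedAccessAccountPolicy
        {
            // 'Object' is the resource type that operations on individual blobs need;
            // 'Container' is included so container-level calls (e.g. listing) also work.
            ResourceTypes = SharedAccessAccountResourceTypes.Object
                          | SharedAccessAccountResourceTypes.Container,
            Services = SharedAccessAccountServices.Blob,
            Permissions = SharedAccessAccountPermissions.Read
                        | SharedAccessAccountPermissions.Write
                        | SharedAccessAccountPermissions.List,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(24)
        };

        // The returned token can be appended to blob URLs for AzCopy.
        Console.WriteLine(account.GetSharedAccessSignature(policy));
    }
}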