4
votes


Currently, in a Linux VM, I upload a single file with this command:
azure storage blob upload -q /folder/file.txt --container containerName

Is it possible to upload multiple files at the same time, with a single command?


3 Answers

6
votes

You can use a loop, like so:

#!/bin/bash

# Credentials for the target storage account
export AZURE_STORAGE_ACCOUNT='your_account'
export AZURE_STORAGE_ACCESS_KEY='your_access_key'

export container_name='name_of_the_container_to_create'
export source_folder=~/path_to_local_file_to_upload/*


echo "Creating the container..."
azure storage container create "$container_name"

# Upload every file matching the glob, using the file name as the blob name
# ($source_folder is left unquoted so the glob expands)
for f in $source_folder
do
  echo "Uploading $f file..."
  azure storage blob upload "$f" "$container_name" "$(basename "$f")"
done

echo "Listing the blobs..."
azure storage blob list "$container_name"

echo "Done"
1
vote

The command-line tool does not have an option to bulk-upload multiple files in a single invocation. However, you can use either find or a loop to upload multiple files (as sketched below), or, if doing this from Windows is an option, you can look at using the AzCopy tool (http://aka.ms/azcopy).
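
For example, a minimal sketch of the find approach; "containerName" and "./to_upload" are placeholders, and it assumes AZURE_STORAGE_ACCOUNT and AZURE_STORAGE_ACCESS_KEY are already exported as in the accepted answer:

# upload every regular file under ./to_upload, using its base name as the blob name
find ./to_upload -type f | while IFS= read -r f; do
  azure storage blob upload "$f" containerName "$(basename "$f")"
done

Note that the pipe runs the loop in a subshell; if your file names can contain newlines, find's -exec form is more robust.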

0
votes

If you have access to a recent Python interpreter on your Linux VM and all of your files are in one directory then the Azure Batch and HPC team has released a code sample with some AzCopy-like functionality on Python called blobxfer that may help with your situation. The script permits full recursive directory ingress into Azure Storage as well as full container copy back out to local storage. [full disclosure: I'm a contributor for this code]
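
A hedged usage sketch follows; the argument order (storage account, container, local path) and the --storageaccountkey flag are assumptions based on the script's documentation at the time, so check python blobxfer.py --help for the exact interface:

# recursively upload everything under ./to_upload into "containerName"
# (mystorageaccount, containerName and ./to_upload are placeholders)
python blobxfer.py mystorageaccount containerName ./to_upload \
  --storageaccountkey "$AZURE_STORAGE_ACCESS_KEY"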