I can see there is a task to upload local files to Azure Storage or a VM, but how can we download from a blob container or file share onto the pipeline agent? Currently I am using azcopy with a SAS URI. Is there a task in Pipelines that will do this using a service connection instead?
3 Answers
Since I am downloading files from an Azure Storage file share (not a blob container), azcopy works out to be a lot faster. Using az storage file download-batch was way too slow, so instead we can just use bash to call azcopy.
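For reference, the command that turned out to be too slow was roughly this (the share and account names are placeholders, and the exact flags may differ from what you need):

      az storage file download-batch \
        --account-name account-name \
        --source share-name \
        --destination /Download-Path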
Install azcopy on the pipeline agent

- task: Bash@3
  displayName: Install azcopy
  inputs:
    targetType: 'inline'
    script: |
      # install the Azure CLI (used later to generate the SAS token)
      curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
      # download and extract azcopy into the agent's tools directory
      mkdir $(Agent.ToolsDirectory)/azcopy
      wget -O $(Agent.ToolsDirectory)/azcopy/azcopy_v10.tar.gz https://aka.ms/downloadazcopy-v10-linux
      tar -xf $(Agent.ToolsDirectory)/azcopy/azcopy_v10.tar.gz -C $(Agent.ToolsDirectory)/azcopy --strip-components=1
Download with azcopy using the Azure CLI task

- task: AzureCLI@2
  displayName: Download using azcopy
  inputs:
    azureSubscription: 'Service-Connection'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # generate a SAS token for the file share, valid for the next 180 minutes
      end=`date -u -d "180 minutes" '+%Y-%m-%dT%H:%M:00Z'`
      sas=$(az storage share generate-sas -n share-name --account-name account-name --https-only --permissions lr --expiry $end -o tsv)
      # download the folder recursively and verify the MD5 hash of each file
      $(Agent.ToolsDirectory)/azcopy/azcopy copy "https://account-name.file.core.windows.net/share-name/folder/?$sas" "/Download-Path" --recursive --check-md5=FailIfDifferent
As far as I know, there is no built-in task that directly satisfies your requirement, but you can use the Azure CLI task to run the command yourself.
As an example, you can run the az storage blob download command in the Azure CLI task to download a file from Azure Blob Storage:
steps:
- task: AzureCLI@1
  displayName: 'Azure CLI'
  inputs:
    azureSubscription: {service connection}
    scriptLocation: inlineScript
    inlineScript: |
      # create a destination folder, then download the blob into it
      mkdir $(Build.SourcesDirectory)/BlobFile
      az storage blob download --container-name $(containername) --file "$(Build.SourcesDirectory)/BlobFile/{file name}" --name "{file name}" --account-key $(accountkey) --account-name $(accountname)
The logic of my suggestion is to use mkdir to create a folder in the current directory, then download the file from blob storage and save it into that folder. You can follow the same pattern to run your azcopy command.
The service connection is integrated into this task, so you can configure a service connection that can reach your storage account and then select it in the Azure CLI task.
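If you want to avoid passing the account key at all, a minimal sketch of an alternative (assuming the service connection's service principal has been granted the Storage Blob Data Reader role on the storage account; the account and container names below are placeholders) is to let the task's Azure AD login authorize the download:

- task: AzureCLI@2
  displayName: Download blobs using the service connection identity
  inputs:
    azureSubscription: {service connection}
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      mkdir -p $(Build.SourcesDirectory)/BlobFile
      # --auth-mode login reuses the identity the task signed in with,
      # so no account key or SAS token is required
      az storage blob download-batch \
        --account-name {account name} \
        --source {container name} \
        --destination $(Build.SourcesDirectory)/BlobFile \
        --auth-mode login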
For me the answer by @philthy didn't work because I ran into this issue in the Azure/azure-storage-azcopy GitHub repo.
The following did work for me.
- task: Bash@3
  displayName: Install azcopy
  inputs:
    targetType: 'inline'
    script: |
      # install the Azure CLI, then fetch and extract azcopy into the agent's tools directory
      curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
      mkdir $(Agent.ToolsDirectory)/azcopy && cd "$_"
      wget -O azcopy_v10.tar.gz https://aka.ms/downloadazcopy-v10-linux
      tar -xf azcopy_v10.tar.gz --strip-components=1
- task: AzureCLI@2
  displayName: Download using azcopy
  inputs:
    azureSubscription: my-vmssagents-service-connection
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      export STORE_NAME="data"
      export CONTAINER_NAME="data"
      export FOLDER="my_folder"
      # generate a SAS token for the container, valid for one day
      NOW=`date +"%Y-%m-%dT%H:%M:00Z"` \
      EXPIRY=`date -d "$NOW + 1 day" +"%Y-%m-%dT%H:%M:00Z"` \
      && export SAS_TOKEN=$( az storage container generate-sas \
           --account-name $STORE_NAME \
           --name $CONTAINER_NAME \
           --start $NOW \
           --expiry $EXPIRY \
           --permissions acdlrw \
           --output tsv )
      # download only the files matching the include pattern into the working directory
      $(Agent.ToolsDirectory)/azcopy/azcopy copy \
        "https://${STORE_NAME}.blob.core.windows.net/${CONTAINER_NAME}/${FOLDER}/?${SAS_TOKEN}" \
        "." --recursive --include-pattern "*c_*b.nc;left.nc;right.nc" # <-- my specific pattern