
Summary

I'm creating a CI/CD provisioning pipeline for a new Azure Storage Account within an Azure DevOps pipeline, and I'm attempting to upload files into Blob Storage using AzCopy, run from an Azure PowerShell task in the pipeline.

The Error

The script runs successfully from my local machine, but when it runs in the Azure DevOps pipeline I get the following error (ErrorDateTime is just an obfuscated ISO 8601 formatted datetime):

Assumptions

  • The storage account has been set up to allow access only from specific VNets and IP addresses.
  • It looks like the firewall or the credentials are somehow misconfigured. However, the ServicePrincipal running the script has been used successfully in other sibling pipeline tasks. To rule these problems out, I've temporarily given the ServicePrincipal Owner permissions on the subscription, and the storage account's Firewall Rules tab has "Allow trusted Microsoft services to access this storage account" enabled.

What I've tried...

  • I've successfully run the script from my local machine with my IP address in the allowed list.

  • If I enable "Allow access from All networks" on the Storage Account Firewall rules then the script runs and the file is uploaded successfully.

  • It appears that the Azure Pipelines agents, running in their own VNet, don't have access to my storage account. I would have expected that requirement to be satisfied by enabling "Allow trusted Microsoft services to access this storage account" in the firewall settings.

I'm using the following line within the Azure PowerShell task. I'm confident in the values, because everything works when "All networks" (or my IP address) is allowed and I run the script locally.

.\AzCopy.exe /Source:$SourcePath /Dest:$blobUrlDir /DestKey:$key /Pattern:$FilenamePattern /Y

Any thoughts or guidance would be appreciated.

Thanks,

SJB


3 Answers


After doing further research I came across a raised issue reporting that Azure DevOps isn't considered a trusted Microsoft service from a storage account's perspective.

My temporary workaround is to:

  • Set the DefaultAction to Allow, thereby allowing "All networks" access for the duration of the copy.
  • Set the DefaultAction back to Deny after the copy action, ensuring my VNet rules are enforced again.

Try
{
    # Temporarily open the firewall so the hosted pipeline agent can reach the account
    Update-AzureRmStorageAccountNetworkRuleSet -ResourceGroupName "$ResourceGroupName" -Name "$StorageAccountName" -DefaultAction Allow
    .\AzCopy.exe /Source:$SourcePath /Dest:$blobUrlDir /DestKey:$key /Pattern:$FilenamePattern /Y
}
Catch
{
    #Handle errors...
}
Finally
{
    # Restore the Deny default so the VNet/IP rules apply again, even if the copy failed
    Update-AzureRmStorageAccountNetworkRuleSet -ResourceGroupName "$ResourceGroupName" -Name "$StorageAccountName" -DefaultAction Deny
}

Thanks,

SJB


People seem to be getting mixed results in this GitHub issue, but the AzureFileCopy@4 task works (at least for us) after granting the "Storage Blob Data Contributor" role to the ARM connection's service principal. The step below is the only one necessary in a pipeline that deploys a repo as a static website in a blob container:

- task: AzureFileCopy@4
  displayName: 'Copy files to blob storage: $(storageName)'
  inputs:
    SourcePath: '$(build.sourcesDirectory)'
    Destination: AzureBlob
    storage: $(storageName)
    ContainerName: $web
    azureSubscription: 'ARM Connection goes here' # needs a role assignment before it'll work

(Of course, if you're using Azure CDN like we are, the next step is to purge the CDN endpoint's cache, but that has nothing to do with the blob storage error.)


Have you considered using the Azure DevOps task "Azure File Copy" instead of a PowerShell script? See: https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-file-copy?view=azure-devops