8
votes

When I try to create a production deployment of a cloud service using the Azure portal and attempt to upload a package (a .cspkg file), I get the error message:

BlobStore SAS uri command execution failed. Details: Status Code =0, StatusText = none.

And I get the same message when I attempt to upload a configuration (a .cscfg file).

The .cspkg and .cscfg files were both built by packaging a (tested) project in Visual Studio.

I only have one storage account, and its name is correctly defaulting into the "Upload a Package" wizard in the Azure portal, so I don't think I can specify the storage resource in any other way.

Any ideas on what I can do or look at?

Thanks

4 Answers

6
votes

I faced the same issue in the new Azure portal. I switched back to the old portal to upload the package and cloud config, and it worked.

1
votes

This same problem happens to me from time to time on a project that otherwise deploys to Azure without issue. Restarting Visual Studio and rebuilding the Azure project usually resolves the upload problem.

1
votes

I had the same issue. I was given contributor rights to one of our company's Azure accounts for cloud services, and I also got all the connection strings for storage, service bus, etc. But I did not get any other rights in the portal, so no storage accounts showed up when I selected this directory in the portal.

When uploading packages, Azure requires a storage account from the current subscription, and it must be visible there; it is not possible to just enter a connection string.

After I was granted rights to one storage account (and selected it), I was finally able to upload packages just fine.

It still does not work from within Visual Studio; as I understand it, co-admin rights for the subscription would be needed for that.

1
votes

I had the same issue and spent some time trying to at least make it work.

It turned out that when I created the storage account (used for my deployment) in the new portal, I had kept the default "Resource Manager" value for the "Deployment model" parameter.

I created another storage account, this time selecting "Classic" for that parameter, and pointed my deployment upload at this "classic" storage account. That fixed it! :)

I am still figuring out exactly what happened, but it looks like you need to use a "classic" storage account because you are deploying a "classic" cloud service.

In "All resources", storage accounts with different deployment models have different icons: the classic one is blue, the "Resource Manager" one is greenish. We need the blue one :)
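You can also check which deployment model your storage accounts use from the command line. This is a sketch using the legacy Azure PowerShell cmdlets (the classic Service Management module for classic accounts, the AzureRM module for Resource Manager accounts); the account name and location below are placeholders, not values from this question:

```powershell
# Requires an authenticated session with the classic (Service Management)
# Azure PowerShell module:
Add-AzureAccount

# List classic storage accounts -- these are the ones a classic cloud
# service deployment can use:
Get-AzureStorageAccount | Select-Object StorageAccountName, Location

# Resource Manager storage accounts (AzureRM module) will NOT appear in
# the classic "Upload a Package" wizard:
# Get-AzureRmStorageAccount | Select-Object StorageAccountName, Location

# If no classic account exists yet, create one (name and location are
# placeholders -- pick your own):
New-AzureStorageAccount -StorageAccountName "mydeploystorage" -Location "West Europe"
```

If `Get-AzureStorageAccount` returns nothing while the portal shows storage accounts, those accounts are likely Resource Manager ones, which matches the behavior described above.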