1 vote

I've read several posts about similar issues, like this one, but I keep getting a 403.

Initially I wrote the code in Visual Studio - an Azure Function accessing a storage blob - and everything ran fine. But when I deploy the very same function, it throws a 403! I tried the suggested fixes, such as switching to x64 and removing additional files, but nothing works.

Please note - I have verified several times that the access key is correct and valid.

So, I tried all of the following:

(1) - I wrote a simple Azure Function in the portal itself (to rule out deployment quirks) and, voilà, the same 403!

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

var storageConnection = "DefaultEndpointsProtocol=https;AccountName=[name];AccountKey=[key1];EndpointSuffix=core.windows.net";
var cloudStorageAccount = CloudStorageAccount.Parse(storageConnection);
var blobClient = cloudStorageAccount.CreateCloudBlobClient();

var sourceContainer = blobClient.GetContainerReference("landing");
CloudBlockBlob blob = sourceContainer.GetBlockBlobReference("a.xlsx");

using (var inputStream = new MemoryStream())
{
    log.Info($"Current DateTime: {DateTime.Now}");
    log.Info("Starting download of blob...");
    blob.DownloadToStream(inputStream); // <--- 403 thrown here!!
    log.Info("Download Complete!");
}

(2) - I verified the date/time by logging it; it's UTC on the function server.

(3) - I used an account SAS key generated in the portal, but it still gives a 403. I waited for over 30 seconds after generating the SAS key, to ensure that it had propagated.

var sasUri = "https://[storageAccount].blob.core.windows.net/?sv=2017-11-09&ss=b&srt=sco&sp=rwdlac&se=2019-07-31T13:08:46Z&st=2018-09-01T03:08:46Z&spr=https&sig=Hm6pA7bNEe8zjqVelis2y842rY%2BGZg5CV4KLn288rCg%3D";
StorageCredentials accountSAS = new StorageCredentials(sasUri);
var cloudStorageAccount = new CloudStorageAccount(accountSAS, "[storageAccount]", endpointSuffix: null, useHttps: true);

// rest of the code same as (1)
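
A side note on the snippet above: the StorageCredentials(string) constructor takes just the SAS token (the query string starting at ?sv=...), not the full endpoint URI, so passing the complete URI may fail for that reason alone. A minimal sketch of the token-only form, under that assumption:

// SAS token only - the query-string portion of the URI generated in the portal.
var sasToken = "?sv=2017-11-09&ss=b&srt=sco&sp=rwdlac&se=2019-07-31T13:08:46Z&st=2018-09-01T03:08:46Z&spr=https&sig=[signature]";
StorageCredentials accountSAS = new StorageCredentials(sasToken);
var cloudStorageAccount = new CloudStorageAccount(accountSAS, "[storageAccount]", endpointSuffix: null, useHttps: true);
var blobClient = cloudStorageAccount.CreateCloudBlobClient();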

(4) - I generated the SAS key on the fly in code, but again got a 403.

static string GetContainerSasUri(CloudBlobContainer container)
{
    //Set the expiry time and permissions for the container.
    //In this case no start time is specified, so the shared access signature becomes valid immediately.
    SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy();
    sasConstraints.SharedAccessStartTime = DateTimeOffset.UtcNow.AddMinutes(-5);
    sasConstraints.SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(25);
    sasConstraints.Permissions = SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Add | SharedAccessBlobPermissions.Create;

    //Generate the shared access signature on the container, setting the constraints directly on the signature.
    string sasContainerToken = container.GetSharedAccessSignature(sasConstraints);

    //Return the URI string for the container, including the SAS token.
    return container.Uri + sasContainerToken + "&comp=list&restype=container";
}

and used the above as follows:

var sourceContainer = blobClient.GetContainerReference("landing");
var sasKey = GetContainerSasUri(sourceContainer);
var container = new CloudBlobContainer(new Uri(sasKey));

CloudBlockBlob blob = container.GetBlockBlobReference("a.xlsx"); 
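
Two things worth flagging in (4): the policy grants only Write, Add and Create, while a download such as the DownloadToStream call in (1) also needs Read, and the &comp=list&restype=container suffix belongs to the List Blobs REST operation rather than to a container URI used for blob access. A sketch of the helper with those two points adjusted, using the same SDK:

static string GetContainerSasUri(CloudBlobContainer container)
{
    var sasConstraints = new SharedAccessBlobPolicy
    {
        // Backdate the start time slightly to tolerate clock skew.
        SharedAccessStartTime = DateTimeOffset.UtcNow.AddMinutes(-5),
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(25),
        // Read is required for DownloadToStream; Write/Add/Create alone are not enough.
        Permissions = SharedAccessBlobPermissions.Read
                    | SharedAccessBlobPermissions.Write
                    | SharedAccessBlobPermissions.Add
                    | SharedAccessBlobPermissions.Create
    };

    // Container URI plus the SAS query string only - no extra query parameters.
    return container.Uri + container.GetSharedAccessSignature(sasConstraints);
}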

I completely fail to understand why the code works flawlessly when run from Visual Studio, accessing the storage account (not the emulator) in the cloud, but fails when the very same code is either deployed or run directly in the portal.

What am I missing here?

Have you tried using a binding instead of accessing the blob on your own? How is your function triggered? - kamil-mrzyglod
@Kamo - My function is currently being called / tested using Postman or the portal itself, but eventually it'll be triggered via Data Factory. - sppc42
Could you provide the full exception with a stack trace? - kamil-mrzyglod

1 Answer

1 vote

Since you have excluded many possible causes, the only way I can reproduce your problem is by configuring a firewall on the Storage Account.

Locally the code works because you have probably added your local IP to the firewall's whitelist, while this step was omitted for the Function. In the portal, go to Resource Explorer under Platform features, search for outboundIpAddresses, and add those (usually four) IPs to the Storage Account whitelist.
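
If you prefer scripting it, a sketch of the same steps with the Azure CLI (resource group and resource names are placeholders):

# Show the Function App's outbound IPs (the same values Resource Explorer exposes).
az functionapp show --resource-group <rg> --name <function-app> --query outboundIpAddresses --output tsv

# Add each returned IP to the Storage Account firewall.
az storage account network-rule add --resource-group <rg> --account-name <storage-account> --ip-address <outbound-ip>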

If you have added the Function IPs but still get a 403 error, check the locations of the Storage Account and the Function App. If they live in the same region (like both in Central US), the two communicate internally without going through the outboundIpAddresses. The workaround I can offer is to create the Storage Account in a different region if the firewall is necessary in your plan; otherwise, just allow all networks to access the Storage Account.