
I'm trying to read Azure blob content with an Azure Function.

var blobName = 'my-awesome-text-blob';
blobService.getBlobToText(
    containerName,
    blobName,
    function(err, blobContent, blob) {
        if (err) {
            console.error("Couldn't download blob %s", blobName);
            console.error(err);
        } else {
            console.log("Successfully downloaded blob %s", blobName);
            console.log(blobContent);
        }
    });

The container name is always the same, and the blob name is passed in a queue message that triggers the function.

When I run this the function times out (exceeds 5 minutes).

The queue message with the blob name is correct and is displayed; the blob contains just a long JSON document of approximately 292 KB.

I've also tried to trigger the function directly when a new blob is created, but it returns an object with a stream. Do you know any method to make that stream readable?

module.exports = async function (context, myBlob) {
    context.log("JavaScript blob trigger function processed blob \n Blob:", context.bindingData.blobTrigger, "\n Blob Size:", myBlob.length, "Bytes");
    context.log("type of the blob passed:", typeof(myBlob));
};

With the code above I get the type of the blob and the length correctly,

but if I try to print it I will get:

( 2020-09-22 [Information] printing blob <Buffer 7b ....... )

Thank you for your help.


1 Answer


Just convert the Buffer to a string:

module.exports = async function (context, myBlob) {
    context.log(context.bindings.myBlob.toString());
    context.log("JavaScript blob trigger function processed blob \n Blob:", context.bindingData.blobTrigger, "\n Blob Size:", myBlob.length, "Bytes");
};