
I have a very basic understanding of Azure WebJobs, that is, they can perform tasks in the background. I want to upload files to Azure Blob Storage, specifically using an Azure WebJob. I would like to know how to do this, from scratch. Assume that the file to be uploaded is locally available on the system in a certain folder (say C:/Users/Abc/Alpha/Beta/).

  1. How and where do I define the background task that is supposed to be performed?
  2. How to make sure, that whenever a new file is available in the same folder ( C:/Users/Abc/Alpha/Beta/) the function is automatically triggered, and this new file is also transferred to Azure Blob Storage?
  3. Can I monitor progress of transfer for each file? or for all files?
  4. How to handle connection failures during transfer? and what other errors should I worry about?
I would recommend reading some tutorials on this topic first. If you have specific questions, come back and provide some sample code of your problem. – Romano Zumbé
Thanks! I am trying to do the same. If you have any resources on this, please share. – Ayush Soni

1 Answer


How and where do I define the background task that is supposed to be performed?

According to your description, you could create a WebJob console application in Visual Studio and run this console application locally.

For more details, you could refer to this article on how to create a WebJob in Visual Studio.

Notice: Since you need to watch a folder on your local machine, this WebJob runs locally rather than being deployed to an Azure web app.

How to make sure, that whenever a new file is available in the same folder ( C:/Users/Abc/Alpha/Beta/) the function is automatically triggered, and this new file is also transferred to Azure Blob Storage?

As far as I know, the WebJobs SDK Extensions support a FileTrigger: it monitors a particular directory for file additions/changes and triggers a job function when they occur.

For more details, you could refer to the code sample below:

Program.cs:

    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Files;

    class Program
    {
        static void Main()
        {
            var config = new JobHostConfiguration();

            // Set the root path of the folder the file trigger watches; the path in
            // the [FileTrigger] attribute is relative to this root. For the folder
            // in the question this would be @"C:\Users\Abc\Alpha\Beta".
            var filesConfig = new FilesConfiguration { RootPath = @"D:\" };
            config.UseFiles(filesConfig);

            var host = new JobHost(config);
            // RunAndBlock keeps the WebJob running continuously.
            host.RunAndBlock();
        }
    }

function.cs:

    using System.IO;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Files;

    public class Functions
    {
        // Triggered when a file is created or changed under <RootPath>\fileupload;
        // the file's contents are copied to the "textblobs" blob container
        // under the same name.
        public static void ImportFile(
            [FileTrigger(@"fileupload\{name}", "*.*", WatcherChangeTypes.Created | WatcherChangeTypes.Changed)] Stream file,
            FileSystemEventArgs fileTrigger,
            [Blob("textblobs/{name}", FileAccess.Write)] Stream blobOutput,
            TextWriter log)
        {
            log.WriteLine(string.Format("Processed input file '{0}'!", fileTrigger.Name));

            file.CopyTo(blobOutput);

            log.WriteLine("Upload File Complete");
        }
    }

Can I monitor progress of transfer for each file? or for all files?

As far as I know, there's a [BlobTrigger] attribute ([BlobInput] in older SDK versions) that lets you specify a container to listen on, and it includes an efficient blob listener that dispatches to the method when new blobs are detected. For more details, you could refer to this article.
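The SDK does not report byte-level progress for a single upload out of the box, but for large files you could copy the stream in chunks yourself and log a running percentage. A minimal sketch (the StreamCopy class and its CopyWithProgress signature are my own illustration, not an SDK API):

    using System;
    using System.IO;

    static class StreamCopy
    {
        // Copies source to destination in fixed-size chunks and invokes
        // reportPercent with the running percentage of totalBytes copied.
        public static void CopyWithProgress(Stream source, Stream destination,
                                            long totalBytes, Action<int> reportPercent)
        {
            var buffer = new byte[81920]; // 80 KB, the same default Stream.CopyTo uses
            long copied = 0;
            int read;
            while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                destination.Write(buffer, 0, read);
                copied += read;
                reportPercent((int)(copied * 100 / totalBytes));
            }
        }
    }

Inside the job function you could then call this helper in place of file.CopyTo(blobOutput), passing a callback that writes the percentage to the log. For progress across all files, you could keep a counter of files processed in the function.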

How to handle connection failures during transfer? and what other errors should I worry about?

You could use try/catch to catch the error. If an error happens, you could send the details to a queue or write a text file to blob storage, then follow up based on the queue message or the blob text file. Besides connection failures, you may also want to consider files that are still being written when the trigger fires, and name collisions when two local files share the same name.
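For transient connection failures, one option is to wrap the upload in a small retry loop before falling back to the queue or blob error log described above. A sketch under my own assumptions (the Retry class and its parameters are illustrative, not an SDK API):

    using System;
    using System.Threading;

    static class Retry
    {
        // Runs the action, retrying up to maxAttempts times with a fixed delay;
        // if every attempt throws, the last exception propagates to the caller.
        public static void Run(Action action, int maxAttempts = 3, int delayMs = 1000)
        {
            for (int attempt = 1; ; attempt++)
            {
                try { action(); return; }
                catch (Exception) when (attempt < maxAttempts)
                {
                    Thread.Sleep(delayMs);
                }
            }
        }
    }

If you retry a stream copy this way, remember to reset the input stream's Position to 0 before each new attempt (when the stream is seekable), otherwise the retry resumes from wherever the failed attempt stopped.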