
I have a C# console application that runs on a node in a pool in Azure Batch. The console app writes to stdout.txt on the node and also creates a copy of this file in the Azure storage account linked to the pool's batch account. This all works fine.

In order to achieve this I followed the web-page: https://docs.microsoft.com/en-us/azure/batch/batch-task-output-file-conventions.

If I create and run a pool job called 'jobid001' with a linked task called 'Task001' (the task runs the console app), then a container called 'job-jobid001' is created, containing what looks like a sub-folder called 'Task001'. This is all correctly created under the storage account linked to the pool's batch account, as shown below...

(screenshot: azure-batch-job — the 'job-jobid001' container in the linked storage account)

The C# code below creates the job container in the Azure storage account and arranges for the contents of stdout.txt on the batch node to be written asynchronously to the matching file in the storage account. This all works fine.

    using System;
    using System.Threading.Tasks;
    using Microsoft.Azure.Batch.Conventions.Files;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    // Parse the connection string and return a reference to the storage account.
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(account_url);

    // Create a container in Azure for the standard output log files to be copied to.
    // Container names must be lower-case.
    string containerName = ("job-" + _jobId).ToLower();

    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference(containerName);
    container.CreateIfNotExists();

    TaskOutputStorage taskOutputStorage = new TaskOutputStorage(storageAccount, _jobId, _taskId);

    TimeSpan stdoutFlushDelay = TimeSpan.FromSeconds(3);

    // The primary task logic is wrapped in a using statement that sends updates to
    // the stdout.txt blob in Storage every 15 seconds while the task code runs.
    using (ITrackedSaveOperation stdout =
        await taskOutputStorage.SaveTrackedAsync(
            TaskOutputKind.TaskLog,
            logFilePath,
            "stdout.txt",
            TimeSpan.FromSeconds(15)))
    {
        // MAIN C# CONSOLE CODE THAT DOES THE WORK HAS BEEN REMOVED

        // We are tracking the disk file to save our standard output, but the
        // node agent may take up to 3 seconds to flush the stdout stream to
        // disk. So give the file a moment to catch up.
        await Task.Delay(stdoutFlushDelay);
    }

What I want to achieve is to simulate a folder structure in the Azure storage account under a root container 'MyAppLogFiles'. The root container would have a simulated sub-folder structure of year, month, day and then job id, where the year, month and day come from the run date-time of the job. How do I achieve this by enhancing the C# code above?

Az-Container

-> MyAppLogFiles
   -> 2020
      -> 05
         -> 31
            -> job-jobid001
               -> Task001
                  stdout.txt
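(For context: blob storage has no true directories; a "/" in a blob name is simply rendered as a folder by the portal and storage explorers. A minimal sketch of building that prefix from the job's run date — `BuildLogPrefix` is a hypothetical helper, not part of the code above:)

```csharp
using System;

public class PrefixDemo
{
    // Builds the virtual-folder prefix for the desired layout, e.g.
    // "2020/05/31/job-jobid001/Task001". The quoted '/' in the format
    // string forces a literal slash regardless of the current culture's
    // date separator.
    public static string BuildLogPrefix(DateTime runTimeUtc, string jobId, string taskId)
    {
        return $"{runTimeUtc:yyyy'/'MM'/'dd}/job-{jobId}/{taskId}";
    }

    public static void Main()
    {
        Console.WriteLine(BuildLogPrefix(new DateTime(2020, 5, 31), "jobid001", "Task001"));
        // 2020/05/31/job-jobid001/Task001
    }
}
```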
Comment: Hello Ian, have you solved your issue? – Ivan Yang

1 Answer


I just went through the source code of Microsoft.Azure.Batch.Conventions.Files (but did not give it a try); you may try the steps below:

1. In your code, set the container name to MyAppLogFiles (note that Azure container names must be lower-case, so it will be stored as myapplogfiles).

2. In this line of code: TaskOutputStorage taskOutputStorage = new TaskOutputStorage(storageAccount, _jobId, _taskId);, you can pass the value "2020/05/31/job-jobid001/Task001" for _taskId. The referenced source code is here.
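A rough, untested sketch of combining both steps. It assumes the conventions library's TaskOutputStorage(Uri, string) overload, and that the task id is used purely as a blob-name prefix, so embedding "/" segments produces the simulated folders (the container Uri may need write permissions, e.g. via a SAS token, depending on your setup):

```csharp
// Untested sketch — assumptions noted in the lead-in above.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(account_url);

CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Container names must be lower-case.
CloudBlobContainer container = blobClient.GetContainerReference("myapplogfiles");
container.CreateIfNotExists();

// Build "2020/05/31/job-jobid001/Task001" from the job's run date.
string taskPath = $"{DateTime.UtcNow:yyyy'/'MM'/'dd}/job-{_jobId}/{_taskId}";

// Point the conventions library at the container directly, so it does not
// derive a "job-..." container name from the job id.
TaskOutputStorage taskOutputStorage =
    new TaskOutputStorage(container.Uri, taskPath);
```

The rest of the original code (SaveTrackedAsync and the flush delay) should work unchanged, since it only sees the taskOutputStorage instance.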