
I have a .net core web app that I deploy to azure using Azure Devops build/release pipelines. This project references Business project and a Models project that are part of the three-tier solution. The Models project consists of Entity Framework 6 code first models (including migrations).

Recently I have had to deploy a triggered webjob in order to accomplish a long running task. This was just created as a normal .Net console app and then published from within Visual Studio 2017 by selecting "Publish as Azure Web Job". This webjob is published to and runs under the .net core web app service mentioned above. It references the same Models and Business project that the .net core web app references.

My issue is that whenever the model is changed by introducing db migrations, the web job also must be updated since the models.dll that is published as part of the webjob project resides separately in a directory app_data\jobs\triggered\webjob under the main web app.

Is there any way to configure my webjob so that the models.dll and business.dll are directly referenced from that of the main web app? Failing that, how can I modify the Azure devops process to copy these files to the directory of the webjob upon successful deploy? Is there a guide for this?

Just checking in to see if the information provided was helpful. Please let us know if you would like further assistance. – Leo Liu-MSFT

1 Answer


how can I modify the Azure devops process to copy these files to the directory of the webjob upon successful deploy? Is there a guide for this?

AFAIK, you could use a PowerShell script with a git command to check whether the latest changes touch the models.dll and business.dll files, like:

# List the files changed by the most recent commit
$editedFiles = git diff HEAD HEAD~ --name-only
$editedFiles | ForEach-Object {
    Switch -Wildcard ($_) {
        # Set a pipeline variable when a shared assembly has changed
        'SubFolderA/models.dll*' { Write-Output "##vso[task.setvariable variable=UpdateFile]True" }
        # Add the rest of your path filters here
    }
}

Code comes from here.

Then add a custom condition to the next task in the build pipeline:

and(succeeded(), eq(variables['UpdateFile'], 'True'))
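For example, in a YAML pipeline the condition could be attached to the follow-up task like this (the task setup and script name here are placeholders for illustration, not part of the original answer):

```yaml
steps:
- task: PowerShell@2
  displayName: 'Copy shared DLLs to the webjob folder'
  # Runs only when the previous step set UpdateFile to True
  condition: and(succeeded(), eq(variables['UpdateFile'], 'True'))
  inputs:
    targetType: filePath
    filePath: 'copy-webjob-dlls.ps1'
```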

In that task, you could use the Kudu zip API to push the updated files into the webjob's directory after a successful deploy:

GET /api/zip/{path}/
Zip up and download the specified folder. The zip doesn't include the top folder itself. Make sure you include
the trailing slash!

PUT /api/zip/{path}/
Upload a zip file which gets expanded into the specified folder. Existing files are not deleted
unless they need to be overwritten by files in the zip. The path can be nested (e.g. `folder1/folder2`), and needs to exist.
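As a rough sketch of calling the zip API, the following zips the shared DLLs and PUTs them to the webjob folder. The app name, webjob path, and deployment credentials are placeholders you would replace with your own values; the helper names are illustrative, not an official SDK:

```python
import base64
import io
import os
import urllib.request
import zipfile


def build_zip(folder):
    """Zip the files in `folder` at the root of the archive,
    which is the layout the Kudu zip API expects."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name in sorted(os.listdir(folder)):
            zf.write(os.path.join(folder, name), arcname=name)
    return buf.getvalue()


def put_zip(app_name, path, zip_bytes, user, password):
    """PUT a zip to Kudu so it is expanded into `path`.
    Note the trailing slash on the path -- the API requires it."""
    url = f"https://{app_name}.scm.azurewebsites.net/api/zip/{path}/"
    req = urllib.request.Request(url, data=zip_bytes, method="PUT")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    with urllib.request.urlopen(req) as resp:
        return resp.status


# Example (placeholders):
# data = build_zip("drop/shared-dlls")  # contains models.dll, business.dll
# put_zip("myapp", "site/wwwroot/app_data/jobs/triggered/webjob",
#         data, os.environ["KUDU_USER"], os.environ["KUDU_PASS"])
```

The deployment user/password here are the app's publishing credentials, which you could expose to the pipeline as secret variables.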

You could check the similar thread and the Kudu zip API document for more details.

Hope this helps.