0 votes

Intro

My scenario is that I want to re-use shared code from a repo in Azure DevOps across multiple projects. I've built a pipeline that produces a wheel as an artifact so I can download it to other pipelines.

The situation

Currently I have successfully set up a pipeline that deploys the Python Function App. The app is running fine and stable. I use SCM_DO_BUILD_DURING_DEPLOYMENT=1 and ENABLE_ORYX_BUILD=1 to achieve this.

I am now in the position that I want to use the artifact (a Python/pip wheel) mentioned in the intro. I've added a step to the pipeline and I am able to download the artifact successfully. The next step is ensuring that the wheel is installed during my Python Function App zip deployment, and that is where I am stuck.
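For illustration, a download step of this kind might look like the following sketch, assuming the DownloadPipelineArtifact@2 task; the project name, pipeline ID, artifact name and target path are placeholders, not the actual values used here:

- task: DownloadPipelineArtifact@2
  inputs:
    source: 'specific'                 # download from the pipeline that builds the wheel
    project: 'MyProject'               # placeholder project name
    pipeline: 42                       # placeholder: definition ID of the wheel-building pipeline
    runVersion: 'latest'
    artifact: 'wheels'                 # placeholder artifact name
    path: '$(Build.SourcesDirectory)/wheels'   # land the wheel in the wheels/ folder of the package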

The structure of my zip looks like:

__app__
 | - MyFirstFunction
 | | - __init__.py
 | | - function.json
 | | - example.py
 | - MySecondFunction
 | | - __init__.py
 | | - function.json
 | - wheels
 | | - my_wheel-20201014.10-py3-none-any.whl                            <<< this is my wheel
 | - host.json
 | - requirements.txt 

The problem

I've tried adding POST_BUILD_COMMAND and PRE_BUILD_COMMAND settings to have pip install the wheel, but the package does not seem to be found (by Oryx/Kudu) when I use the command: -POST_BUILD_COMMAND "python -m pip install --find-links=home/site/wwwroot/wheels my_wheel". Azure DevOps does not throw any exception or error message, but when I execute the function I get an exception: Failure Exception: ModuleNotFoundError: No module named 'my_wheel'.
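For clarity, PRE_BUILD_COMMAND and POST_BUILD_COMMAND are application settings on the Function App. A minimal sketch of setting them from the pipeline with the AzureAppServiceSettings@1 task, assuming the wheels folder ends up under /home/site/wwwroot (the service connection and app name are placeholders):

- task: AzureAppServiceSettings@1
  inputs:
    azureSubscription: 'my-service-connection'   # placeholder service connection
    appName: 'my-function-app'                   # placeholder Function App name
    appSettings: |
      [
        { "name": "SCM_DO_BUILD_DURING_DEPLOYMENT", "value": "1", "slotSetting": false },
        { "name": "ENABLE_ORYX_BUILD", "value": "1", "slotSetting": false },
        { "name": "POST_BUILD_COMMAND", "value": "python -m pip install --find-links=/home/site/wwwroot/wheels my_wheel", "slotSetting": false }
      ]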

My question is: how can I change my solution so that the build installs my_wheel correctly?

Sidenote: Unfortunately I am not able to use the Artifacts feed from Azure DevOps to publish my_wheel and let pip consume that feed.

2
You could go to the Kudu console to check whether this wheel is indeed installed, or try running the pip install command to install this wheel before deploying the application. BTW, you could consider using the pre-defined feature "Check out multiple repositories in your pipeline" for this SharedCode repository. – gigatt
@Wesley Does Doris's solution work for you? Or, if you have resolved this issue in another way, could you please share your solution? It will help others who encounter this issue in the future. – Edward Han-MSFT
Hi @EdwardHan-MSFT. I haven't found a working solution yet, but I am still trying. Currently I am trying the solution @gigatt suggested: I removed all the pre- and post-build commands to see which variables/configurations I need to get the function running. I am still stuck on the ModuleNotFoundError when I configure the app with ENABLE_ORYX_BUILD and SCM_DO_BUILD_DURING_DEPLOYMENT set to true. The module-not-found error now concerns the first module it encounters, so it seems the deployment doesn't run a pip install of the requirements at all. – Wesley
Thanks for your reply. Hope gigatt's solution works. You could try adding a separate Command Line task to run a pip install -r requirements.txt command. – Edward Han-MSFT

2 Answers

0 votes

Here is how my custom wheel works in VS Code locally: [screenshot of the local run]

Navigate to your DevOps project, edit the pipeline YAML file, and add a script step that installs the wheel file:

pip install my_wheel-20201014.10-py3-none-any.whl

Like this: [screenshot of the pipeline YAML step]
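Since the screenshot is not reproduced here, a minimal sketch of such a script step (the working directory is a placeholder for wherever the downloaded wheel lives):

- script: |
    pip install my_wheel-20201014.10-py3-none-any.whl
  workingDirectory: '$(Pipeline.Workspace)/wheels'   # placeholder: folder containing the downloaded wheel
  displayName: 'Install custom wheel'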

Enable App Service logs and navigate to Log Stream to check whether it works on Azure:

[screenshot of the Log Stream output]

0 votes

I have solved my issue by checking out the repository of my shared code and including the shared code in the function app package.
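A sketch of that multi-repository checkout, assuming the shared code lives in a second Azure Repos Git repository (the alias, project/repository names and package folder are placeholders):

resources:
  repositories:
    - repository: SharedCode            # placeholder alias for the shared-code repo
      type: git                         # Azure Repos Git
      name: MyProject/SharedCode        # placeholder: <project>/<repository>

steps:
  - checkout: self                      # lands in $(Build.SourcesDirectory)/<self repo name>
  - checkout: SharedCode                # lands in $(Build.SourcesDirectory)/SharedCode
  # copy the shared package into the function app folder before it is zipped (illustrative names)
  - script: cp -r SharedCode/my_shared_package MyFunctionAppRepo/__app__/my_shared_package
    displayName: 'Include shared code in the package'   # MyFunctionAppRepo is a placeholder for the self-checkout folder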

I also replaced the AzureFunctionApp@1 task with the AzureCLI@2 task and now deploy the function app with an az functionapp deployment source config-zip command. I set the application settings via a separate AzureAppServiceSettings@1 step in the pipeline.
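A sketch of that deployment step (the service connection, resource group, app name and zip path are placeholders):

- task: AzureCLI@2
  inputs:
    azureSubscription: 'my-service-connection'   # placeholder service connection
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az functionapp deployment source config-zip \
        --resource-group my-resource-group \
        --name my-function-app \
        --src $(Build.ArtifactStagingDirectory)/functionapp.zip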

Reference: the AzureCLI@2 task documentation.

It is not exactly the way I wanted to solve this, because I still have to include the requirements of the shared code in the root requirements.txt as well.

Switching from the AzureFunctionApp@1 task to AzureCLI@2 gives me more feedback in the pipeline; the deployment result should be the same.