I have been using Firebase Functions so far. This time, I would like to use Firebase Functions and Cloud Run in the same project. Is there a good way to share code between Firebase Functions and Cloud Run? Will it work just by creating a symbolic link, or do I need to create a private npm package?

My development environment is Windows 10 with Node 8. I'm using firebase-tools to deploy Firebase Functions and intend to use the gcloud command to deploy to Cloud Run.

--- edit

I examined how to use a private npm package with Google Cloud Source Repositories, but it looks a little cumbersome for this small project. Then I found some useful information:

Local npm dependency “does not a contain a package.json file” in docker build, but runs fine with npm start https://stackoverflow.com/a/58368933/7908771

I tried the approach described there, as shown below, and it seems to work fine.

--- edit 2

Project folder structure:

project/ ................... <- specify project root when `docker build`
├─ .dockerignore ........... 
├─ containers/ ............. 
│  ├─ package.json ......... <- gather docker commands here
│  └─ hello-world/ ......... 
│     ├─ Dockerfile ........ <- COPY modules from functions in this Dockerfile
│     ├─ index.js .......... <- require modules from here
│     ├─ package-lock.json . 
│     └─ package.json ...... 
├─ functions/ .............. 
│  ├─ index.js ............. 
│  ├─ package-lock.json .... 
│  ├─ package.json ......... 
│  ├─ modules/ ............. 
│  │  ├─ cloudStorage/ ..... <- wanted module
│  │  │  ├─ index.js ....... 
│  │  │  ├─ package.json ... 
│  │  │  └─ test.js ........ 

In Dockerfile:

WORKDIR /usr/src/app
COPY functions/modules ../../functions/modules
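For reference, a fuller sketch of what such a Dockerfile could look like. The base image, port, and start command are my assumptions, not part of the original post; only the WORKDIR and the COPY of the shared modules come from it:

```dockerfile
# Sketch only; base image, port, and CMD are assumptions
FROM node:8-slim

WORKDIR /usr/src/app

# Copy the shared modules to the same relative location they occupy locally,
# so require('../../functions/modules/...') resolves identically in and out
# of the container (this lands at /functions/modules)
COPY functions/modules ../../functions/modules

# Install this container's own dependencies
COPY containers/hello-world/package*.json ./
RUN npm install --production

# Copy the container's source
COPY containers/hello-world/ ./

EXPOSE 8080
CMD ["node", "index.js"]
```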

.dockerignore on project root:

**/.git
# ignore only the root node_modules, so node_modules inside the wanted modules still get COPYed
node_modules

Require the copied functions modules from JS inside the container:

const cloudStorage = require('../../functions/modules/cloudStorage')

Build script:

{
  "scripts": {
    "build:hello-world": "docker build ../ --tag gcr.io/project/hello-world -f hello-world/Dockerfile"
  }
}

This way, the modules can be loaded both when testing the JS outside the container and when running inside the container.

Will this cause any problems? Please let me know if there is anything wrong.

Thank you.

1 Answer


A symlink will not work at all: each deployment to Cloud Functions or Cloud Run is completely independent of the others, and they don't share any filesystems. You will have to deploy the shared code with each product. A private npm package might work. The bottom line is that your package.json will have to describe where to get the code, or the code will have to be part of the deployment.
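As one illustration of "package.json will have to describe where to get the code": npm dependencies can point directly at a Git repository, so each deployment fetches the shared code at install time. The package name and repository URL below are placeholders, not something from the question:

```json
{
  "dependencies": {
    "shared-modules": "git+https://source.developers.google.com/p/PROJECT_ID/r/shared-modules"
  }
}
```

Note that a private repository like this requires credentials to be available at `npm install` time, which is part of the cumbersomeness the asker mentioned.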