From what I can tell, Google Cloud Functions works like this:
- your modules go inside a `functions` directory
- that `functions` directory then contains a `package.json` file holding the shared dependencies across all your modules
- a module can contain many exported functions
- google cloud functions and firebase functions have different opinions on how to handle exported functions in modules
- for gcp, it seems to work like this: the module is uploaded, and then you specify via the web interface or command line which exported method should be called from the loaded module
- for firebase, it seems to work like this: the listener methods from `firebase-functions` return the handler, but also attach the trigger metadata to that handler. The `firebase-tools` CLI then requires your code locally, grabs the exported functions, and creates a cloud function for each exported method based on the metadata attached by `firebase-functions`. As such, if you put all your cloud functions in the same module, that module gets deployed once per cloud function, and for each cloud function the entire module is loaded before the one specific exported function is called.
- if you configure an exported function as an HTTP trigger, it is wrapped by an unspecified version of Express.js, with an undocumented set and ordering of bundled middleware
This is strange as:
- even if the modules `one.js` and `two.js` require different packages at runtime, the shared `package.json` between them means their startup time will be slower than if they were deployed individually, as each will need to install all of the shared dependencies rather than just its own
- if you have several exported functions inside `index.js`, such as `hi()` and `hello()`, then the `hi` cloud function will also have `hello()` loaded in memory despite not using it, and likewise the `hello` cloud function will have `hi()` in memory: both resulting cloud functions use the same `index.js` file, so loading that module loads everything inside it even when parts aren't needed
As such, what is the best practice for making sure your cloud functions run optimally, with the lightest runtime footprint possible? It seems Google's design decisions mean that the more cloud functions you make, the more junk gets bundled with each one, slowing them down and costing more.
As an aside, it seems to me that this would have been a better approach for google: each cloud function gets its own directory, and each directory contains a `package.json` file and an `index.js` file. The `index.js` file then does a `module.exports = function (...args) {}` or an `export default function (...args) {}`.
This way the architecture aligns with how one expects cloud functions to operate (a cloud function represents a single function), rather than a cloud function being: the installation of the dependencies shared between all your cloud functions, then the loading of a module that can contain multiple cloud functions of which only one is used, then the execution of only one function out of that loaded module.
Funnily enough, Azure Functions seems to be designed exactly the way I expect cloud functions to operate: https://docs.microsoft.com/en-us/azure/azure-functions/functions-reference-node