I'm new to GCP, Cloud Functions, and the Node.js ecosystem, so any pointers would be very helpful.
I want to write a GCP Cloud Function that does the following:
- Read the contents of a file (sample.txt) stored in Google Cloud Storage.
- Copy it to the local file system (or just console.log() it).
- Run the code locally with the functions-emulator for testing.
Result: a 500 INTERNAL error with the message 'function crashed'. The function logs show the following:
2019-01-21T20:24:45.647Z - info: User function triggered, starting execution
2019-01-21T20:24:46.066Z - info: Execution took 861 ms, finished with status: 'crash'
Below is my code, taken mostly from the GCP Node.js sample code and documentation.
exports.list_files = (req, res) => {
    const fs = require('fs');
    const {Storage} = require('@google-cloud/storage');
    const storage = new Storage();
    const bucket = storage.bucket('curl-tests');
    bucket.setUserProject('cf-nodejs');
    const file = bucket.file('sample.txt'); // the file has a couple of lines of text
    const localFilename = '/Users/<username>/sample_copy.txt';
    file.createReadStream()
        .on('error', function (err) { })
        .on('response', function (response) {
            // Server connected and responded with the specified status and headers.
        })
        .on('end', function () {
            // The file is fully downloaded.
        })
        .pipe(fs.createWriteStream(localFilename));
};
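In case it's useful, I'd be equally happy with the non-streaming approach. This is roughly what I mean (only a sketch based on my reading of the @google-cloud/storage docs, which I haven't verified locally; it assumes the same 'curl-tests' bucket and 'sample.txt' object as above):

exports.list_files = (req, res) => {
    const {Storage} = require('@google-cloud/storage');
    const storage = new Storage();
    // file.download() with no destination resolves with [Buffer] holding the contents.
    storage.bucket('curl-tests').file('sample.txt').download()
        .then(([contents]) => {
            console.log(contents.toString('utf8'));
            res.status(200).send(contents.toString('utf8'));
        })
        .catch((err) => {
            console.error(err);
            res.status(500).send(err.message);
        });
};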
I run it like this:
functions call list_files --trigger-http
ExecutionId: 4a722196-d94d-43c8-9151-498a9bb26997
Error: { error:
   { code: 500,
     status: 'INTERNAL',
     message: 'function crashed',
     errors: [ 'socket hang up' ] } }
Eventually, I want to keep certificates and keys in Storage buckets and use them to authenticate with a service outside of GCP; that is the bigger problem I'm trying to solve, and a rough sketch of what I have in mind follows below. For now, though, I'm focusing on resolving the crash.
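To give a concrete picture of that bigger goal, this is roughly the shape of it (only a sketch; the bucket, the object names, and the external host below are placeholders, not anything that exists yet):

const {Storage} = require('@google-cloud/storage');
const https = require('https');

exports.call_external = (req, res) => {
    const storage = new Storage();
    const bucket = storage.bucket('curl-tests'); // placeholder bucket

    // Download the key and cert into memory (placeholder object names).
    Promise.all([
        bucket.file('client-key.pem').download(),
        bucket.file('client-cert.pem').download(),
    ]).then(([[key], [cert]]) => {
        // Use them for client TLS auth against the external service (placeholder host/path).
        const options = {
            hostname: 'external.example.com',
            path: '/api/ping',
            method: 'GET',
            key: key,
            cert: cert,
        };
        const request = https.request(options, (response) => {
            let body = '';
            response.on('data', (chunk) => { body += chunk; });
            response.on('end', () => res.status(response.statusCode).send(body));
        });
        request.on('error', (err) => res.status(500).send(err.message));
        request.end();
    }).catch((err) => res.status(500).send(err.message));
};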