In a web application there is a Google Cloud Storage bucket where pages and layouts are stored along with static content. There is a Google Compute Engine server with a Node.js app that serves pages, using the layouts and dust.js to generate markup. Problem: Google Cloud Storage returns old/deprecated file content to the Google Compute Engine instance, but not to the same code started locally outside of GCE. The locally started code gets fresh content. Even a process or machine restart does not solve this. The standard @google-cloud/storage package is used in this project to access the bucket content.
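For context, the question does not show the read path, but with the standard package it presumably looks something like this minimal sketch (the function name, bucket, and file names here are illustrative, not from the project):

```javascript
// Hypothetical sketch of the read path using the standard client.
// Names are illustrative; the question does not show the actual code.
async function loadTemplate(bucketName, filePath) {
  // Lazy require so the sketch stands alone even where the package
  // is not installed.
  const {Storage} = require('@google-cloud/storage');
  const storage = new Storage();
  // file.download() resolves to [Buffer] holding the file contents.
  const [contents] = await storage.bucket(bucketName).file(filePath).download();
  return contents.toString('utf8');
}
```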
1 Answer
I was able to solve this problem myself with code like this:
const request = require('request'); // the 'request' package (now deprecated), or any HTTP client

const STORAGE_DOWNLOAD_BASE_URL = 'https://storage.googleapis.com';
const bucketName = 'yoursuperbucket';

/**
 * Builds a download URL with a unique query string so each request
 * looks like a distinct resource and caches are bypassed.
 * @param {string} bucketName
 * @param {string} fileName
 * @return {string}
 */
function getUniqueDownloadUrl(bucketName, fileName) {
    return [STORAGE_DOWNLOAD_BASE_URL, bucketName, fileName].join('/')
        + '?no-cache=true&ignore-cache=true&anti-cache=' + Date.now();
}

/**
 * Loads file content over plain HTTP instead of @google-cloud/storage.
 * @param {string} bucketName
 * @param {string} filePath
 * @return {Promise<string>}
 */
function loadFileContent(bucketName, filePath) {
    return new Promise(function (resolve, reject) {
        request({
            url: getUniqueDownloadUrl(bucketName, filePath),
            method: 'GET',
            headers: {
                'Cache-Control': 'no-cache'
            }
        }, function (err, response) {
            if (err) {
                console.error('Failed to load file content: ' + filePath + ' from bucket ' + bucketName + ' - ' + err);
                return reject(err);
            }
            resolve(response.body);
        });
    });
}
For buckets that are not publicly readable, this relies on the service account bound to your GCE instance.
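A related knob worth noting: Cloud Storage serves each object with its stored `cacheControl` metadata, so if the staleness comes from HTTP-level caching of the objects, it can also be addressed at the source by marking the objects non-cacheable. A sketch with the standard package (bucket and file names are placeholders, and this assumes cache metadata is the culprit):

```javascript
// Sketch: set Cache-Control metadata on an object so HTTP caches
// stop serving stale copies. Names here are placeholders.
async function disableCaching(bucketName, filePath) {
  // Lazy require so the sketch stands alone even where the package
  // is not installed.
  const {Storage} = require('@google-cloud/storage');
  const storage = new Storage();
  await storage
    .bucket(bucketName)
    .file(filePath)
    .setMetadata({cacheControl: 'no-cache, max-age=0'});
}
```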