7 votes

I have a Django project built on Google Cloud Platform. We are using Django's auth system, and most (nearly all) users do not have credentials set up in the GCP project, so all file auth needs to be based on Django and not GCP.

Our backend configuration for file storage is very basic, and files are successfully uploaded to GCS as expected:

DEFAULT_FILE_STORAGE = 'storages.backends.gcloud.GoogleCloudStorage'
GS_BUCKET_NAME = 'my-bucket'

The generated URLs for files are of the form (again, as expected):

https://storage.googleapis.com/my-bucket/my-document.txt

The problem is this bucket cannot be made publicly readable as files have access controls based on rules set up in Django's permission system that are different per user.

How can I have Django serve the file instead of the file being served by GCS?

One thought that comes to mind is to have views that load files from GCS and pass them through to the requesting client. I suspect this will not handle large files well, though: I would need to either load the entire file into memory (bad) or read the file in chunks and write them out to the response stream as they arrive, and I don't know whether that can be done in Django.

another possible solution here: andrewbrookins.com/django/… – seawolf
How did you end up managing this situation? I have a similar issue: stackoverflow.com/questions/63969069/… – Prikers
I basically "solved" it as mentioned in the comment under the accepted answer. – seawolf

1 Answer

3 votes

If your application is what determines access to the files (via Django's auth system), then you'll have to use the pass-through approach you're describing.

If the files are large, you can stream the response instead of loading and sending the entire response at once -- see django.http.StreamingHttpResponse.