10 votes

Goal

We would like users to be able to upload images to Google Cloud Storage.

Problem

We could achieve this indirectly with our server as a middleman -- first, the user uploads to our server, then our privileged server uploads to Cloud Storage.

However, we think this is unnecessarily slow, and instead would like the user to upload directly to Cloud Storage.

Proposed Solution

To achieve a direct upload, we generate a Signed URL on our server. The Signed URL specifies an expiration time, and can only be used with the HTTP PUT verb. A user can request a Signed URL, and then - for a limited time only - upload an image to the path specified by the Signed URL.

Problem with the Solution

Is there any way to enforce a maximum file upload size? Obviously we would like to avoid users attempting to upload 20GB files when we expect <1MB files.

It seems like this is an obvious vulnerability, yet I don't know how to address it while still using SignedURLs.

There seems to be a way to do this using Policy Documents (Stack Overflow answer), but the question is over 2 years old now.


5 Answers

5 votes

Policy documents are still the right answer. They are documented here: https://cloud.google.com/storage/docs/xml-api/post-object#policydocument

The important part of the policy document you'll need is:

["content-length-range", <min_range>, <max_range>].
3 votes

For anyone looking at this answer today, be aware that the extension header

x-goog-content-length-range: 0,25000

is the way to limit the upload size to between 0 and 25000 bytes in Cloud Storage.

X-Upload-Content-Length will not work; you will still be able to upload larger files.

1 vote

Signing content-length should do the trick.

Google Cloud will reject any upload whose actual size is larger than the signed content-length value.

This is what the signed URL options should look like:

import { GetSignedUrlConfig } from '@google-cloud/storage';

const writeOptions: GetSignedUrlConfig = {
  version: 'v4',
  action: 'write',
  expires: Date.now() + 900000, // 15 minutes
  extensionHeaders: {
    'content-length': length, // desired length in bytes
  },
};
0 votes

You can use X-Upload-Content-Length instead of Content-Length. See blog post here.

On the server side (Java):

Map<String, String> extensionHeaders = new HashMap<>();
extensionHeaders.put("X-Upload-Content-Length", "" + contentLength);
extensionHeaders.put("Content-Type", "application/octet-stream");

var url =
  storage.signUrl(
    blobInfo,
    15,
    TimeUnit.MINUTES,
    Storage.SignUrlOption.httpMethod(HttpMethod.PUT),
    Storage.SignUrlOption.withExtHeaders(extensionHeaders),
    Storage.SignUrlOption.withV4Signature()
  );

On the client side (TypeScript):

const response = await fetch(url, {
  method: 'PUT',
  headers: {
    'X-Upload-Content-Length': `${file.size}`,
    'Content-Type': 'application/octet-stream',
  },
  body: file,
});

You will need to set up a CORS policy on your bucket:

[
  {
    "origin": ["https://your-website.com"],
    "responseHeader": [
      "Content-Type",
      "Access-Control-Allow-Origin",
      "X-Upload-Content-Length",
      "x-goog-resumable"
    ],
    "method": ["PUT", "OPTIONS"],
    "maxAgeSeconds": 3600
  }
]
0 votes

My working NodeJS code follows https://blog.koliseo.com/limit-the-size-of-uploaded-files-with-signed-urls-on-google-cloud-storage/. You must use version v4:

 public async getPreSignedUrlForUpload(
    fileName: string,
    contentType: string,
    size: number,
    bucketName: string = this.configService.get('DEFAULT_BUCKET_NAME'),
  ): Promise<string> {
    const bucket = this.storage.bucket(bucketName);
    const file = bucket.file(fileName);

    const response = await file.getSignedUrl({
      action: 'write',
      contentType,
      extensionHeaders: {
        'X-Upload-Content-Length': size,
      },
      expires: Date.now() + 60 * 1000, // 1 minute
      version: 'v4',
    });

    const signedUrl = this.maskSignedUrl(response[0], bucketName);
    return signedUrl;
  }

On the frontend, we must set the same size value in the X-Upload-Content-Length header:


export async function uploadFileToGCP(
  signedUrl: string,
  file: any
): Promise<any> {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.withCredentials = process.env.NODE_ENV === 'production';

    xhr.addEventListener('readystatechange', function () {
      if (this.readyState === 4) {
        // Resolve only on success; reject on error so callers can handle it.
        if (this.status >= 200 && this.status < 300) {
          resolve(this.responseText);
        } else {
          reject(new Error(`Upload failed with status ${this.status}`));
        }
      }
    });

    xhr.open('PUT', signedUrl, true);
    xhr.setRequestHeader('Content-Type', file.type);
    xhr.setRequestHeader('X-Upload-Content-Length', file.size);

    xhr.send(file);
  });
}

Also, don't forget to configure the responseHeader list in the bucket's CORS settings:

gsutil cors get gs://asia-item-images
[
  {
    "maxAgeSeconds": 3600,
    "method": ["GET", "OPTIONS", "PUT"],
    "origin": ["*"],
    "responseHeader": [
      "Content-Type",
      "Access-Control-Allow-Origin",
      "X-Upload-Content-Length",
      "X-Goog-Resumable"
    ]
  }
]