I'd like to be able to serve up SAS URIs to clients so they may upload large blobs to Azure Blob Storage. As far as I can tell, there's no way to do this while limiting the size of blobs being uploaded. I've thought about introducing an intermediary proxy service that inspects the size of the payload before uploading it to storage (and thus giving up on the SAS URI approach), but I feel like there must be another way. Thoughts?
1
votes
What is the limit you want to set?
– Thomas
@Thomas For instance, a file size limit of 3 GB.
– kylemart
I think a proxy service is overkill for big files, as you would have to re-implement all the good features provided by the storage account. Can every client upload up to 3 GB of data in total, or does each file have to be less than 3 GB? Sorry, I'm curious: what is your requirement for this size limit?
– Thomas
@Thomas The idea is that each blob (i.e., file) would be limited to 3 GB.
– kylemart
Yeah, but why? Can people upload as many files as they want?
– Thomas
1 Answer
1
votes
Per my investigation, a SAS URI can control permissions, validity period, access type, IP address, etc. However, it cannot limit the size of an individual uploaded blob. Please check this feedback.
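To make that concrete, here is a stdlib-only sketch of how a service SAS query string is assembled. It shows the knobs a SAS does expose (permissions, validity window, IP range, protocol): none of them is a maximum blob size. The string-to-sign below is deliberately simplified and will not produce a signature Azure accepts; only the query parameter names (`sv`, `sp`, `st`, `se`, `sip`, `spr`, `sig`) match the real ones. In a real application you would use `generate_blob_sas` from the `azure-storage-blob` SDK instead.

```python
import base64
import hashlib
import hmac
from datetime import datetime, timedelta, timezone
from urllib.parse import urlencode

def make_sas_query(account_key_b64, canonical_resource,
                   permissions="cw", hours=1, ip_range=None):
    """Build an illustrative SAS query string (simplified signing).

    Every field a SAS can constrain is listed here; note the absence
    of any parameter that caps the size of the blob being uploaded.
    """
    now = datetime.now(timezone.utc)
    start = now.strftime("%Y-%m-%dT%H:%M:%SZ")
    expiry = (now + timedelta(hours=hours)).strftime("%Y-%m-%dT%H:%M:%SZ")
    version = "2020-08-04"
    # Simplified string-to-sign; the real format has more fields and a
    # strict order defined by the Azure Storage REST documentation.
    string_to_sign = "\n".join([
        permissions, start, expiry, canonical_resource,
        ip_range or "", "https", version,
    ])
    key = base64.b64decode(account_key_b64)
    sig = base64.b64encode(
        hmac.new(key, string_to_sign.encode(), hashlib.sha256).digest()
    ).decode()
    params = {"sv": version, "sp": permissions, "st": start,
              "se": expiry, "spr": "https", "sig": sig}
    if ip_range:
        params["sip"] = ip_range  # optional IP restriction
    return urlencode(params)
```

Because the signature covers only these fields, the storage service has no way to reject an upload for being too large once the client holds a valid SAS.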
I think you could create a separate container for the application to limit the total size of storage used; you could check the limitation doc. Or you could implement this limit in your application.
Hope this helps.