I'm designing an Azure application in which remote clients will 'stream' data and images at roughly one write per second. Data will go to Table storage, and images will go to Blob storage.
I may want to run logic before these writes are accepted: for instance, limiting the write frequency, validating the data to guard against bugs or tampering, or performing supporting operations such as generating thumbnails or posting messages to Service Bus.
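To make the kind of gatekeeping I have in mind concrete, here is a minimal sketch of per-client write throttling using a token bucket. This is illustrative only; the `TokenBucket` class and its parameters are hypothetical, and in practice this logic would run in whatever service fronts the storage accounts.

```python
import time


class TokenBucket:
    """Allows roughly `rate` writes/sec per client, with bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate            # tokens replenished per second
        self.capacity = capacity   # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill tokens based on elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        # Accept the write only if a whole token is available.
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False


# One bucket per client, checked before each write is accepted.
bucket = TokenBucket(rate=1.0, capacity=5)  # ~1 write/sec with a small burst allowance
accepted = [bucket.allow() for _ in range(10)]
```

In a tight loop like this, only the initial burst is accepted; later writes are rejected until tokens refill at the configured rate.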
One option is to pipe all operations through a REST service running on a worker role; the service would perform the needed operations and then push the data out to storage. However, given that clients can write to the storage services directly (with shared access signatures securing access), this seems like an unnecessary bottleneck, even though additional instances can be spun up. Further, running a dedicated role adds cost if there is an opportunity to push this logic somewhere else.