I have a number of files that I transferred into Azure Blob Storage via Azure Data Factory. Unfortunately, this tool doesn't appear to set the Content-MD5 property on any of the blobs, so when I pull that value from the Blob Storage API, it comes back empty.
I'm aiming to transfer these files out of Azure Blob Storage and into Google Cloud Storage. The documentation for Google's Storage Transfer service at https://cloud.google.com/storage/transfer/reference/rest/v1/TransferSpec#HttpData indicates that I can easily initiate such a transfer if I supply a list of the files giving the URL, length in bytes, and MD5 hash of each.
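If I'm reading those docs correctly, the list it consumes is a TSV file that starts with a `TsvHttpData-1.0` header line, followed by one tab-separated line per file: URL, length in bytes, and Base64-encoded MD5 digest. A minimal sketch of the file I'd have to generate (the format details here are my reading of the docs, not tested):

```python
import base64

# My reading of the URL-list format from the linked docs: a header line
# "TsvHttpData-1.0", then one tab-separated line per file with its URL,
# length in bytes, and Base64-encoded MD5 digest.
def tsv_line(url: str, length: int, md5_digest: bytes) -> str:
    return "\t".join([url, str(length), base64.b64encode(md5_digest).decode()])

with open("urllist.tsv", "w") as out:
    out.write("TsvHttpData-1.0\n")
    # ...one tsv_line(url, length, digest) + "\n" per file goes here
```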
Well, I can easily pull the first two from the Azure Storage API, but the third doesn't get populated automatically, and I can't find any way to make Azure Storage compute it for me.
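To illustrate (using the azure-storage-blob v12 Python package; the connection string and container name are placeholders for mine), listing the blobs gives me the name and size just fine, but `content_md5` comes back as `None` for every blob:

```python
from azure.storage.blob import ContainerClient

CONN_STR = "<azure-storage-connection-string>"  # placeholder
container = ContainerClient.from_connection_string(CONN_STR, "mycontainer")

for props in container.list_blobs():
    # size comes back fine; content_md5 is None on every one of these blobs
    print(props.name, props.size, props.content_settings.content_md5)
```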
Unfortunately, my other options look limited. The possibilities so far:
- Download each file to my local machine, compute the hash, and update the blob's Content-MD5 value (the hashing loop is sketched after this list)
- See if I can't write an Azure Functions app in the same region that runs that same loop, calculating the hash and writing it back for each blob in the container
- Egress the files from Data Factory to Amazon S3, then use Google's support for importing from S3 (per https://cloud.google.com/storage/transfer/reference/rest/v1/TransferSpec#AwsS3Data) to pull them from there, but this really seems like a waste of bandwidth (and I'd have to set up an Amazon account).
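For what it's worth, the core of #1 and #2 would be the same loop; here's the rough, untested sketch I have in mind (again azure-storage-blob v12, with placeholders for my connection string and container name):

```python
import hashlib
from azure.storage.blob import ContainerClient

CONN_STR = "<azure-storage-connection-string>"  # placeholder
container = ContainerClient.from_connection_string(CONN_STR, "mycontainer")

for props in container.list_blobs():
    blob = container.get_blob_client(props.name)

    # Stream the blob in chunks so large files never have to fit in memory.
    md5 = hashlib.md5()
    for chunk in blob.download_blob().chunks():
        md5.update(chunk)

    # set_http_headers replaces all of the blob's HTTP headers, so reuse
    # its existing ContentSettings and change only content_md5.
    settings = props.content_settings
    settings.content_md5 = bytearray(md5.digest())
    blob.set_http_headers(content_settings=settings)
```

Run locally, that's #1 and every byte has to cross my connection; dropped into an Azure Functions app in the same region, the downloads should stay inside the datacenter, which is why #2 appeals more.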
Ideally, I want to be able to write a script, hit go, and leave it alone. I don't have the fastest download rate from Azure, so #1 would be less than desirable, as it'd take a long time.
Does anyone have any other approaches?