
I have a PowerShell script that downloads a JSON file stored in an Azure storage account blob. This file is in UTF-8 encoding. The script reads the JSON, makes the changes, creates a new JSON file with the same name, and uploads it back to the storage account using the Set-AzureStorageBlobContent cmdlet. However, all the applications that were using that JSON file stopped working. After hours of troubleshooting, I noticed that when the script uploads the new JSON back to the storage container (replacing the existing one), it uploads it in UTF-16 encoding.

Is there a parameter in the Set-AzureStorageBlobContent cmdlet where I can specify the encoding? I looked through the official documentation but couldn't find an answer.

Before I upload the new JSON, all the values are stored in a variable, and I use the ConvertTo-Json cmdlet to generate the new JSON file. Is there a parameter in ConvertTo-Json to specify the encoding?

Right now, all I use to upload the file is:

$jsonContent | ConvertTo-Json -Depth 4 | Out-File C:\P3\myFile.json

Set-AzureStorageBlobContent -Context $storageContext -Container "myContainer" -File "C:\P3\myFile.json" -Force
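I suspect the culprit is the Out-File step rather than the upload itself, since Out-File in Windows PowerShell 5.x defaults to UTF-16LE ("Unicode"). A sketch of forcing UTF-8 at that step (an assumption on my part, not something I have confirmed fixes it):

```powershell
# Out-File defaults to UTF-16LE ("Unicode") in Windows PowerShell 5.x,
# so request UTF-8 explicitly. Note: 5.x still prepends a UTF-8 BOM;
# PowerShell 7+ writes BOM-less UTF-8 by default.
$jsonContent | ConvertTo-Json -Depth 4 |
    Out-File -FilePath 'C:\P3\myFile.json' -Encoding utf8
```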

Please advise!


2 Answers


Yes, you can. Try the command below:

Set-AzureStorageBlobContent -Context $context -Container "111" -File "C:\Users\joyw\Desktop\testjson.json" -Properties @{"ContentEncoding" = "UTF-8"} -Force


If you capture the HTTP request that PowerShell sends, you will find the header x-ms-blob-content-encoding: UTF-8.



I figured out the solution:

$JSONConvert = $jsonContent | ConvertTo-Json -Depth 4
# $false = no byte-order mark (BOM), so the file is written as plain UTF-8
$JSONEncode = [System.Text.UTF8Encoding]::new($false)
[System.IO.File]::WriteAllLines('C:\P3\myFile.JSON', $JSONConvert, $JSONEncode)

Set-AzureStorageBlobContent -Context $storageContext -Container "myContainer" -File "myFile.JSON" -Properties @{"ContentEncoding" = "UTF-8"} -Force

This writes the file as BOM-less UTF-8, so a UTF-8 encoded JSON file is uploaded to the blob.
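If you want to double-check the result before uploading, inspecting the file's first bytes shows whether a BOM was written, and which one. This check is my own addition, not part of the fix above:

```powershell
# A UTF-16LE BOM is 0xFF 0xFE; a UTF-8 BOM is 0xEF 0xBB 0xBF.
$bytes = [System.IO.File]::ReadAllBytes('C:\P3\myFile.JSON')[0..2]
if ($bytes[0] -eq 0xFF -and $bytes[1] -eq 0xFE) { 'UTF-16LE BOM' }
elseif ($bytes[0] -eq 0xEF -and $bytes[1] -eq 0xBB -and $bytes[2] -eq 0xBF) { 'UTF-8 BOM' }
else { 'No BOM (likely BOM-less UTF-8)' }
```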