Situation
I'm using an Azure Logic App to perform the following steps:
- Load/read a JSON file from blob storage. This file is created and written in R, using the toJSON function from the jsonlite package
- Get the file content from this JSON file
- Convert it to JSON (using the json() expression)
- Insert the converted output into Azure Table Storage
Please see the screenshot of this workflow
I've created a JSON file in R (using the jsonlite package). When printed in the R console, it looks fine:
{"PartitionKey":"pdf","RowKey":"robject)test","nr":"24330087","status":true}
This is the format I want. When the data is in this format, the Logic App inserts it into Azure Table Storage without problems.
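For reference, the R code that writes this file looks roughly like this (a minimal sketch: the values are placeholders, and auto_unbox = TRUE is used so that scalars are not wrapped in arrays, matching the output above):

# Sketch: writing the record as a single JSON object (placeholder values)
library(jsonlite)

record <- list(
  PartitionKey = "pdf",
  RowKey       = "robjecttest",
  nr           = "24330087",
  status       = TRUE
)

json_string <- toJSON(record, auto_unbox = TRUE)  # auto_unbox keeps scalars unwrapped
writeLines(json_string, "robjecttest_parameters_db.json")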
Problem
But when I write the JSON file above to Azure Blob storage and use this file as input in the Logic App workflow, escaped quotes cause problems.
The file from blob storage isn't interpreted by the Logic App in the desired format (without escapes): it includes the escape backslashes (\), and I think these cause the problems in the Logic App. As a test, I pasted a hardcoded version without these backslashes, and that worked. But hardcoding isn't an option here.
Below is the format the Logic App json() expression makes of it. It includes the dreaded escapes:
[
"{\"PartitionKey\":\"pdf\",\"RowKey\":\"coolblue_factuur_1744212700.pdf\",\"kvknr\":\"24330087\",\"active_status\":true}"
]
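This bracketed, escaped output is exactly what you get when an already-serialized JSON string is serialized a second time. A quick illustration in R:

# Double-encoding demo: serializing a JSON string again yields the escaped array above
library(jsonlite)

record <- list(
  PartitionKey  = "pdf",
  RowKey        = "coolblue_factuur_1744212700.pdf",
  kvknr         = "24330087",
  active_status = TRUE
)

once  <- toJSON(record, auto_unbox = TRUE)   # {"PartitionKey":"pdf",...}  -- a JSON object
twice <- toJSON(as.character(once))          # ["{\"PartitionKey\":\"pdf\",...}"]  -- a JSON string inside an array

So it looks like the file content is treated as a plain string somewhere along the way and encoded a second time.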
The following error then occurs:
{
"odata.error": {
"code": "InvalidInput",
"message": {
"lang": "en-US",
"value": "An error occurred while processing this request.\nRequestId:xxxx\nTime:2019-11-20T09:02:46.6051827Z"
}
}
}
After some online research, it looks like the Logic App is having difficulty with the escape backslashes (\) and the double quotes they escape ("). So all these \" escapes are choking the flow.
Question
How do I deal with the \" escapes? All I want is for the Logic App to correctly read the JSON file from blob storage, convert it, and insert the data into Table Storage. So ideally: convert the JSON file from blob storage into a format without escapes. I tried to pre-process this in R, but this did not work (see the sketch below).
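The pre-processing I tried in R was along these lines (a rough sketch, not the exact code): parse the escaped string back into an R object and write it out again as plain JSON.

# Rough sketch of the attempted pre-processing (illustrative, not the exact code)
library(jsonlite)

escaped <- "{\"PartitionKey\":\"pdf\",\"RowKey\":\"robjecttest\",\"nr\":\"24330087\",\"status\":true}"
parsed  <- fromJSON(escaped)                  # back to a named R list
clean   <- toJSON(parsed, auto_unbox = TRUE)  # plain JSON again
writeLines(clean, "robjecttest_parameters_db.json")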
Extra info
Below you can find the steps I took in the Logic App.
The JSON file uploaded to (and fetched from) blob storage:
{
"headers": {
"Pragma": "no-cache",
"Transfer-Encoding": "chunked",
"Retry-After": "15",
"Vary": "Accept-Encoding",
"x-ms-request-id": "xxxx",
"Strict-Transport-Security": "max-age=31536000; includeSubDomains",
"X-Content-Type-Options": "nosniff",
"X-Frame-Options": "DENY",
"Timing-Allow-Origin": "*",
"x-ms-apihub-cached-response": "true",
"Cache-Control": "no-store, no-cache",
"Date": "Wed, 20 Nov 2019 09:09:52 GMT",
"Location": "https://[location url]",
"Set-Cookie": "[cookieset]",
"Content-Type": "application/json",
"Expires": "-1",
"Content-Length": "452"
},
"body": {
"Id": "xyz",
"Name": "robjecttest_parameters_db.json",
"DisplayName": "robjecttest_parameters_db.json",
"Path": "/path/robjecttest_parameters_db.json",
"LastModified": "2019-11-20T09:09:39Z",
"Size": 95,
"MediaType": "application/octet-stream",
"IsFolder": false,
"ETag": "\"[etag]\"",
"FileLocator": "[filelocations]",
"LastModifiedBy": null
}
}
From the headers, the content type is application/json. But in the body (which describes the file I want), the MediaType is application/octet-stream.
After a 'Get blob content' action, the output is:
{
"$content-type": "application/octet-stream",
"$content": "[content]"
}
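The $content field is base64-encoded. To inspect the raw text of the blob, an expression like this can be used (assuming the action is named 'Get_blob_content', as in the expression below):

base64ToString(body('Get_blob_content')['$content'])

This should return the decoded file content as plain text.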
The next step is to convert/cast the body data to JSON, using the expression and dynamic content from the Logic App:
json(body('Get_blob_content'))
Use this output as the entity to insert into Table Storage.
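For reference, the insert step is configured roughly like this (a sketch of the 'Insert Entity' action; the table name is a placeholder):

Insert Entity (Azure Table Storage)
  Table:  parametersTable
  Entity: json(body('Get_blob_content'))

It is this Entity value that arrives as the escaped array shown above instead of a plain JSON object.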