
Situation

I'm using Azure Logic App to do the following steps:

  1. Load/read a JSON file from blob storage. This file is created and written in R, using the toJSON function from the jsonlite package.
  2. Get the file content from this JSON file.
  3. Convert it to JSON (using the json() expression).
  4. Insert the converted output into Azure Table Storage.

Please look at the screenshot of this workflow:

[screenshot: Azure Logic App workflow]

I've created a JSON file in R (using the jsonlite package). When printed in the R console, it looks fine:

{"PartitionKey":"pdf","RowKey":"robject)test","nr":"24330087","status":true}

This is the 'formatting' I want. When the data is in this format, Logic App inserts it smoothly into Azure Table Storage.
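
For reference, a minimal R sketch (assuming the record is built as a named list; this is not necessarily the original script) that produces exactly this single-object output with jsonlite:

library(jsonlite)

# Build the record as a named list; auto_unbox = TRUE keeps scalar values
# as plain JSON values instead of one-element arrays.
record <- list(
  PartitionKey = "pdf",
  RowKey       = "robject)test",
  nr           = "24330087",
  status       = TRUE
)

toJSON(record, auto_unbox = TRUE)
# {"PartitionKey":"pdf","RowKey":"robject)test","nr":"24330087","status":true}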

Problem

But when I write the JSON file above to Azure Blob and use this file as input in a Logic App workflow, the escape characters cause problems.

The file from blob storage isn't interpreted by Logic App in the desired format (i.e. without escaped quotes): it includes the escape backslashes (\), and I think this is what causes the problems in Logic Apps. When I pasted a hardcoded version without these backslashes, it worked, but hardcoding isn't an option here.

Below is what the Logic App json() expression makes of it. It includes the dreaded escapes.

[
  "{\"PartitionKey\":\"pdf\",\"RowKey\":\"coolblue_factuur_1744212700.pdf\",\"kvknr\":\"24330087\",\"active_status\":true}"
]

And then this Error occurs:

{
  "odata.error": {
    "code": "InvalidInput",
    "message": {
      "lang": "en-US",
      "value": "An error occurred while processing this request.\nRequestId:xxxx\nTime:2019-11-20T09:02:46.6051827Z"
    }
  }
}

After some online research, it looks like Logic App is having difficulty with the escape backslashes (\) and the double quotes they escape ("). All these \" escapes are choking the flow.
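
For what it's worth, the output shown above (a JSON string wrapped in an array, with escaped quotes) is exactly the shape you get when text that is already JSON gets serialized a second time. A sketch of how that can happen with jsonlite in R (variable names are illustrative, not taken from the original script):

library(jsonlite)

record <- list(
  PartitionKey  = "pdf",
  RowKey        = "coolblue_factuur_1744212700.pdf",
  kvknr         = "24330087",
  active_status = TRUE
)

# First serialization: a character string that contains the JSON text.
json_text <- as.character(toJSON(record, auto_unbox = TRUE))

# Serializing that string again escapes the inner quotes and wraps the
# result in an array, the same shape as shown above.
toJSON(json_text)
# ["{\"PartitionKey\":\"pdf\",\"RowKey\":\"coolblue_factuur_1744212700.pdf\",\"kvknr\":\"24330087\",\"active_status\":true}"]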

Question

How do I deal with the \" escapes (escaped quotes)? All I want is for Logic App to correctly read the JSON file from blob storage, convert it, and insert the data into Table Storage. So ideally: convert the JSON file from blob storage into a format without the escapes. I tried to preprocess this in R, but that did not work.
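
One way to attack it at the source is to make sure the JSON is serialized only once in R and written as plain text. A sketch with jsonlite (the file name and field names are illustrative):

library(jsonlite)

record <- list(
  PartitionKey  = "pdf",
  RowKey        = "coolblue_factuur_1744212700.pdf",
  kvknr         = "24330087",
  active_status = TRUE
)

# write_json() serializes exactly once and writes the plain JSON text,
# so the file contains {"PartitionKey":"pdf",...} without extra escaping.
write_json(record, "robjecttest_parameters_db.json", auto_unbox = TRUE)

# Sanity check: reading the file back should give the original list,
# not a string wrapped in an array.
str(fromJSON("robjecttest_parameters_db.json"))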

Extra info

Below you can find the steps I took in logic app:

The JSON file uploaded to (and fetched from) blob storage:

{
  "headers": {
    "Pragma": "no-cache",
    "Transfer-Encoding": "chunked",
    "Retry-After": "15",
    "Vary": "Accept-Encoding",
    "x-ms-request-id": "xxxx",
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "X-Content-Type-Options": "nosniff",
    "X-Frame-Options": "DENY",
    "Timing-Allow-Origin": "*",
    "x-ms-apihub-cached-response": "true",
    "Cache-Control": "no-store, no-cache",
    "Date": "Wed, 20 Nov 2019 09:09:52 GMT",
    "Location": "https://[location url]",
    "Set-Cookie": "[cookieset]",
    "Content-Type": "application/json",
    "Expires": "-1",
    "Content-Length": "452"
  },
  "body": {
    "Id": "xyz",
    "Name": "robjecttest_parameters_db.json",
    "DisplayName": "robjecttest_parameters_db.json",
    "Path": "/path/robjecttest_parameters_db.json",
    "LastModified": "2019-11-20T09:09:39Z",
    "Size": 95,
    "MediaType": "application/octet-stream",
    "IsFolder": false,
    "ETag": "\"[etag]\"",
    "FileLocator": "[filelocations]",
    "LastModifiedBy": null
  }
}

From the headers, the Content-Type is 'application/json', but for the body (which points to the data I want) the MediaType is 'application/octet-stream'.

After the 'Get blob content' action:

{
  "$content-type": "application/octet-stream",
  "$content": "[content]"
}

The next step is to convert/cast the body data to JSON, using the following expression (via dynamic content) in Logic App:

json(body('Get_blob_content'))

This output is then used as the entity to insert into Table Storage.

Comments

Hury Shen: Hi, may I know if the solution I provided solved your problem? If you still have a problem, please feel free to let me know.

rgms: Hi Hury, so far it did not work. I'm trying a more elegant solution, using an HTTP Request trigger. This almost works, but fails at the final table insertion: it says 'invalid input', even though the input seems to be valid JSON. Please see below.

rgms: { "PartitionKey": "pdf", "RowKey": "filename.pdf", "kvk_reg_date": "2019-10-14T00:00:00", "kvk_found_date": "2019-09-08T00:00:00", "is_company_active": true, "kvk_time_stamp": "2019-11-21T09:23:47", "kvk_nr_database": "12345678", "kvk_rsin": "12345678", "invoice_kvk_nr": { "kvk_nr": "12345678", "kvk_direct": true }, "postalcode_4digits_search_vector": "1234", "company_names_vector": "example names" }

Hury Shen: Hi, I tested it just now and updated the answer below. If I remove the data in the red box, it does not show the error message.

1 Answer


You just need to replace the \" with " using the expression below:

replace(string(body('Get_blob_content')), '\"', '"')

And then we can use json() to convert it, for example json(replace(string(body('Get_blob_content')), '\"', '"')).

Update:

In your JSON data to insert into Table Storage, the data in the red box below (the nested invoice_kvk_nr object) causes the problem:

[screenshot of the problematic nested data]

Azure Table Storage entities are flat, so you can't insert an entity that contains a second level of nested data.
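
Since Table Storage entities are flat property/value sets, one possible workaround is to flatten the nested fields into top-level properties before serializing. A sketch in R, assuming the record is built as a nested list like the one in the comments above (the prefixed property names are just a suggestion):

library(jsonlite)

record <- list(
  PartitionKey    = "pdf",
  RowKey          = "filename.pdf",
  kvk_nr_database = "12345678",
  invoice_kvk_nr  = list(kvk_nr = "12345678", kvk_direct = TRUE)
)

# Promote the nested fields to top-level properties with prefixed names,
# then drop the nested object itself.
flat <- c(
  record[names(record) != "invoice_kvk_nr"],
  setNames(record$invoice_kvk_nr,
           paste0("invoice_kvk_nr_", names(record$invoice_kvk_nr)))
)

toJSON(flat, auto_unbox = TRUE)
# {"PartitionKey":"pdf","RowKey":"filename.pdf","kvk_nr_database":"12345678",
#  "invoice_kvk_nr_kvk_nr":"12345678","invoice_kvk_nr_kvk_direct":true}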