
I've run into a strange issue: when a machine learning module is deployed to an IoT Edge device, the output of that module cannot be consumed by Azure Stream Analytics (ASA) in the cloud. For example, the ASA input sample returns nothing even though messages are being sent (verified with VS Code and other tools). I followed this tutorial: https://docs.microsoft.com/en-us/azure/iot-edge/tutorial-deploy-machine-learning

When messages are sent using the temperature sensor simulator instead, ASA samples the input correctly and can consume the data: https://docs.microsoft.com/en-us/azure/iot-edge/tutorial-simulate-device-linux

The only difference I can see is how the JSON is formed. I wonder if the 'applicationProperties' part causes the issue?

ML JSON sent to IoT Hub: 15/02/2018 2:42:14 PM> Device: [DSVM], Data:[["{\"ambient\": {\"humidity\": 24, \"temperature\": 21.277752659180088}, \"machine\": {\"pressure\": 10.860424874724545, \"temperature\": 107.55261834480434}, \"timeCreated\": \"2018-02-15T03:42:14.140615Z\", \"anomaly\": true}"]]Properties: 'AzureMLResponse': 'OK'

Temp sensor JSON sent to IoT Hub: 15/02/2018 2:42:14 PM> Device: [DSVM], Data:[{"machine":{"temperature":107.55261834480434,"pressure":10.860424874724545},"ambient":{"temperature":21.277752659180088,"humidity":24},"timeCreated":"2018-02-15T03:42:14.140615Z"}]
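To make the structural difference between the two payloads concrete, here is a minimal sketch (plain Python, abbreviated payloads for illustration only). The ML module's body is a JSON *string* nested inside an array, so it needs a second parse, while the temp sensor's body is a plain JSON object:

```python
import json

# Shape of the ML module payload: a JSON-encoded string wrapped in arrays
# (abbreviated from the log above, not the full message)
ml_payload = '[["{\\"anomaly\\": true, \\"timeCreated\\": \\"2018-02-15T03:42:14.140615Z\\"}"]]'

# Shape of the temp sensor payload: a plain JSON object (also abbreviated)
sensor_payload = '{"machine": {"temperature": 107.55, "pressure": 10.86}, "timeCreated": "2018-02-15T03:42:14.140615Z"}'

once = json.loads(ml_payload)
print(type(once[0][0]))         # <class 'str'>: still a string after one parse
inner = json.loads(once[0][0])  # a second parse is needed to reach the object
print(inner["anomaly"])         # True

obj = json.loads(sensor_payload)
print(type(obj))                # <class 'dict'>: usable after a single parse
```

A consumer that parses JSON exactly once (as ASA does) would see the ML message as a string, not as a record with fields it can query.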

Does anyone have any ideas on what the particular issue could be? Can the 'AzureMLResponse' property be stripped out easily?

Thanks, Com

Could you please show the log of the Azure ML container? You can check whether there are any errors in the container log. - Michael Xu - MSFT
Please see the log below (I can't see any errors). For further context, the message routes out of the machine learning module fine and arrives in IoT Hub. I can also route the data into blob storage. ASA is just not receiving the data. The only difference I can see is that the JSON is structured differently, with the AzureMLResponse property. - Commio
{"request_id": "e4854ea2-acaf-4a5a-82a1-9f654bb9c83e", "timestamp": "2018-02-16T01:39:09.187904", "logger": "logger_stdout", "message": " 0 1 2 3\n0 22.987607 1.226436 21.335369 25"} Prediction is {"request_id": "e4854ea2-acaf-4a5a-82a1-9f654bb9c83e", "timestamp": "2018-02-16T01:39:09.188280", "logger": "logger_stdout", "message": "Prediction is "} 0 {"request_id": "e4854ea2-acaf-4a5a-82a1-9f654bb9c83e", "timestamp": "2018-02-16T01:39:09.188398", "logger": "logger_stdout", "message": "0"} - Commio
I think it is due to the azureml.api.schema.sampleDefinition used in iot_score.py, but I have not found the source or detailed structure information about it. I am trying to involve somebody familiar with this issue to research it. If there is any progress, I will let you know. - Michael Xu - MSFT
Thanks Michael, much appreciated. I have been looking into it further to see whether it's a message data type issue or something similar - still no luck. - Commio

1 Answer


I've found the cause of this particular problem.

The line: return [json.dumps(input_json)]

encodes the JSON to a string and returns it to IoT Hub. The module itself, or IoT Hub, must then automatically encode the message as JSON again on the way out. The escape characters are the result of this double encoding.
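The double-encoding effect can be sketched in plain Python. This is a minimal illustration, not the tutorial's actual pipeline: the helper names (`run_double_encoded`, `run_fixed`) are hypothetical, and the outer `json.dumps` call stands in for whatever serialization the module framework or IoT Hub applies on the way out:

```python
import json

def run_double_encoded(payload):
    # Original pattern: json.dumps here, and then the outgoing pipeline
    # serializes the return value AGAIN, producing a string-in-a-string.
    return [json.dumps(payload)]

def run_fixed(payload):
    # Return the object itself and let the pipeline serialize it once.
    return payload

message = {"anomaly": True, "timeCreated": "2018-02-15T03:42:14.140615Z"}

# Simulate the pipeline's own serialization step (an assumption, see above):
wire_bad = json.dumps(run_double_encoded(message))
wire_good = json.dumps(run_fixed(message))

print(wire_bad)   # an array whose element is an escaped JSON string
print(wire_good)  # a clean JSON object a single parse can consume
```

Under this assumption, dropping the inner `json.dumps` (or returning the parsed object rather than a pre-serialized string) yields a message body that ASA can read with one parse.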