
I configured an Azure Stream Analytics job in the following way:

Input: IoT Hub (4 partitions)

Output: Blob Storage

Stream Analytics: it forwards the input data to the output channel with no data transformation (1 partition).
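A pass-through job like this can be expressed with a minimal Stream Analytics query; the input and output alias names below are assumptions, not taken from the question:

```sql
-- Forward every event unchanged from the IoT Hub input
-- to the Blob Storage output (alias names are hypothetical).
SELECT *
INTO [blob-output]
FROM [iothub-input]
```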

To understand how it behaves in a real-world scenario, I intentionally sent a batch of malformed JSON files as input. The job state is still "running" and I get the warning expected per the documentation (see https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-common-troubleshooting-issues). Unfortunately, if I then resume sending well-formed JSON files as input, nothing happens (i.e., I would expect these new inputs to be processed correctly, but they are not).

What can I do? Can I simply have the malformed input data ignored without interfering with the processing of the subsequent (well-formed) data?
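One way to keep malformed events out of the job entirely is to validate the payload on the sending side before it ever reaches IoT Hub. This is only a sketch of that idea; the helper function below is hypothetical and not part of any Azure SDK:

```python
import json

def filter_well_formed(payloads):
    """Return only the payloads that parse as valid JSON.

    Hypothetical client-side pre-validation step: events that fail to
    parse are dropped here instead of being sent to IoT Hub, so the
    Stream Analytics job never sees them.
    """
    valid = []
    for raw in payloads:
        try:
            json.loads(raw)
            valid.append(raw)
        except json.JSONDecodeError:
            # Drop (or dead-letter) the malformed event client-side.
            pass
    return valid

# Example: only the two parseable payloads survive.
events = ['{"temp": 21}', '{not valid json', '[]']
print(filter_well_formed(events))
```

In practice you might route the rejected payloads to a separate store for inspection rather than silently dropping them.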

Thanks in advance for your help.


2 Answers


I suspect the problem with picking up the new messages is somewhere on your side. When IoT Hub receives a malformed message, that message is simply removed from the queue. Nothing should prevent you from receiving new, well-formed messages after a malformed one.


I think I solved my issue. After a burst of malformed input events, there is a kind of "transient phase" during which new (well-formed) events are not processed in near real-time by the Stream Analytics job. During this phase, the new events are enqueued and then processed correctly, just with a small delay.