1
votes

I've encountered a problem where I can see that there are messages on my Azure Event Hub, but when I try to stream them to blob storage using Stream Analytics, nothing happens. The job just runs constantly without any messages ever being written to the specified blob storage. I'm also using a very simple query: SELECT * INTO [Blob] FROM [InputEventHub]

When I try to sample the data I get the following error message:

"No events found for "********". Start time: 27 March 2017, 15:15:28 End time: Monday 27 March 2017, 16:16:28 Last time arrival Monday, 27 March 2017, 15:16:46 Diagnostics: While sampling data, no data was received from '4' partitions". They're over 48 thousand messages on the event hub so I know that they're events present on it. I'm also receiving no errors in the activity logs. Has anyone run into this problem?

Thanks in advance.

Damien

4
if all you need is to push data from Event Hubs to Azure Blob storage, you can do this directly on Event Hubs without any intermediary - docs.microsoft.com/en-us/azure/event-hubs/… - Sreeram Garlapati

4 Answers

1
votes

Ran into the same problem today. In my case, it was actually malformed JSON input. The tip-off is that the LastTimeArrived field is within the valid start/end range.
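One way to catch this before it reaches the job is to validate each payload as JSON on the producer side. A minimal sketch (the helper name and sample payloads are my own illustration, not part of any Azure SDK):

```python
import json

def is_valid_json(payload: bytes) -> bool:
    """Return True if the payload parses as JSON, False otherwise."""
    try:
        json.loads(payload)
        return True
    except (json.JSONDecodeError, UnicodeDecodeError):
        return False

# Example payloads: the second has a trailing comma, which is invalid JSON
good = b'{"deviceId": "sensor-1", "temp": 21.5}'
bad = b'{"deviceId": "sensor-1", "temp": 21.5,}'

print(is_valid_json(good))  # True
print(is_valid_json(bad))   # False
```

Events that fail this check would silently produce no output rows in the job, which matches the symptom above.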

0
votes

I'd like to suggest a few steps to investigate this issue. Please note that it will involve starting the job (instead of using sample data).

First, confirm that you see the incoming messages in Azure Stream Analytics, and not only in Event Hub. To do this:

  1. Make sure your sensors/devices are sending data to Event Hub
  2. Start your ASA job
  3. Go on Azure portal, then open the "Overview" blade for your ASA job.
  4. Have a look at the monitoring chart. Double-check that you see some input events here when your job starts (be sure to have "Input Events" chosen in the chart; if not, click Edit). The chart may need 30 seconds to 1 minute to refresh.

If there is no input event here, there is probably a problem with the connection to your hub.

If you do see input events, but no output events, check whether you have any errors in the "Activity Log" blade. For further investigation you can also use our new improved troubleshooting experience announced last month.

Let me know how it works for you.

Thanks, JS (Azure Stream Analytics)

0
votes

This solution looks lengthy, but as long as you know how to type, it's SIMPLE. I promise.

I've had this issue twice with 2 separate application insights, containers, jobs, etc. Both times I solved this by editing the path pattern of my input(s) to my job.

To navigate to the necessary blade to make the following changes:

1) Click on your stream analytics job
2) Click "inputs" under the "job topology" section of the blade
3) Click your input (if multiple inputs, do this to 1 at a time)
4) Use the blade that pops up on the right side of the screen

The 4 potential solutions I've come across are (A-D in bold):

A. Making sure the path pattern you enter is plain text with no hidden characters (sometimes copying it from the container on Azure made it not plain text).

*Steps:*

  1) Cut the path pattern you have already in the input blade

  2) Paste it into Notepad and re-copy it

  3) Re-paste it into the path pattern slot of your input

B. Append your path pattern with /{date}/{time}

Simply type this at the end of your path pattern in the blade's textbox

C. Remove the container name and the "/" that immediately follows it from the beginning of your path pattern (see picture below)

Removing the container name and / from path pattern

Should be self-explanatory after seeing the pic.

D. Changing the date format of your input to YYYY-MM-DD in the drop-down box.

Should also be self-explanatory (look at the above picture if not).
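For illustration, here is a rough sketch of how the {date}/{time} tokens in a path pattern expand under the YYYY-MM-DD date format. The helper is hypothetical (the real expansion is done by the Stream Analytics service), and it assumes the default hour-granularity time token:

```python
from datetime import datetime

def expand_path(pattern: str, ts: datetime) -> str:
    """Expand ASA-style {date}/{time} tokens, assuming a YYYY-MM-DD
    date format and an HH time format (a simplification)."""
    return (pattern
            .replace("{date}", ts.strftime("%Y-%m-%d"))
            .replace("{time}", ts.strftime("%H")))

ts = datetime(2017, 3, 27, 15, 16)
print(expand_path("output/{date}/{time}", ts))  # output/2017-03-27/15
```

This also shows why the container name must not appear in the pattern itself: the expanded string is a path *inside* the container, so repeating the container name would point the job at a folder that doesn't exist.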

Hope this helps!!

0
votes

I received the same error when trying to use a blob .csv file as an input into a Streaming Analytics job. The issue turned out to be that the file was too big.

I think the max size is 2.1MB; I think this because that is the warning I received when trying to edit the blob file in Azure.

See for yourself:

1) Open the storage account
2) Open containers
3) Open the container
4) Navigate to the file and click on it once; the properties blade will open
5) Click 'Edit blob' - you should see a file size warning if the file is over 2.1MB

As soon as I reduced the size of the file, the "No events found for" error stopped and preview data arrived as expected in the SA job.
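If you want to check a file's size before uploading it, a quick sketch (the 2.1 MB figure here is just the portal's edit warning mentioned above, not a documented service limit):

```python
import os

MAX_EDIT_SIZE = int(2.1 * 1024 * 1024)  # ~2.1 MB, per the portal's edit warning

def is_small_enough(path: str, limit: int = MAX_EDIT_SIZE) -> bool:
    """Return True if the file at 'path' is at or under the size limit."""
    return os.path.getsize(path) <= limit

# Usage: write a small sample file and check it
with open("sample.csv", "w") as f:
    f.write("col1,col2\n1,2\n")
print(is_small_enough("sample.csv"))  # True
```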