1
votes

Microsoft announced support for sending data from Azure Stream Analytics to Azure Functions a few days ago:

https://azure.microsoft.com/en-us/blog/new-in-stream-analytics-output-to-azure-functions-built-in-anomaly-detection-etc/

I tried this but couldn't send data to Azure Functions. Is there a guide on how to send a data packet from IoT Hub -> Azure Stream Analytics -> Azure Functions?

The output works fine for the other sinks. This is the query I have:

WITH rpidata AS
(
    SELECT
        *,
        DATEADD(Hour, 3, timecreated) AS FITimezone
    FROM [rpi]
)

SELECT *
INTO [PowerBI]
FROM rpidata

SELECT *
INTO [storageout]
FROM rpidata

SELECT *
INTO [fnout]
FROM rpidata

The error message I get is:

Could not successfully send an empty batch to the Azure Function. Please make sure your function app name, function name, and API key are correct and that your Azure Function compiles. If all of those parameters are correct, the Azure Function may be temporarily unavailable at this time. Please try again later. Azure Function returned with a response code of 500: InternalServerError. It should respond with a 200, 202, or 204.

However, the function exists, is running, and is found automatically when I try to create the output connection.

What kind of Function input should I use to receive the data? In the example I linked, the function name is httptriggercsharp... Does the stream job send the data as JSON?

Did you get any diagnostic errors in the Stream Analytics job? Can you try adding another output, like blob storage, to confirm that the query is producing output at all? If it does, can you please check the logs on the Azure Functions side and see whether there were incoming requests? – Vignesh Chandramohan
Yes, Azure Stream Analytics sends a batch of records as JSON to the function. The function should return an HTTP success code after doing whatever it wants to do with the records. – Vignesh Chandramohan
Any update? If you feel my answer is useful/helpful, please mark it as the answer so that other folks can benefit from it. – Brando Zhang
If your Function method has a route, you'll get this message. Ensure your HttpTrigger has Route set to null; there doesn't appear to be a way at present for Stream Analytics to bind to a custom route: [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] – MMartin
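
For reference, in the precompiled (class library) C# model the attribute from the comment above sits on the trigger parameter. A minimal sketch (the class and function names are placeholders, not from the question), keeping Route = null so the default /api/&lt;FunctionName&gt; route is used:

using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;

public static class AsaReceiver
{
    // Route = null keeps the default /api/AsaReceiver route,
    // which is what the Stream Analytics output binds to.
    [FunctionName("AsaReceiver")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequestMessage req,
        TraceWriter log)
    {
        // Stream Analytics POSTs a JSON array of records in the request body.
        string body = await req.Content.ReadAsStringAsync();
        log.Info(body);

        // Any 200/202/204 response tells Stream Analytics the batch was accepted.
        return req.CreateResponse(HttpStatusCode.OK);
    }
}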

3 Answers

2
votes

According to your description and the error message, I guess there is something wrong with your Azure Function code (it returns a 500 error).

I suggest you first check the Azure Function logs to find the detailed error message and then fix your code.

For details, you can refer to the screenshots below.

Open the Azure Function app and find the Monitor blade:

(screenshot: Azure Function app Monitor blade)

I also created a test Stream Analytics job with an Azure Function output. I found that Stream Analytics sends JSON to the Azure Function.

I suggest you also use this code (HTTP trigger) to test whether you receive the data:

using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    // Read the raw request body; Stream Analytics sends a JSON array of records.
    string jsonContent = await req.Content.ReadAsStringAsync();
    log.Info(jsonContent);

    // Echo the payload back with a 200 so Stream Analytics treats the delivery as successful.
    return req.CreateResponse(HttpStatusCode.OK, jsonContent);
}

Result:

(screenshot of the result in the function log)
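
As a side note on the error in the question: the message mentions an "empty batch", which Stream Analytics appears to send when it validates the Function output, and the function still has to answer with 200, 202, or 204. A minimal sketch (same csx HTTP trigger model as above) that acknowledges empty batches explicitly:

using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    string jsonContent = await req.Content.ReadAsStringAsync();

    if (string.IsNullOrWhiteSpace(jsonContent))
    {
        // Empty validation batch from Stream Analytics: acknowledge with 204 No Content.
        return req.CreateResponse(HttpStatusCode.NoContent);
    }

    log.Info(jsonContent);
    return req.CreateResponse(HttpStatusCode.OK);
}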

2
votes

Not sure if you still need this, but for future reference: the ASA job will output the data as a JSON array to your Azure Function. An example ASA query like this

SELECT
   w.col1 as key1,
   w.col2 as key2,
   System.Timestamp as time
INTO
   azfunction
FROM
   [input] w;

will arrive in your Function like this

[
         {
             "key1":"value1",
             "key2":"value2",
             "time":"2017-09-04T17:51:02.7986986Z"
         },
         {
             "key1":"value3",
             "key2":"value4",
             "time":"2017-09-04T17:51:02.7986986Z"
         }
]

How many elements the JSON array contains depends on how you set up the Azure Function output in ASA as well as on how fast events arrive in ASA. The array might have only one element or a hundred, depending on your scenario.
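
If you want to work with the batch as objects rather than a raw string, one option is to deserialize the array with Json.NET. A minimal sketch in the csx model from the first answer (the AsaRecord class is hypothetical and simply mirrors the key1/key2/time aliases of the example query):

#r "Newtonsoft.Json"

using System;
using System.Collections.Generic;
using System.Net;
using Newtonsoft.Json;

// Mirrors one element of the JSON array shown above.
public class AsaRecord
{
    public string key1 { get; set; }
    public string key2 { get; set; }
    public DateTime time { get; set; }
}

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    string body = await req.Content.ReadAsStringAsync();

    // The payload is a JSON array, so deserialize into a list of records.
    var records = JsonConvert.DeserializeObject<List<AsaRecord>>(body) ?? new List<AsaRecord>();

    foreach (var record in records)
    {
        log.Info($"{record.key1} / {record.key2} at {record.time:o}");
    }

    return req.CreateResponse(HttpStatusCode.OK);
}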

1
votes

Just to complement this answer: you can use mock data from Mockaroo.

https://mockaroo.com/

To test the code mentioned above:

[
    {
        "DeviceID": 8,
        "Temperature": 28,
        "Unit": 40,
        "TimeStamp": "2018-03-23T17:43:18.0000000Z",
        "EventProcessedUtcTime": "2018-03-23T17:44:36.5010819Z",
        "PartitionId": 0,
        "EventEnqueuedUtcTime": "2018-03-23T17:43:18.5700000Z"
    },
    {
        "DeviceID": 8,
        "Temperature": 66,
        "Unit": 27,
        "TimeStamp": "2018-03-23T17:43:20.0000000Z",
        "EventProcessedUtcTime": "2018-03-23T17:44:36.8143642Z",
        "PartitionId": 1,
        "EventEnqueuedUtcTime": "2018-03-23T17:43:21.0090000Z"
    }
]
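
To push this mock data at the function without wiring up the whole pipeline, one option is to POST the array yourself and check that the function answers with a success code. A minimal console sketch (the URL, key, and values below are placeholders, not taken from the question):

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        // Placeholder function URL; substitute your own app name, function name, and key.
        var url = "https://<yourapp>.azurewebsites.net/api/<yourfunction>?code=<yourkey>";

        // The same kind of JSON array Stream Analytics would deliver.
        var payload = "[{\"DeviceID\":8,\"Temperature\":28,\"Unit\":40,\"TimeStamp\":\"2018-03-23T17:43:18.0000000Z\"}]";

        using (var client = new HttpClient())
        {
            var response = await client.PostAsync(url, new StringContent(payload, Encoding.UTF8, "application/json"));
            Console.WriteLine($"Function responded with {(int)response.StatusCode} {response.StatusCode}");
        }
    }
}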