
I've been able to push a machine learning module down to an IoT Edge device following the Microsoft tutorial below. I can also get ML predictions by routing data from the tempSensor simulated data module to the ML module, which is great.

https://docs.microsoft.com/en-gb/azure/iot-edge/tutorial-deploy-machine-learning

What I would like to do is use the machine learning module as a web service on the IoT Edge device - is there a way to target this module using REST, etc.? Something similar to the link below, although I'm pushing the ML model down as a module to the IoT Edge device, not as a web service.

https://docs.microsoft.com/en-gb/azure/machine-learning/preview/model-management-service-deploy

EDIT: I'm not looking to deploy an ML image as a web service as in the following link, using "az ml service create", etc. I would like to deploy the ML image as a module via IoT Edge management and still access it via a REST API. https://docs.microsoft.com/en-gb/azure/machine-learning/preview/model-management-service-deploy

Cheers, Com

All Azure Machine Learning models containerized as Docker-based web services can also run on Azure IoT Edge devices. Have you referred to Deploying a Machine Learning Model as a web service? - Michael Xu - MSFT
Thanks for the reply. Yes, I have. The workflow there uses this instead: az ml service run realtime -i <service id> -d "etc". I can use this on the edge device (I assume?) after I set it up as a local environment. This would pull the Docker image from the repository directly. The downside of this, I believe, is that you cannot use it in the routing between modules, which is part of the IoT Edge architecture. If I use the workflow for pushing a module down via IoT Edge management (docs.microsoft.com/en-gb/azure/iot-edge/…), can I still access the REST service? - Commio

1 Answer


Yes, the REST API for scoring your data in the Docker container on the Azure IoT Edge device can be called directly. Here is more information on consuming a web service: https://docs.microsoft.com/en-gb/azure/machine-learning/preview/model-management-consumption
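As a minimal sketch of what that direct call looks like from the device, the snippet below POSTs JSON to the module's scoring endpoint using only the Python standard library. The host, port (5001), and "/score" route are assumptions for illustration - the actual port depends on how the container's port is exposed in your deployment, so check your module's configuration.

```python
import json
import urllib.request

# Hypothetical endpoint: the host/port depend on how the module's port is
# exposed on the device, and "score" is an assumed route for the AML container.
SCORING_URL = "http://localhost:5001/score"

def build_request(payload):
    """Serialize the payload into an HTTP POST request for the scoring endpoint."""
    body = json.dumps(payload).encode("utf-8")
    headers = {"Content-Type": "application/json"}
    return urllib.request.Request(SCORING_URL, data=body, headers=headers)

def score(payload):
    """Send sensor data to the module's REST endpoint and return the prediction."""
    with urllib.request.urlopen(build_request(payload)) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

For example, `score({"machine": {"temperature": 100.2, "pressure": 7.8}})` would POST a tempSensor-style reading and return the decoded JSON prediction.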

If you are interested in creating your own custom AI/ML models, I would recommend going through the Azure ML Iris tutorial: https://docs.microsoft.com/en-gb/azure/machine-learning/preview/tutorial-classifying-iris-part-1

This will walk you through training a model and operationalizing it to a Docker container. You can run inference by calling the REST API with your data, whether the container is deployed to the cloud or on the edge.
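For the module to be reachable over REST on the edge device, the container's port has to be published to the host. A sketch of the relevant module entry in an IoT Edge deployment manifest is below - the module name, image, and port numbers are all illustrative assumptions, and the port the scoring server listens on inside your container may differ:

```json
{
  "mlmodule": {
    "version": "1.0",
    "type": "docker",
    "status": "running",
    "restartPolicy": "always",
    "settings": {
      "image": "<your-registry>.azurecr.io/mlmodule:latest",
      "createOptions": "{\"HostConfig\":{\"PortBindings\":{\"5001/tcp\":[{\"HostPort\":\"5001\"}]}}}"
    }
  }
}
```

With a binding like this in place, the module can still participate in IoT Edge message routing while also accepting direct REST calls on the published host port.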