ODMs can also choose to harden the platform with Hardware Security Modules (HSMs). Beyond security, Microsoft has made it straightforward to run machine learning models at the edge: each model responsible for inferencing can be packaged and deployed as a standard module. Developers can train their models on Azure using Data Science Virtual Machines or Azure Machine Learning Studio, and Azure IoT Edge also supports running models exported from Azure's automated ML services such as Custom Vision. Because each model is just a container/module, new models can be pushed to the edge quickly. With Microsoft's investment in ONNX, models built with different frameworks can be exported to a common format before being used for inference. Azure IoT Edge thus plays a crucial role in Microsoft's vision of delivering the Intelligent Cloud and Intelligent Edge.
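To make the "model as a module" idea concrete, here is a minimal sketch of the `modules` section of an IoT Edge deployment manifest that registers a model container. The registry address, module name, and image tag are hypothetical, and the fragment omits the `$edgeAgent`/`$edgeHub` system-module configuration a full manifest requires:

```json
{
  "modulesContent": {
    "$edgeAgent": {
      "properties.desired": {
        "modules": {
          "classifier": {
            "type": "docker",
            "status": "running",
            "restartPolicy": "always",
            "settings": {
              "image": "myregistry.azurecr.io/classifier:1.0",
              "createOptions": "{\"HostConfig\":{\"PortBindings\":{\"80/tcp\":[{\"HostPort\":\"80\"}]}}}"
            }
          }
        }
      }
    }
  }
}
```

Pushing a new model version is then just a matter of publishing a new image tag and updating this manifest; the IoT Edge runtime pulls and restarts the module on the device.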
Jul-1-2018, 13:50:25 GMT