Microsoft is extending Azure IoT to the edge of the network
The launch of Azure IoT Edge was one of Microsoft’s slightly more esoteric but interesting announcements at its Build developer conference in Seattle today. While “the cloud” is all about moving compute and data storage into the data center, there are plenty of situations where you want to avoid the round trip between the device and the data center, or where you can’t depend on having that network connection to begin with.
With IoT Edge, Microsoft now makes it easier for developers to move some of their computing needs to these devices. IoT Edge can run on Windows and Linux and on devices as small as a Raspberry Pi with only 128MB of memory. The Microsoft services that can run on these devices include Azure Machine Learning, Stream Analytics (which came to Edge devices earlier this year), Azure Functions, Microsoft’s AI services and the Azure IoT Hub.
“Azure IoT Edge enables IoT devices to run cloud services, process data in near real-time, and communicate with sensors and other devices connected to them, even with intermittent cloud connectivity,” Microsoft’s Partner Director for Azure IoT Sam George writes in today’s announcement. “By enabling processing, analytics and action to happen closer to the source of the data, Azure IoT Edge empowers you to make faster and smarter decisions, while reducing bandwidth costs by sending only critical information to the cloud for further analysis.”
All of these devices can be managed centrally from the Azure IoT Hub, and developers can write their applications in C, Node.js, Java, .NET and Python.
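The pattern George describes — process and analyze telemetry on the device, then send only critical information upstream — can be sketched in a few lines of plain Python. This is an illustrative sketch of the edge-filtering idea, not actual Azure IoT Edge SDK code; the function names, data shapes, and threshold are assumptions for the example.

```python
# Conceptual sketch of edge-side filtering: process telemetry locally
# and forward only "critical" readings to the cloud, cutting bandwidth.
# Names and the threshold are illustrative, not part of any Azure SDK.

def filter_critical(readings, threshold):
    """Return only the readings worth a round trip to the cloud."""
    return [r for r in readings if r["value"] > threshold]

def summarize_locally(readings):
    """Aggregate everything on-device; raw data never leaves the edge."""
    values = [r["value"] for r in readings]
    return {"count": len(values), "avg": sum(values) / len(values)}

# Simulated sensor telemetry arriving at the edge device
telemetry = [
    {"sensor": "temp-1", "value": 21.5},
    {"sensor": "temp-2", "value": 88.0},  # anomalous reading
    {"sensor": "temp-3", "value": 22.1},
]

to_cloud = filter_critical(telemetry, threshold=60.0)  # only the anomaly
summary = summarize_locally(telemetry)                 # stays on-device
```

Even in this toy form, the payoff is visible: of three readings, only the anomalous one is forwarded, while the full dataset is still summarized locally — which is the bandwidth-saving behavior Microsoft is pitching, and which keeps working when the cloud connection drops.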
In some ways, Azure IoT Edge is essentially an extension of what Microsoft was already doing with its Azure IoT Gateway SDK.
Julia White, Microsoft’s corporate VP for Azure marketing, told me that she also thinks of Azure Stack as a “super-powerful” version of an edge device. Azure Stack, which remains on track for a launch later this year, allows enterprises to run many of the core Azure services in their own data center. As White noted, Carnival Cruise Line, for example, is deploying Azure Stack on some of its ships — not to allow developers to code while they’re cruising, but to allow Carnival to write an application once and then run that same code in the cloud or on the water.