Using Azure IoT Hub to connect my home to the cloud

I’ve written about my hybrid local/cloud home automation architecture previously: in summary, most of the moving parts and automation logic live on a series of Raspberry Pis on my home network, using an MQTT broker to communicate with each other. I bridge this “on-prem” system with the cloud in order to route incoming events, e.g. actions initiated via my Alexa Skills, from the cloud to my home, and to send outgoing events, e.g. notifications via Twilio or Twitter.

Historically this bridging was done using Octoblu, with a custom Octoblu MQTT connector running locally and an Octoblu flow running in the cloud handling the various inbound and outbound event routing and actions. However, with Octoblu going away as a managed service hosted by Citrix, I, like other Octoblu users, needed to find an alternative solution. I decided to give Azure IoT Hub and its related services a try, partly to solve my immediate need and partly to get some experience with the service. Azure IoT isn’t really the kind of end-user/maker platform that Octoblu is, and there are some differences in concepts and architecture. However, for my relatively simple use-case it was fairly straightforward to make Azure IoT Hub and Azure Functions do what I needed. Here’s how.

I started by creating an instance of an Azure IoT Hub, using the free tier (which allows up to about 8k messages per day), and within this manually creating a single device to represent my entire home environment (this is the same model I used with Octoblu).

After some experimentation I settled on using the Azure IoT Edge framework (V1, not the more recently released V2) to communicate with the IoT Hub. This framework is a renaming and evolution of the Azure IoT Gateway SDK and allows one or more devices to be connected via a single client service framework. It is possible to create standalone connectors to talk to IoT Hub in a similar manner to how Octoblu connectors work, but I decided to use the Edge framework to give me more flexibility in the future.

There are various ways to consume the IoT Edge/gateway framework; I chose to use the NPM packaged version, adding my own module and configuration. In this post I’ll refer to my instance of the framework as the “gateway”. The overall concept for the framework is that a number of modules can be linked together, with each module acting as either a message source, sink, or both. The set of modules and linkage are defined in a JSON configuration file. The modules typically include one or more use-case specific modules, e.g. to communicate with a physical device; a module to bidirectionally communicate with the Azure IoT Hub; and a mapping module to map between physical device identifiers and IoT Hub deviceNames and deviceKeys.
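To make the modules-and-links model concrete, a V1 gateway configuration file wiring a custom Node module to the IoT Hub module looks roughly like this (the module names, paths, and hub details here are illustrative, not my actual config):

```json
{
  "modules": [
    {
      "name": "IoTHub",
      "loader": {
        "name": "native",
        "entrypoint": { "module.path": "build/libiothub.so" }
      },
      "args": {
        "IoTHubName": "<my-iot-hub>",
        "IoTHubSuffix": "azure-devices.net",
        "Transport": "AMQP"
      }
    },
    {
      "name": "mqtt",
      "loader": {
        "name": "node",
        "entrypoint": { "main.path": "modules/mqtt_module.js" }
      },
      "args": { "config": "/home/pi/.iothub.json" }
    }
  ],
  "links": [
    { "source": "mqtt", "sink": "IoTHub" },
    { "source": "IoTHub", "sink": "mqtt" }
  ]
}
```

Each entry in "links" declares a one-way message route from a source module to a sink module, which is how the bidirectional pipeline described below gets built.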

The requirements for my gateway were simple:

  1. Connect to the local MQTT broker, subscribe to a small number of MQTT topics and forward messages on them to Azure IoT Hub.
  2. Receive messages from Azure IoT Hub and publish them to the local MQTT broker.

To implement this I built an MQTT module for the Azure IoT Edge framework. I opted to forgo the usual mapping module (it wouldn’t add value here) and instead have the MQTT module set the deviceName and deviceKey for IoT Hub directly, and perform its own inbound filtering. The configuration for the module pipeline is therefore very simple: messages from the IoT Hub module go to the MQTT module, and vice versa.
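The heart of that module is the mapping between MQTT publishes and gateway messages: filter to the topics of interest, then attach the identity properties that a mapping module would normally supply. A minimal sketch of that mapping (function names, topic list, and the `topic` property name are illustrative):

```javascript
'use strict';

// Hypothetical whitelist of MQTT topics the gateway forwards to the cloud.
const FORWARDED_TOPICS = new Set(['Alert', 'Heartbeat']);

// Inbound filter: only messages on whitelisted topics are forwarded.
function shouldForward(topic) {
  return FORWARDED_TOPICS.has(topic);
}

// Build a gateway message (properties map + binary content) from an MQTT
// publish, attaching the IoT Hub identity the mapping module would add.
function toGatewayMessage(topic, payload, deviceName, deviceKey) {
  return {
    properties: {
      deviceName: deviceName, // tells the IoT Hub module which device this is
      deviceKey: deviceKey,   // ...and how to authenticate it
      topic: topic            // preserved so the cloud side can route on it
    },
    content: Buffer.isBuffer(payload) ? payload : Buffer.from(String(payload))
  };
}
```

In the real module these functions sit inside the Edge module’s `receive`/publish plumbing and an MQTT client subscription; they are pulled out here to show the message shape on its own.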

The IoT Edge framework runs the node.js MQTT module in an in-process JavaScript interpreter, while the IoT Hub module is native code running in the same process. Thus the whole gateway runs as a single program with the configuration supplied as its argument.

The gateway runs on a Pi, with my specific deviceName and deviceKey, along with the MQTT config, stored locally in a file “/home/pi/.iothub.json” that looks like this:

    {
        "deviceName":"<deviceName of device as defined in Azure IoT Hub>",
        "deviceKey":"<deviceKey for device as defined in Azure IoT Hub>",
        "protocol":"{\"protocolId\": \"MQIsdp\", \"protocolVersion\": 3}"
    }

The gateway can now happily send and receive messages from Azure IoT Hub, but that isn’t very useful on its own. The next step was to set up inbound message routing from my Alexa Skills.

In the previous Octoblu implementation the Alexa Skills simply called an Octoblu Trigger (in effect a webhook) with a message containing an MQTT topic and message body. The Octoblu flow then sent this to the device representing my home environment, and the connector running on a Pi picked it up and published it into the local MQTT broker. The Azure solution is essentially the same. I created an Azure Function (equivalent to an AWS Lambda function) using a JavaScript HTTP trigger template; it can be called with a topic and message body, and calls Azure IoT Hub (via an NPM library) to send a “cloud-to-device” (C2D) message to the gateway device. The gateway described above then picks this up and publishes it via the local broker, just like the Octoblu connector did. I then updated my Alexa Skills’ Lambda functions to POST to this Azure Function rather than to the Octoblu Trigger.

The code for the Azure function is really just argument checking and plumbing to call into the library that in turn calls the Azure IoT Hub APIs. In order to get the necessary Node libraries into the function I defined a package.json and used the debug console to run “npm install” to populate the image (yeah, this isn’t pretty, I know) – see the docs for details on how to do this.
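That argument-checking-and-plumbing shape looks roughly like the sketch below. The IoT Hub service client is injected here so the routing logic stands alone; in the real function it would come from `require('azure-iothub').Client.fromConnectionString(...)`, and the device id "home-gateway" is illustrative:

```javascript
'use strict';

// Factory for an Azure Functions v1 JavaScript HTTP trigger handler
// (signature: function (context, req)). serviceClient is assumed to expose
// send(deviceId, message, callback), as the azure-iothub service client does.
function makeHttpTrigger(serviceClient, deviceId) {
  return function (context, req) {
    const body = req.body || {};
    if (!body.topic || !body.message) {
      // Keep-alive pings arrive with an empty body; nothing to route.
      context.res = { status: 204, body: '' };
      return context.done();
    }
    // C2D payload: the on-prem gateway will publish body.message on
    // body.topic via the local MQTT broker.
    const c2d = JSON.stringify({ topic: body.topic, message: body.message });
    serviceClient.send(deviceId, c2d, function (err) {
      context.res = err ? { status: 500, body: err.message }
                        : { status: 200, body: 'sent' };
      context.done();
    });
  };
}
```

In the deployed function the exported handler would simply be `makeHttpTrigger(client, '<gateway device id>')`.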

If you’re wondering why I’m using both AWS Lambda and Azure Functions the reason is that Alexa Smart Home skills (the ones that let you do “Alexa, turn on the kitchen lights”) can only use Lambda functions as backends, they cannot use general HTTPS endpoints like custom skills can. In a different project I have completely replaced an Alexa Skill’s Lambda function with an Azure function (which, like here, calls into IoT Hub) to reduce the number of moving parts.

So with all of this I can now control lights, TV, etc. via Alexa like I could previously, but now using Azure IoT rather than Octoblu to provide the cloud->on-prem message routing.

The final use-case to solve was the outbound message case, which was limited to sending alerts via Twitter (I had used Twilio before but stopped some time back). My solution started with a simple Azure Logic App which is triggered by an HTTP request and then feeds into a “Post a Tweet” action. The Twitter “connection” for the Logic App is created in a very similar manner to how it was done by Octoblu, requiring me to authenticate to Twitter and grant permission for the Logic App to access my account. I defined a message schema for the HTTP request, which allowed me to POST JSON messages to it and use the parsed fields (actually just the “message” field for now) in the tweet.

I then created a second Azure Function, configured to be triggered by Event Hub messages using the Event Hub embedded in the IoT Hub (if that all sounds a bit complex, just use the function template and it will become clearer). In summary, this function gets called for each device-to-cloud event (or batch of events) received by the IoT Hub. If a message has a topic of “Alert” then its body is sent to the Logic App via its trigger URL (copied and pasted from the Logic App designer UI). I added the “request” NPM module to the Function image using the same procedure as for the iot-hub libraries above.
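A sketch of that event-handler function follows. The HTTP POST is injected so the logic stands alone (the real function uses the “request” module), and the Logic App URL and message property names are illustrative:

```javascript
'use strict';

// Placeholder for the Logic App's HTTP trigger URL (pasted from the designer).
const LOGIC_APP_URL = 'https://example.invalid/logicapp-trigger';

// Factory for an Event-Hub-triggered handler: the trigger may deliver a
// single event or a batch, so normalise to an array. postJson is assumed to
// be a fire-and-forget POST of a JSON body to a URL.
function makeEventHandler(postJson) {
  return function (context, messages) {
    const batch = Array.isArray(messages) ? messages : [messages];
    batch.forEach(function (msg) {
      // Only "Alert" messages become tweets; everything else (heartbeats
      // and the like) is ignored here.
      if (msg.topic === 'Alert') {
        postJson(LOGIC_APP_URL, { message: msg.message });
      }
    });
    context.done();
  };
}
```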

The overall flow is thus:

  1. Something within my home environment publishes a message on the “Alert” topic.
  2. The Azure IoT gateway’s MQTT module, being subscribed to the “Alert” topic, receives the published message, attaches the Azure IoT Hub deviceName and deviceKey, and sends it as an event via the IoT Hub module, which forwards it over AMQP to Azure IoT Hub.
  3. Azure IoT Hub invokes the second Azure Function with the event.
  4. The Function pulls out the MQTT topic and payload from the event and calls the Logic App with them.
  5. The Logic App pulls out the message payload and sends this as a tweet using the pre-authorised Twitter connector.

Although all of this seems quite complex, it’s actually fairly simple overall: the IoT Hub acts as a point of connection, with the on-prem gateway forwarding events to and from it, and a pair of Azure Functions handling cloud-to-device and device-to-cloud messages respectively.


It was all going well until I discovered that the spin-up time for an Azure Function that’s been dormant for a while can be huge – well beyond the timeout of an Alexa Skill. This is partly caused by the time it takes for the function runtime to load in all the node modules from the slow backing store, and partly just slow spin-up of the (container?) environment that Azure Functions run within. A common practice is to ensure that functions are invoked sufficiently often that Azure doesn’t terminate them. I followed this practice by adapting my existing heartbeat service running on a Pi, which publishes a heartbeat MQTT message every 2 minutes, to also call the first Azure Function (the one the Alexa Skills call) with a null argument. To keep the second function alive I simply had the MQTT gateway subscribe to the heartbeat topic, thereby ensuring the event-handler function also ran at least once every 2 minutes.
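The keep-alive wiring amounts to a single tick, run on a two-minute interval. In this sketch the MQTT publish and the HTTP POST are injected so it stands alone (the real service uses the “mqtt” module and an HTTPS request), and the names are illustrative:

```javascript
'use strict';

// Factory for the heartbeat tick: publishes the local MQTT heartbeat (which
// the gateway forwards, keeping the event-handler function warm) and pings
// the HTTP-triggered function with a null body to keep it warm too.
function makeHeartbeat(publishMqtt, postHttp, functionUrl) {
  return function tick() {
    publishMqtt('Heartbeat', JSON.stringify({ ts: Date.now() }));
    postHttp(functionUrl, null);
  };
}

// In the real service: setInterval(tick, 2 * 60 * 1000);
```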


