Logo of Homebase

Microsoft teamed up with Chris Piggott, the CTO of Homebase, to develop the next stage of the Homebase product roadmap, demonstrating how Homebase could leverage Azure IoT Hub, Azure Stream Analytics, and Power BI Embedded to store, analyze, report on, and act on IoT data.

Core team:

  • Chris Piggott – CTO, Homebase
  • Martin Schray (@mschray) – Senior Technical Evangelist, Microsoft
  • Ryan Joy (@atxryan) – Senior Technical Evangelist, Microsoft
  • Jordan Svancara (@JordanSvancara) – Technical Evangelist, Microsoft
  • Gabrielle Crevecoeur (@nowayshecodes) – Technical Evangelist, Microsoft

In our solution, we used IoT Hub, Stream Analytics, and a series of other Azure services and capabilities to capture, route, process, transform, and store data from the Homebase smart property-management Internet of Things (IoT) solution, helping Homebase build momentum in delivering their product roadmap.

Customer profile

Kansas City–based Homebase partners with real-estate developers and property managers to develop and deliver the smartest multifamily communities possible. The Homebase platform makes it easy to stay in touch with your residents; manage payments and maintenance requests; and control smart devices like locks, lights, and thermostats.

Homebase is an intelligent apartment-management platform, incubated out of Think Big Partners, that simplifies the management of critical functions in the resident/landlord relationship, while unlocking the power of the IoT and building automation.

With Homebase, communication is conversational, immediate, and seamless, like text messaging. Collecting payments is automatic, and paying rent is as simple as tapping a notification. Managing maintenance is stress-free and accountable and provides a comprehensive building perspective at a glance to the property manager. All of these functions are tightly integrated with commercially available smart devices and building-automation systems to give residents and property managers full control of their units’ locks, thermostats, lights, and other connected devices.

Pain point

The Homebase technical team had built an environment that connects to an apartment or condo building’s collection of smart devices, such as locks, lights, thermostats, and smoke and carbon-monoxide detectors, and provides a single, integrated point for monitoring and managing those devices. The next stage in the Homebase team’s development plan was to aggregate the data collected at each site, such as an apartment or condo building, into a cloud infrastructure. The goal was to push responsibility for processing, routing, transforming, storing, and reporting from the client to the cloud.

That cloud infrastructure would need to handle ingestion of large amounts of IoT data and to store, route, log, and analyze that data, while identifying and alerting to conditions such as low battery on a device. The Homebase team decided to use Azure and work with Microsoft in creating their cloud infrastructure to support their solution.

Solution, steps, and delivery

Homebase wanted to build their solution so that it would allow for

  • recognizing and addressing special and noteworthy conditions, such as low batteries in the range of devices covered by their solution
  • capturing a subset of incoming data to facilitate reporting and to provide clients an aggregate view of their facility or facilities
  • capturing all incoming data for archival and later analytical purposes

Homebase decided to build an end-to-end solution but, in agile startup fashion, wanted to build and test iteratively on the way to a full implementation. To facilitate this approach, Homebase decided to start by streaming data generated by electronic door locks into Azure and working with Microsoft to build out a complete processing flow for this data.

The team used the standard IoT Hub approach to securing devices. Each device needs to be registered with IoT Hub before it can be used, and each device has a device-specific connection string.
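
The team registered devices with the Device Explorer tool during the hackfest (described later), but registration can also be done programmatically. The following minimal C# sketch uses the Microsoft.Azure.Devices service SDK to register a device and assemble its device-specific connection string; the host-name handling and device ID here are illustrative assumptions, not Homebase's code.

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Devices;                    // service SDK (NuGet: Microsoft.Azure.Devices)
using Microsoft.Azure.Devices.Common.Exceptions;  // DeviceAlreadyExistsException

public static class DeviceRegistration
{
    // Registers a device with IoT Hub and returns its device-specific connection string.
    public static async Task<string> RegisterDeviceAsync(string hubConnectionString, string deviceId)
    {
        var registryManager = RegistryManager.CreateFromConnectionString(hubConnectionString);

        Device device;
        try
        {
            device = await registryManager.AddDeviceAsync(new Device(deviceId));
        }
        catch (DeviceAlreadyExistsException)
        {
            // The device was registered previously; just look it up.
            device = await registryManager.GetDeviceAsync(deviceId);
        }

        // Illustrative assumption: the host name is the first segment of the hub connection string.
        string hostName = hubConnectionString.Split(';')[0].Replace("HostName=", string.Empty);
        return $"HostName={hostName};DeviceId={deviceId};SharedAccessKey={device.Authentication.SymmetricKey.PrimaryKey}";
    }
}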

IoT Hub was the collection point for data flowing into Azure. Although IoT hubs can be read from Java and .NET clients, as well as via REST APIs, the team decided to use Stream Analytics. Stream Analytics provides a SQL-like query language that can read data from an IoT hub and distribute that data to data stores such as Azure SQL Database (a managed relational database service), Azure SQL Data Warehouse, Azure Table storage, Azure Data Lake, and Azure DocumentDB (a managed JSON document database service). The team decided to write hot-path (critical) data to SQL Database to facilitate easy reporting and cold-path (archival) data to DocumentDB to capture a replica of all incoming data.

Part of the reason the team selected SQL Database for hot-path data was the power of SQL to manipulate and query data, as well as the availability of a large talent pool that knows how to use SQL. Additionally, the team decided to use Power BI Embedded for reporting; Power BI Embedded supports SQL Database via DirectQuery. DirectQuery talks directly to SQL Database (rather than caching static data), so Power BI Embedded reports would immediately reflect the latest data. Finally, Stream Analytics supports SQL-like capabilities such as CROSS APPLY, which enabled the team to unroll JSON arrays into multiple SQL rows, transforming incoming data to allow for easier reporting. The team decided to use DocumentDB for cold-path storage because it natively supports the JSON format of the incoming data and stores (and retrieves) high volumes of data efficiently and inexpensively, which fit Homebase’s desire to archive incoming data.

Together the team designed the following architecture, which they then created on Azure.

Homebase architecture

The Homebase team wanted to minimize the amount of processing done “on site” by their client application, so they decided to delegate this processing to Azure. The following diagram illustrates the flow that addresses a low-battery condition on a lock; other devices would have similar flows. The Homebase solution streams all data into IoT Hub, and a series of Stream Analytics jobs serve various purposes. In the low-battery flow, a Stream Analytics job query captures low-battery conditions on locks and routes them to an Azure Service Bus topic; that topic triggers an Azure function, which uses the IoT Hub cloud-to-device capability to send a message to the Homebase on-site solution for notification and handling. This allows Homebase to send all data to Azure for processing and storage, while routing back to the client only the data that requires intervention and action.

Flow for low-battery condition

The two other flows store data in two additional data sinks. The first (seen in the following diagram) selects a subset of records and stores them in SQL Database. The data stored in SQL Database is hot-path data that Homebase wants available for immediate use and reporting.

Flow for hot-path data

The last major flow of data, pictured in the following diagram, uses Stream Analytics to capture all data ingested by IoT Hub and write that data to DocumentDB. This flow archives all data for later use or analysis.

Flow for cold-path data

Technical details

Homebase is a growing startup, so the team decided to hold several four-hour hackfests, allowing significant progress on portions of their product roadmap without interrupting other ongoing development and customer-engagement activities at Homebase. After each hackfest the team reviewed what was completed, what still needed to be done, and any questions or issues that had arisen. This format worked well because it didn’t require large blocks of time from either Homebase or the Microsoft team, while still letting the team build the solution quickly.

Azure IoT Hub and Azure Stream Analytics

Homebase designed a hub in-house using readily available hardware such as the Raspberry Pi. The device has 1 GB of memory and 8 GB of storage, uses line power, and has Wi-Fi for connectivity. It is running Linux. Homebase is using the Azure IoT Gateway SDK and writing their device app in Node.js.
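
The Homebase device app itself is written in Node.js on top of the IoT Gateway SDK, so the following is only an illustrative C# sketch of the equivalent device-to-cloud call using the Microsoft.Azure.Devices.Client device SDK; the payload fields (record_type, value, and so on) are assumptions loosely modeled on the query shown later, not the actual Homebase message format.

using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;   // device SDK (NuGet: Microsoft.Azure.Devices.Client)
using Newtonsoft.Json;

public static class TelemetrySender
{
    // Serializes a reading as JSON and sends it to IoT Hub as a device-to-cloud message.
    public static async Task SendReadingAsync(string deviceConnectionString)
    {
        var deviceClient = DeviceClient.CreateFromConnectionString(deviceConnectionString, TransportType.Mqtt);

        var reading = new
        {
            record_type = "lock",          // hypothetical payload shape
            device = "front-door-lock",
            value = 0.15,                  // for example, battery level as a fraction
            timestamp = DateTime.UtcNow
        };

        var message = new Message(Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(reading)));
        await deviceClient.SendEventAsync(message);
    }
}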

At the hackfest, we used the Azure resource group that Chris, the Homebase CTO, had created for our hackfest. Using the Azure portal, we created our IoT hub, our managed SQL database, Stream Analytics jobs, Service Bus topic, Azure Functions, and our DocumentDB database. The Homebase team registered their Homebase solution as an IoT device using the Device Explorer tool for IoT Hub devices. Next, the Homebase team modified their client to format data into a JSON message and write it to IoT Hub. Based on the JSON data being streamed into IoT Hub, we began to design and build our Stream Analytics jobs. A Stream Analytics job has three parts: one or more inputs, one or more Stream Analytics queries, and one or more outputs.

Azure Stream Analytics job topology

Ultimately, several types of records will be written to the Homebase IoT hub; currently there are three different sinks for incoming data (Service Bus, SQL Database, and DocumentDB). Because Stream Analytics components (inputs, queries, or outputs) cannot be modified while a Stream Analytics job is running, we decided to create a separate Stream Analytics job for each of the sinks we are using. This allows the Stream Analytics job for a particular record type or sink to be stopped and edited while the others continue processing. Without looking at the specific details of the following Stream Analytics query, note the familiar SQL elements SELECT, INTO, and FROM. Additionally, note the use of the familiar WHERE clause, which restricts this query to a single record type, and the TRY_CAST that converts incoming data to a different data type.

SELECT *
INTO [toSBTopicLowBattery]
FROM [IOTHub]
WHERE HomeBaseAiHubMessage.record_type = 'lock' AND TRY_CAST(HomeBaseAiHubMessage.value AS float) <= 0.2

For the Stream Analytics jobs storing results in the SQL database, we made use of the built-in functions (like TRY_CAST) to transform the data so we could easily create reports from the incoming data. Stream Analytics provides built-in functions for working with arrays and strings, creating cross products from arrays, and casting incoming data to SQL types.

Additionally, we created a separate Stream Analytics job to output all record types to DocumentDB. Again, creating separate Stream Analytics jobs allows them to be started, stopped, and modified independently. Because DocumentDB is a JSON document store, we were able to output the JSON documents pulled from IoT Hub directly into DocumentDB. Because DocumentDB was being used to archive all incoming data, we did not perform any transformations on the data.

Azure Functions

Next we built out the capability to handle the special-case inputs that required routing to the client for alerting, notification, and action. To do this we used Azure Functions and Service Bus.

Azure Functions allowed nearly immediate productivity because we were able to start writing code (in C#, Node.js, PHP, Python, Bash, and many other languages) without setting up a server infrastructure or application framework. The Azure Functions environment lets you write, test, and monitor code. Additionally, Azure Functions enables you to

  • quickly bind to various triggers, such as schedules, REST or webhooks, blob storage, events, queues, timers, and Service Bus queues or topics
  • quickly read from inputs like blob storage, tables, and DocumentDB
  • quickly bind to various output tables, such as DocumentDB and push notifications

We chose to write our Azure Functions in C#. Azure Functions allowed the team to easily incorporate shared code (code that can be used across several functions). The following example shows the loading of two shared-code components, FunctionNameHelper and LoggingHelper, using the #load syntax. FunctionNameHelper and LoggingHelper provided logging services for each of our Azure Functions.

#load "../Shared/FunctionNameHelper.csx"
#load "../Shared/LoggingHelper.csx"

#r "Newtonsoft.Json"

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Devices;
using System.Text; 
using Newtonsoft.Json;
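
The shared helpers themselves are not shown in this write-up; as a rough idea, a shared .csx file such as LoggingHelper might look something like the following minimal sketch (the names and behavior here are assumptions, not the actual Homebase helpers).

// Shared/LoggingHelper.csx (illustrative sketch, not the Homebase implementation)
using System;

public static class LoggingHelper
{
    // Writes a uniformly formatted log entry that includes the calling function's name.
    public static void WriteLogMessage(TraceWriter log, string functionName, string message)
    {
        log.Info($"{DateTime.UtcNow:o} [{functionName}] {message}");
    }
}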

Although Azure Functions has many triggers, inputs, and outputs, the output we needed (IoT Hub cloud-to-device messages) was not yet supported by Azure Functions. However, Azure Functions allows for easy inclusion of libraries via the project.json file. By adding a project.json file to our Azure Functions as needed, we were able to pull in NuGet packages. NuGet provides package management for .NET apps, giving our team pre-built functionality to leverage in our solution. In our case, the NuGet packages for IoT Hub made it easy to add the libraries that support cloud-to-device messages. We used the following project.json file to add NuGet packages to our Azure Functions. Similarly, in Node.js you can specify a package.json file to leverage npm packages.

{
  "frameworks": {
    "net46":{
      "dependencies": {
        "Microsoft.WindowsAzure.ConfigurationManager": "3.2.3",
        "Microsoft.AspNet.WebApi.Client":"5.2.3",
        "Microsoft.AspNet.WebApi.Core":"5.2.3",
        "Microsoft.Azure.Amqp": "2.0.3",
        "Microsoft.Azure.Devices":"1.2.1",
        "Microsoft.Azure.Devices.Shared":"1.0.5"
      }
    }
  }
}

Azure Service Bus provides a reliable message-delivery service. Service Bus is a brokered communication mechanism allowing messages to be thought of as asynchronous, or “temporally decoupled.” We used a producer (sender) and consumer (recipient) pattern to decouple our Stream Analytics job output (producer) and Azure Functions (consumer) by writing outputs to a Service Bus topic. This effectively allowed the producer and the consumer to operate and be scaled independently.

As shown in the following image, using Azure Functions made it easy to bind to a Service Bus topic. Any time an entry was written to a Service Bus topic, the corresponding Azure function fired.

Azure Service Bus trigger

Because Homebase wanted to delegate client-side processing, filtering, and routing of events to the cloud, certain messages (such as low-battery warnings that cross a critical threshold) needed to be routed back to the Homebase client to alert and notify property managers. Stream Analytics writes these messages from IoT Hub to a Service Bus topic, and an Azure function fires on the arrival of each message to send a cloud-to-device message. This allowed us to route only specifically identified messages (for example, low battery in a smoke detector) back to the Homebase client via IoT Hub cloud-to-device messaging.

To receive cloud-to-device messages, the Homebase team extended their Node.js client application to listen for cloud-to-device messages routed to that client. When a message is received, the Homebase client application processes it and kicks off a notification to the property manager. The following Azure function is called each time a message arrives on the Service Bus topic. The message from Service Bus is passed in the mySbMsg parameter. After extracting the ID of the device that sent the message to IoT Hub, the function connects to IoT Hub via a connection string and sends a cloud-to-device message to that client.

#load "../Shared/FunctionNameHelper.csx"
#load "../Shared/LoggingHelper.csx"

#r "Newtonsoft.Json"

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Devices;
using System.Text; 
using Newtonsoft.Json;

public static async Task Run(string mySbMsg, TraceWriter log)
{
    LoggingHelper.WriteLogMessage(log, FunctionNameHelper.GetFunctionName(),$"Called with ServiceBus Message {mySbMsg}");
    try 
    {  
        // The incoming Service Bus message includes the ID of the originating device
        // in IoTHub.ConnectionDeviceId (for example, "HomeBase.AI.Building1")
        dynamic sbMessage = JsonConvert.DeserializeObject(mySbMsg);

        // Grab the device ID - we'll use this to route back to the sending device
        string deviceID = sbMessage.IoTHub.ConnectionDeviceId;

        LoggingHelper.WriteLogMessage(log, FunctionNameHelper.GetFunctionName(),$"DeviceId is {deviceID}");

        // Read the IoT Hub connection string from app settings and connect to IoT Hub
        var connectionString = Environment.GetEnvironmentVariable("AZURE_IOTHUB");
        ServiceClient serviceClient = ServiceClient.CreateFromConnectionString(connectionString);

        // Format the message to send to the client as a JSON object
        string msgToEncode = "{" + $"\"HomeBaseAiHubMessage\":{sbMessage.HomeBaseAiHubMessage.ToString()}" + "}";

        LoggingHelper.WriteLogMessage(log, FunctionNameHelper.GetFunctionName(),$"HomeBaseAiHubMessage is {msgToEncode}");

        // Send the cloud-to-device message and wait for the send to complete
        var commandMessage = new Message(Encoding.ASCII.GetBytes(msgToEncode));
        await serviceClient.SendAsync(deviceID, commandMessage);

        LoggingHelper.WriteLogMessage(log, FunctionNameHelper.GetFunctionName(),"SendAsync Completed");

    }
    catch (Exception ex)
    {
        LoggingHelper.WriteLogMessage(log, FunctionNameHelper.GetFunctionName(),$"Message: {ex.Message} and stacktrace {ex.StackTrace}");
    }

}
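
On the device side, the Homebase client that receives these messages is written in Node.js. For readers following along in C#, a minimal, illustrative device-side receive loop using the Microsoft.Azure.Devices.Client SDK might look like the following; the handling logic is an assumption, and the real client kicks off property-manager notifications instead of writing to the console.

using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;   // device SDK (NuGet: Microsoft.Azure.Devices.Client)

public static class CloudToDeviceListener
{
    // Polls IoT Hub for cloud-to-device messages and acknowledges each one after handling it.
    public static async Task ListenAsync(string deviceConnectionString)
    {
        var deviceClient = DeviceClient.CreateFromConnectionString(deviceConnectionString, TransportType.Amqp);

        while (true)
        {
            Message received = await deviceClient.ReceiveAsync();   // returns null when the receive times out
            if (received == null) continue;

            string payload = Encoding.ASCII.GetString(received.GetBytes());
            Console.WriteLine($"Cloud-to-device message received: {payload}");

            // Complete the message so IoT Hub removes it from the device's queue.
            await deviceClient.CompleteAsync(received);
        }
    }
}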

Power BI Embedded

Now that we had pulled data from IoT Hub, stored it in SQL Database and DocumentDB, and routed special conditions to the client, we turned our attention to the reporting needs of Homebase. Homebase wanted to create reports for its customers and embed these reports into its customer web application. To create either Power BI reports (for consumption via an Office 365 portal) or Power BI Embedded reports (for embedding in applications or web pages), you use the free Power BI Desktop application. We downloaded the application and connected it to the Homebase database to create the needed reports. Following the principle of least privilege, we created a new reporting user with only SQL Server data-reader (db_datareader) permissions, which is all that Power BI Embedded reports need. Power BI Desktop enables technical and non-technical users to create sophisticated and engaging reports, with capabilities such as sorting and filtering data.

When we finished creating our first report, we saved it into the Power BI Desktop .pbix file format. We could publish the Power BI report to the Homebase Office 365 portal for their own use, but because Homebase wanted to provide these reports for its customers, we published them to a Power BI workspace and embedded them in a web app. Although development is under way to allow publishing of Power BI Embedded directly through the Azure portal, this functionality was not complete at the time of this writing. However, the Power BI command-line interface (CLI) tool, shown in the following image, made it quite straightforward to publish our Power BI Embedded report to the Power BI workspace.

Help text from Power BI CLI

We proceeded through the following steps to prepare our reports for use:

  1. Create Power BI workspace in Azure
  2. Create a report in Power BI Desktop using SQL Database as DataSource
  3. Enforce row-level security in the report where applicable
  4. Upload resulting .pbix file into the Power BI workspace
  5. Add DataSource credentials to DataSet
  6. Obtain appropriate OAuth token (see the sketch following this list)
  7. Embed report in web page using OAuth token
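
The Homebase web app that performs steps 6 and 7 is written in Node.js. As a rough C# sketch of step 6, assuming the workspace-collections era of Power BI Embedded and the Microsoft.PowerBI.Core package, a token for embedding a report could be generated like this (the IDs and access key are placeholders, not Homebase values):

using System;
using Microsoft.PowerBI.Security;   // NuGet: Microsoft.PowerBI.Core (Power BI Embedded, workspace collections)

public static class EmbedTokenFactory
{
    // Creates a short-lived token that the web page passes to the Power BI JavaScript client
    // when embedding the report. All parameters below are placeholders.
    public static string CreateToken(string workspaceCollection, string workspaceId,
                                     string reportId, string accessKey)
    {
        PowerBIToken token = PowerBIToken.CreateReportEmbedToken(workspaceCollection, workspaceId, reportId);
        return token.Generate(accessKey);   // signed JWT used for the embedded report request
    }
}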

As the following diagram shows, a user of the Homebase web application requests a report. After being authenticated against Azure Active Directory, that request goes to the Power BI workspace in Azure. The Power BI Workspace routes a query request to our Azure SQL Database.

Flow for Power BI Embedded report

For our Power BI Embedded reports, we chose DirectQuery, which means our queries are routed to SQL Database to query live data. Rather than importing data into the report, which would mean periodically refreshing the report, DirectQuery gives us a view of the current data in the system at all times. As shown in the following image, when setting up the Power BI Desktop connection to SQL Database, we selected DirectQuery as the connectivity mode.

Setting up DirectQuery

With the Power BI report published to the Power BI workspace, we built a web app and embedded the report in the web page. The web app was written in Node.js and deployed to Azure.

Power BI Embedded report in web page

Architecture

We leveraged a number of Azure components in building this solution, including:

  • Azure IoT Hub
  • Azure Stream Analytics
  • Azure Functions
  • Azure Service Bus
  • Azure SQL Database
  • Azure DocumentDB
  • Web Apps feature of Azure App Service (Azure platform as a service, or PaaS)
  • Power BI Embedded workspace
  • Azure Active Directory
  • Azure Storage
  • Azure resource groups

The architecture can be seen in the following diagram.

Homebase architecture

Eventually, the Homebase team will route more record types and monitor additional conditions that require cloud-to-device messages, and will build out additional capabilities on Azure.

Key learnings

  • The inputs, queries, and outputs in a Stream Analytics job can be edited only when the job is stopped, so it makes sense to create different jobs for different data stores. In our case, we created a separate Stream Analytics job for each SQL table we worked with, one for output to Service Bus and another for DocumentDB.
  • By default, a single consumer group is created when an IoT hub is created, and each consumer group is limited to five concurrent readers. When creating an IoT hub (or afterward, if needed), we recommend creating a consumer group for each anticipated Stream Analytics job.
  • The Stream Analytics query language has a number of built-in functions for converting data types and for working with strings and arrays. Because IoT data is often optimized for compactness, these built-in functions are useful for reformatting data for downstream usage.
  • Stream Analytics also allows for JavaScript user-defined functions (UDFs) to be defined and used as part of Stream Analytics queries. For example, you could use a UDF to convert inbound hexadecimal input into decimal for storage in a database.
  • Although at the time of this writing you cannot upload a Power BI Embedded report via the Azure portal, the Power BI CLI allows you to easily accomplish this task.
  • The IoT hub can be read by .NET and Java clients via REST APIs as well as Azure Stream Analytics. Stream Analytics is a quick and effective mechanism for extracting data from the IoT hub and getting this data to downstream data sinks and systems.
  • You can start a Stream Analytics job now, at a specific date and time, or from when the job was last stopped.
  • By default, data written to the IoT hub stays in the hub for 24 hours. Because you can start a Stream Analytics job at a specific date and time, you can write data to an IoT hub and reuse it repeatedly over a 24-hour period for testing.
  • When writing the SELECT statement in Stream Analytics queries, you can rename columns and cast data as you write to the data store.
  • Azure Functions provides serverless computing; we used this capability to quickly set up a trigger so that a record written to a Service Bus topic fires an Azure function that sends a cloud-to-device message.
  • Service Bus provides a reliable message-delivery service. Service Bus is a brokered communication mechanism, similar to a postal service in the physical world. Postal services make it easy to send various kinds of letters and packages with a variety of delivery guarantees, anywhere in the world.
  • Azure Functions makes it quick and easy to develop new capabilities without worrying about servers, virtual machines, or infrastructure.
  • When an Azure continuous deployment completes a deployment, you can easily fire webhooks as a notification mechanism.

Conclusion

Insights

  • The Homebase team was able to quickly modify their existing Node.js implementation to stream data into the IoT hub and receive cloud-to-device messages. As the data was being streamed into the IoT hub, we were able to quickly define the inputs, outputs, and queries to move data from the IoT hub into the appropriate sinks, such as SQL Database, DocumentDB, and Service Bus.
  • Our hackfest team was composed of local and remote team members in multiple geographic locations. By carefully and thoughtfully defining approaches, desired goals, and outcomes before starting; designing small testable tasks; and establishing predefined team check-ins, the distributed hackfest team was effective and productive.
  • Data coming from IoT devices is often optimized for small transmission sizes using techniques such as bitmasks. Although this was not the case for Homebase, we did use some of the Stream Analytics utility functions to manipulate strings and arrays, do type conversion, and apply aggregate functions.

How the learnings and insights can be applied elsewhere

A key factor in successful collaboration is to jointly design the goals and desired outcomes and determine the preliminary research and prerequisites before a hackfest. Chris Piggott and the Microsoft team had a number of planning calls leading up to the hackfest. During these calls, we focused on designing a small but complete end-to-end flow from the client through IoT Hub to the required data sinks, routing messages back to the client via cloud-to-device messaging as needed. By taking a “horizontal slice” (for example, one message type and all of its needs), we were quickly able to build and begin testing our end-to-end solution.

To accelerate things for the hackfest, some steps could have been done in code, such as registering devices and setting up, configuring, and publishing Power BI Embedded reports. However, given the short timeframe of the hackfest, UI tools (in the case of device registration) and a CLI tool (in the case of Power BI Embedded) were much easier and quicker than writing code for these tasks.

Being an agile startup, Homebase needed to quickly build and test a cloud architecture that allowed the streaming of IoT data from their solution to the cloud for capturing, processing, routing, and analytics. During this series of short hackfests, we were able to quickly build an end-to-end flow that allowed Homebase to

  • send IoT data to IoT Hub
  • identify and route data requiring special attention and notification to the Homebase solution for appropriate processing
  • route incoming IoT data to specific data sinks based on the incoming data
  • enable custom reporting for Homebase customers
  • build out an approach and model for supporting additional types of data and reacting to notable conditions in that data

Closing

During the hackfest, we accomplished each of these goals and laid the foundation for the full suite of Homebase IoT data to be ingested, routed, processed, and analyzed on Azure.

Chris Piggott, the Homebase CTO, used the Homebase product roadmap as an opportunity to build needed new capabilities on Azure. Managed services in Azure such as DocumentDB, SQL Database, Service Bus, Power BI Embedded reporting, Azure Functions, and Azure App Service greatly accelerated what we were able to build and accomplish. These managed services enabled us to quickly create and immediately begin using services foundational to our solution. As beneficial as quickly creating and using the services can be, the real benefit is that upkeep, patching, and maintenance is handled by Microsoft for each managed service. Additionally, these managed services can be easily scaled or even auto-scaled as demand grows.

After the hackfest, Chris said, “The time it took to go from an idea to a fully functioning, end-to-end product was incredible. It was extremely easy to setup Azure IoT Hub and start streaming data into it, as well as start receiving messages back from IoT Hub, turning data into actionable information to be used in our solution.”

Additional resources