Thursday, April 16, 2015

Windows IoT on Galileo – Using Event Hubs and Stream Analytics



This blog explains how to access Azure Event Hubs from a Galileo board integrated with the Adafruit 10-DOF IMU sensor board, and how to use Azure Stream Analytics to pipe the sensor data stream into Azure Storage.

Prerequisites

Integrating Adafruit 10-DOF IMU with Galileo
  • Refer to the blog post below.

Creating Azure Service Bus Event hub
  • Refer to the Event Hubs getting started document and follow the steps up to step 6.
  • In step 6, create a policy named “manage policy” with Manage, Send, and Listen rights.
  • In step 7, copy the access connection string for “manage policy” (its general shape is shown after this list).
  • Click the CONSUMER GROUPS tab (next to CONFIGURE), click CREATE at the bottom of the page, and supply a consumer group name in the Create a Consumer Group dialog. This consumer group will be used when configuring Stream Analytics for this event hub.
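For reference, the copied connection string generally has the following shape; the namespace and key below are placeholders. The namespace, the policy (key) name, and the shared access key embedded in it are the values you will need later in GalileoEventHub.exe.config.

    Endpoint=sb://yournamespace.servicebus.windows.net/;SharedAccessKeyName=manage policy;SharedAccessKey=<your-shared-access-key>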

Create an Azure Storage account
This sample application requires an Azure Storage account (or the storage emulator) for maintaining the application state. You can use an existing storage account or follow the steps below to create one:
  • From the portal, create a new Storage account by clicking NEW, DATA SERVICES, STORAGE, QUICK CREATE, and following the instructions.
  • Select the newly created Storage account, and then click MANAGE ACCESS KEYS at the bottom of the page.
  • Copy the Storage account name and one of the access keys.
  • Open Azure Storage Explorer (a third-party open-source tool available on CodePlex).
  • Go to Add Account and fill in the storage account name and access key.

Fig 1: Access storage services account
  • Now you can access the storage account; it shows blob containers, queues, and tables.
  • Select Tables, click the New button, and create a new table named galileosensortable.

Fig 2: Creating Table through Azure storage explorer
  • Leave the table empty; there is no need to enter entities, since they will be created automatically by Stream Analytics. This table will be used when configuring the output stream in Stream Analytics.

Creating Azure Stream Analytics job
  • Refer to the “Provisioning the Stream Analytics job” topic in the Stream Analytics getting started document. I named my Stream Analytics job “Galileosensorstream” for this demo.
  • Refer to the “Specify job input” topic in the same document. Here are the details I used for this demo; the input alias name is “galileoinputevents”.
Fig 3: Configuring job Input
  • Configure the job output to use the Azure storage table created above:
    • Click OUTPUT from the top of the page, and then click ADD OUTPUT. 
    • Select Table Storage option. 
    • Enter the alias name GalileoOutputStream and select the subscription in which the storage account was created.
    • Select the STORAGE ACCOUNT you have created in this subscription. 
    • The STORAGE ACCOUNT KEY will be filled in automatically based on the selected account.
    • Select the table and fill in the partition key and row key. Below is the configuration used for this demo.
Fig 4: Configuring job output
  • Select the QUERY option and enter the required query. Here I redirect the entire input stream to the table; see the query below (a sketch of it follows the figure).
Fig 5: Configuring Job Query
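The figure shows the query as configured in the portal. Since the job simply passes every input event through to the table, the query is most likely a plain pass-through over the aliases defined above; treat the following as a sketch rather than the exact text in the figure:

    SELECT
        *
    INTO
        GalileoOutputStream
    FROM
        galileoinputevents

Note that the partition key and row key chosen in the table output settings should correspond to columns that actually appear in the query output.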
  • Now you can run the Stream Analytics job by clicking START, as shown below. The job is then ready to receive data from the event hub (input) and push it to the storage service based on the query.
Fig 6: Start the Job
Building the Galileo Wiring Application
  • Create a “Galileo Wiring App” solution; let’s name it “GalileoEventhub”. You can download it from the CodePlex link below.
  • Main.cpp contains the integrated source code that reads the Adafruit sensors and sends the data from the Galileo to the Azure Service Bus event hub. Please refer to the blogs listed above to learn more about the Adafruit sensor integration. (A minimal sketch of the send path is shown after Fig 8 below.)
  • Modify the GalileoEventHub-WinIoT\GalileoEventHub\GalileoEventhub\GalileoEventHub.exe.config file to fill in the Azure Event Hub access details before running your program. Namespace is your Service Bus namespace and Keyname is your policy name, here “manage policy”. Copy the shared access key into the key value and the event hub name into the EventHubName tag (a sketch of the file follows the figure).
Fig 7: GalileoEventHub.exe.config XML file
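For illustration only, the config file is an XML file along the following lines. The element layout below is an assumption about the CodePlex project’s file and the values are placeholders; use the actual file from the download and only edit the values.

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <!-- Placeholder values: replace with your own Event Hub details -->
      <Namespace>yournamespace</Namespace>            <!-- Service Bus namespace -->
      <Keyname>manage policy</Keyname>                <!-- policy name created earlier -->
      <Key>your-shared-access-key</Key>               <!-- shared access key for that policy -->
      <EventHubName>youreventhub</EventHubName>       <!-- event hub name -->
    </configuration>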
  • Navigate to \\mygalileo\c$\test in file explorer (create the “test” folder if necessary).
  • Copy the GalileoEventHub.exe.config XML file to \\mygalileo\c$\test folder.
  • Build the application and run it on Galileo.
  • You can now see the data in the Azure storage table created above, written there by Stream Analytics. The picture below shows the data in the Azure storage table, viewed with the Azure Storage Explorer application.
Fig 8: Galileo sensor data on Azure storage table from Stream Analytics
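To give an idea of what the send path in Main.cpp does, here is a minimal, self-contained sketch (not the project’s actual code) that posts one JSON sensor reading to the Event Hubs REST endpoint using WinHTTP. The host, path, and pre-generated SAS token are placeholders; the real application reads its settings from GalileoEventHub.exe.config, and generating the SAS token from the shared access key (an HMAC-SHA256 signature) is omitted here for brevity.

    #include <windows.h>
    #include <winhttp.h>
    #include <stdio.h>
    #include <string.h>

    #pragma comment(lib, "winhttp.lib")

    // Placeholders -- substitute the values from GalileoEventHub.exe.config.
    static const wchar_t* EVENTHUB_HOST = L"yournamespace.servicebus.windows.net";
    static const wchar_t* EVENTHUB_PATH = L"/youreventhub/messages";
    // Pre-generated SAS token for the "manage policy" rule, plus the content type
    // Event Hubs expects for the Send Event REST call.
    static const wchar_t* REQUEST_HEADERS =
        L"Authorization: SharedAccessSignature sr=...&sig=...&se=...&skn=manage policy\r\n"
        L"Content-Type: application/atom+xml;type=entry;charset=utf-8";

    // Posts one JSON-formatted sensor reading to the event hub; returns true on HTTP 201.
    bool SendReading(double pitch, double roll, double heading)
    {
        char body[256];
        sprintf_s(body, sizeof(body),
                  "{\"pitch\":%.2f,\"roll\":%.2f,\"heading\":%.2f}",
                  pitch, roll, heading);

        bool ok = false;
        HINTERNET hSession = WinHttpOpen(L"GalileoEventHub",
                                         WINHTTP_ACCESS_TYPE_DEFAULT_PROXY,
                                         WINHTTP_NO_PROXY_NAME, WINHTTP_NO_PROXY_BYPASS, 0);
        HINTERNET hConnect = hSession ? WinHttpConnect(hSession, EVENTHUB_HOST,
                                                       INTERNET_DEFAULT_HTTPS_PORT, 0) : NULL;
        HINTERNET hRequest = hConnect ? WinHttpOpenRequest(hConnect, L"POST", EVENTHUB_PATH,
                                                           NULL, WINHTTP_NO_REFERER,
                                                           WINHTTP_DEFAULT_ACCEPT_TYPES,
                                                           WINHTTP_FLAG_SECURE) : NULL;
        if (hRequest &&
            WinHttpSendRequest(hRequest, REQUEST_HEADERS, (DWORD)-1L,
                               body, (DWORD)strlen(body), (DWORD)strlen(body), 0) &&
            WinHttpReceiveResponse(hRequest, NULL))
        {
            DWORD status = 0, size = sizeof(status);
            WinHttpQueryHeaders(hRequest,
                                WINHTTP_QUERY_STATUS_CODE | WINHTTP_QUERY_FLAG_NUMBER,
                                WINHTTP_HEADER_NAME_BY_INDEX, &status, &size,
                                WINHTTP_NO_HEADER_INDEX);
            ok = (status == 201);   // Event Hubs returns 201 Created on success
        }
        if (hRequest) WinHttpCloseHandle(hRequest);
        if (hConnect) WinHttpCloseHandle(hConnect);
        if (hSession) WinHttpCloseHandle(hSession);
        return ok;
    }

In the wiring app, loop() would compute pitch, roll, and heading from the 10-DOF IMU readings and call SendReading() at whatever interval you want the event hub to receive data.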

Related links

2 comments:

Unknown said...

I want to test Azure Event Hub on Galileo Gen 2, and I downloaded the project from https://galileoeventhub.codeplex.com/

It builds with no errors or warnings. However, when I download and start debugging, VS shows me errors as follows:

'GalileoEventHub.exe' (Win32): Loaded 'C:\test\GalileoEventHub.exe'. Symbols loaded.
'GalileoEventHub.exe' (Win32): Loaded 'C:\Windows\System32\ntdll.dll'. Cannot find or open the PDB file.
'GalileoEventHub.exe' (Win32): Loaded 'C:\Windows\System32\kernel32.dll'. Cannot find or open the PDB file.
'GalileoEventHub.exe' (Win32): Loaded 'C:\Windows\System32\kernel32legacy.dll'. Cannot find or open the PDB file.
'GalileoEventHub.exe' (Win32): Loaded 'C:\Windows\System32\KernelBase.dll'. Cannot find or open the PDB file.

I found that these files can be found under 'C:\Windows\System32'; the problem is that VS can't get them.

So would you please help me with this? Thanks! I follow you on Yammer, haha...

Vinoth said...

Hi Jiong,
I responded to your forum post. Please have a look.
Regards,
Vinoth