[IoT] Aquarium monitor; Sending commands back to your hardware

Let’s start with some awesome news first! All of the sources for this project are now live on GitHub! That doesn’t mean I’m close to being finished, by the way, but it does allow you to take a look and maybe even contribute should you want to. This includes all of the code from the previous blog posts as well, so go over and take a look:


Alright, so let’s get down to what this post is all about: sending commands back to your hardware. Read More

Web API controller hosted in Azure not respecting [AllowAnonymous]

Working on a project, I encountered a situation I couldn’t wrap my head around. The project includes a (rather simple) ASP.NET Web API project which is published to an Azure App Service instance. Up to now, I had secured all of the endpoints I was calling using Azure AD authentication, which is a breeze to set up. But now I wanted to make one specific controller available for unauthenticated calls as well. Normally that’s rather simple: you just add the [AllowAnonymous] attribute to the controller (or a specific action) and voilà, authentication no longer applies to it. So I did, and published this to Azure, only to be returned 401 Unauthorized responses. Hmmm.  Read More
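For reference, this is the usual pattern the attribute is meant to enable; the controller and action names here are illustrative, not taken from the actual project:

```csharp
using System.Web.Http;

// [Authorize] protects the whole controller; [AllowAnonymous] on an
// action is supposed to exempt that action from authentication.
[Authorize]
public class StatusController : ApiController
{
    [AllowAnonymous]
    public IHttpActionResult Get()
    {
        // This endpoint should be reachable without a token.
        return Ok("pong");
    }
}
```

Locally this works exactly as documented, which is what made the 401 responses from Azure so puzzling.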

[IoT] Replacing webjob with Azure function

This entry is part 8 of 9 in the series Azure Aquarium Monitor

In this blog post I will not be adding any new functionality to my aquarium monitor project. Instead, I’m going to replace already existing functionality. Not because that’s needed, but just because I can 😉 We’ll be looking at replacing our webjob with something new called Azure Functions.

If you didn’t read it or don’t recall, check out the post I wrote on how I implemented notifications. I made use of a webjob to monitor the event hub for incoming notifications (generated by Azure Stream Analytics) and send those to the notification hub. A few lines of code also constructed the message to send. Read More

[IoT] Stream Analytics reference data updates

If you’ve read my post on Azure Stream Analytics, you’ve seen how you can configure a reference data blob to be used to compare incoming IoT data with certain thresholds. The reference data is stored in Azure blob storage, within a certain structure of folders.

Now what about updating that file? I found that updates I made to my blob weren’t picked up by the ASA job as I published them. The folder structure I was using was like this:


ASA monitors that pattern for changes, matching the {date} and {time} parameters against the current date and time. This way you can change the reference data effective right now (using DateTime.Now), but you can also schedule changes in the future. Also, when you start a job with a start date in the past, ASA will use the correct reference data for the date and time of the incoming stream data. More information about this can be found here: https://azure.microsoft.com/en-us/documentation/articles/stream-analytics-use-reference-data/. Read More
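As an illustration of such a path pattern (the container and file names here are made up, not the ones from my job), the reference data input of an ASA job could be configured like this:

```
Path pattern:  referencedata/{date}/{time}/thresholds.csv
Date format:   YYYY-MM-DD
Time format:   HH-mm
```

With this configuration, a blob uploaded to `referencedata/2016-05-01/10-00/thresholds.csv` would become the active reference data at 10:00 on May 1st; uploading a new blob under a future date/time folder schedules the update instead of overwriting the current file in place.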

[IoT] Aquarium monitor; mobile app, Cordova style

This entry is part 6 of 9 in the series Azure Aquarium Monitor

Finally! Post #6 in my blog series on building an aquarium monitor solution using Azure! In my last post we created a Web API project which provides an API for any application to use. In this post, we’ll take a look at creating an Apache Cordova application for use on mobile phones. This app will consume the API and voilà, we’ll have our readings displayed in a smartphone app. This completes the journey from reading data, sending it to the cloud and consuming it on a mobile phone (or any other app, for that matter). In an upcoming post, I’ll describe how to build out this scenario further, for instance by adding notifications to alert you when certain readings appear to be off.  Read More

[IoT] Aquarium monitor; WebAPI for mobile use

This entry is part 5 of 9 in the series Azure Aquarium Monitor

This is post #5 in my blog series about the subject. In the previous post I explained how Azure Stream Analytics processes all of the incoming messages and places them in the sinks you configure. In my case, messages are now being written to Azure storage blobs, where they reside as CSV files. In this post, we’ll take a look at the other side of things: getting the data out again for display purposes.

Read More

[Azure] Setting up the Azure billing alert preview feature

Those of you who might follow me on Twitter (@jsiegmund) might have seen this tweet about my Azure credits running out fast:

This was caused by the Azure IoT suite remote monitoring sample, which eats up your credits very fast when you leave it running. Now wouldn’t it be nice if there were some way to get alerted when your credits begin running low, before you run out completely? Well turns out Azure has such a feature in preview. It’s called “billing alert service” and you can find instructions on how to activate it here.

Note: I received the confirmation e-mail quite fast, but had to wait longer before the option actually started working in the portal. It took a couple of days before the “ALERTS” option appeared, so be patient.


Great feature for folks like me who regularly try out all kinds of features and might not keep daily track of how many credits they’re spending!

One option I am missing, though, is setting an alert based on your spending cap. For instance, I currently have a spending cap of 130 euros and would like an alert as soon as 75% of those credits have been used. Of course I can set an alert at 97.50 instead, but in my opinion it would be cleaner if you could also enter a percentage. If you agree, you can vote on this feature request here: https://feedback.azure.com/forums/34192–general-feedback/suggestions/11586345-set-billing-alerts-based-on-spending-cap.

[SP2013] Recipe: one SP2013 development machine in Azure

If you’re a developer looking for an easy way to create a development machine, this is the post you want to read. Especially when you also have an MSDN subscription lying around somewhere. What we’re going to do is set up a ready-to-use developer box in Azure. This recipe will require approximately an hour of your time.


  • A valid Azure subscription. We’re going to create an A4 (x-large) virtual machine. As long as you turn it off when you’re not using it, that should fit into any MSDN subscription level.
  • An MSDN subscription.
  • Some scripts, download.
  • The Azure PowerShell environment, download.
  • Your Azure publishing profile, download.


  • Download all the bits and install the Azure PowerShell package.
  • Start the Windows Azure PowerShell prompt.
  • First, make sure Azure is connected to your account. You can do this by running Add-AzureAccount.
  • Then, run Get-AzureSubscription to find the name of your subscription; you’ll need it in the next command.
  • Navigate to the folder where you downloaded the scripts.
  • Run the following command:
    .\CreateSharePointDeveloperMachine.ps1 -imageName "03f55de797f546a1b29d1b8d66be687a__Visual-Studio-2013-Premium-Update2-AzureSDK-2.3-WS2012" -azurePublishSettingsFile C:\users\repsaj\Downloads\my.publishsettings -subscriptionName "Windows Azure MSDN - Visual Studio Premium" -storageAccountName "repsaj" -vmName "repsajdev" -vmSize "ExtraLarge" -adminUserName "jasper" -adminUserPassword "Pass@word1" -localSPFarmAccountName "sp_farm" -localSPFarmAccountPassword "Pass@word1"
  • Things to review:
    • The image name includes the type of Visual Studio instance (Premium in the example), you might want to change that according to your subscription.
    • The publish settings file; this is the one you downloaded as a prerequisite.
    • Your subscription name; as stated above.
    • You should change the passwords obviously.

That’s all! Run the command and it will spin up a brand new VM in Azure. That VM will have SharePoint and Visual Studio installed. The installation will be in a vanilla state, so you can configure it the way you like. My advice: start by copying an spautoinstaller folder to the machine and using that to configure the SharePoint instance. That way you have a repeatable result for the next time you want to spin up a machine with a similar config.

For more information about the scripts, along with instructions how to setup a box with AD or other types of machines (like Web and SQL), check out this link: http://visualstudio2013msdngalleryimage.azurewebsites.net/.

Bon appétit!


WCF service using Azure relay and ADFS authentication (1/2)

What I’m going to write about in this two-part article is what could be considered quite a common scenario. Your company wants to expose data to its employees outside of the internal network. Take a mobile app (or Windows 8 app), for instance, which gets its data from a legacy back-end system. In “old-fashioned” scenarios, you would:

  • Create a WCF webservice to host the data.
  • Deploy the webservice into some kind of secured network zone (DMZ).
  • Kindly ask the firewall admins to open up a port to your service.
  • Register a DNS address which makes it easier to call your service.
  • Request SSL certificates and secure your service so only encrypted data is sent.

In this article, I’ll write about how Azure can make your life easier by handling some of these things in a different way. You, as a reader, should be familiar with WCF and have a basic understanding of authentication schemes. Some knowledge of what ADFS is would be handy too.

There are two Azure topics I want to talk about: relay and ACS. In the first part of this article, we’ll talk about relay. How to set up ACS with ADFS will be the topic of the second part.


Setting up the service bus in Azure

Relay is a technique which enables you to register your WCF service with Azure. Once registered, clients call Azure instead of your service directly. And because the service initiates the communication, you only need outbound access to the internet, not inbound access from it. In Azure, this is handled by the service bus, which is a multifunctional messaging system; relay is only one part of its functionality.

Here’s what happens in general:

  1. The service registers itself with Azure.
  2. Azure acknowledges the registration. The connection is kept open for future use.
  3. A client now connects to the endpoint registered with Azure and sends a webservice request.
  4. The request is relayed to the local WCF service.
  5. The result is passed back to Azure.
  6. The service bus passes the result to the client.

For you as a developer, this has some advantages:

  • The call to Azure is secured with HTTPS, so all data going to Azure is always encrypted. No need to purchase an SSL certificate.
  • The same goes for data going to the client; that’s encrypted as well.
  • No need to open up any firewall ports. Default internet access is enough to make this work.
  • Although I haven’t thoroughly tested this, it should work in all kinds of proxy-enabled scenarios too.

Ok, cool. So what do you need to make this work? To begin with, you’ll need a Windows Azure account. Open up the management portal (https://manage.windowsazure.com/) and go to the “service bus” section. Click “Create” and choose a namespace for your service bus instance:

You can request the access key once you’ve created the service bus instance. This key is used to authenticate the server and client with the service bus. In part 2, we’re going to use ACS / ADFS for that, in which case the shared secret access key is irrelevant.

Creating a relayed WCF service

Once you’ve got your service bus namespace set up, it’s time to create a service. The easiest way is to create a console application which uses a ServiceHost object to self-host the service. The most important things here are the bindings. You will have to use two:

  • One normal binding (like BasicHttpBinding) which hosts the service in a normal way, for instance on http://localhost:1234.
  • A matching relay binding (in this case BasicHttpRelayBinding) which is used to connect to Azure and relay the service.

Assuming some WCF knowledge, creating a service host should be easy:
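The original code screenshot is missing here, but a minimal self-hosting sketch could look like this (AquariumService and IAquariumService are illustrative names, not necessarily those from the original sample):

```csharp
using System;
using System.ServiceModel;

class Program
{
    static void Main()
    {
        // AquariumService implements the shared contract IAquariumService.
        var serviceHost = new ServiceHost(typeof(AquariumService));

        // ... endpoints and bindings are registered here, see below ...

        serviceHost.Open();
        Console.WriteLine("Service running, press enter to stop.");
        Console.ReadLine();
        serviceHost.Close();
    }
}
```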

Registering the endpoints and bindings:
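The screenshot with this code is also missing; a sketch of the two endpoints, assuming the Microsoft.ServiceBus SDK and illustrative names ("yournamespace", "aquarium", the local port), could look like this:

```csharp
using System.ServiceModel;
using Microsoft.ServiceBus;

// Local endpoint with a plain BasicHttpBinding.
serviceHost.AddServiceEndpoint(
    typeof(IAquariumService),
    new BasicHttpBinding(),
    "http://localhost:1234/aquarium");

// Matching relay endpoint; the URI combines a scheme, the service bus
// namespace created earlier and an arbitrary service path.
var relayEndpoint = serviceHost.AddServiceEndpoint(
    typeof(IAquariumService),
    new BasicHttpRelayBinding(),
    ServiceBusEnvironment.CreateServiceUri("https", "yournamespace", "aquarium"));

// Authenticate with the service bus using the shared secret access key.
relayEndpoint.Behaviors.Add(new TransportClientEndpointBehavior
{
    TokenProvider = TokenProvider.CreateSharedSecretTokenProvider("owner", "<your key>")
});
```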

The things to notice:

  • The bindings need to match. So use a BasicHttpRelayBinding in combination with a BasicHttpBinding, not with a WSHttpBinding for instance.
  • Both endpoints register the same interface class, so both endpoints know which methods your service will provide.
  • For the relay binding, you specify the service bus address with the help of the CreateServiceUri method. This asks for a scheme, the namespace you created earlier and a service path which can be anything you like.

In the example for part 2 (ADFS), we’ll switch from using a SharedSecretTokenProvider to a SamlTokenProvider used with SAML tokens.

Basically, that’s all you need to do on the server side. Simple, right? You could create one extra endpoint to provide a mex (metadata exchange) binding. The code to do that:
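The original snippet is missing here; a common way to do this (addresses illustrative, matching the local endpoint above) is:

```csharp
using System.ServiceModel;
using System.ServiceModel.Description;

// Enable metadata publishing so tooling like "Add Service Reference"
// can generate a client from the local endpoint.
serviceHost.Description.Behaviors.Add(new ServiceMetadataBehavior
{
    HttpGetEnabled = true,
    HttpGetUrl = new System.Uri("http://localhost:1234/aquarium")
});

serviceHost.AddServiceEndpoint(
    typeof(IMetadataExchange),
    MetadataExchangeBindings.CreateMexHttpBinding(),
    "http://localhost:1234/aquarium/mex");
```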

I would advise using the mex endpoint only in development environments and leaving it out in production. But that’s up to you; it doesn’t really matter that much.

Add a serviceHost.Open(), run your console app and see if it properly registers itself with Azure. Once it does, you should be able to create a client and connect with it.

Creating a client to match

Creating a client follows the same principles. The client counterpart of a ServiceHost is the ChannelFactory. Because it shares the interface definition, it’s wise to keep that class in a shared library. But you could also make use of the local endpoint and the “Add Service Reference” functionality of Visual Studio, which will generate an interface class for you. As long as the interface class has the same methods and signatures as your service has, you’re good.

Now, as we did with the service, we need to specify the endpoint behaviour:
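The screenshot with this client code is missing; a sketch of the client side, reusing the illustrative names from the service examples above, could look like this:

```csharp
using System.ServiceModel;
using Microsoft.ServiceBus;

// Channel factory pointing at the relay address instead of the local one.
var factory = new ChannelFactory<IAquariumService>(
    new BasicHttpRelayBinding(),
    new EndpointAddress(
        ServiceBusEnvironment.CreateServiceUri("https", "yournamespace", "aquarium")));

// The client authenticates with the service bus just like the service did.
factory.Endpoint.Behaviors.Add(new TransportClientEndpointBehavior
{
    TokenProvider = TokenProvider.CreateSharedSecretTokenProvider("owner", "<your key>")
});

// Create a channel and call a method on the contract.
IAquariumService client = factory.CreateChannel();
var readings = client.GetLatestReadings();
```

GetLatestReadings is a made-up contract method; any method on your shared interface works the same way.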

And with that setup, you can simply create a channel and call the method of choice!

So the cool thing is, apart from some configuration and setup, it’s really not that hard to setup a relayed service. In my opinion, this is far less complicated than convincing the firewall guys to open up ports for you 😉

This service is secure: all communication is encrypted via SSL, and both client and server need to know the shared access key to be able to connect to the service bus. But this method of authentication is pretty limited and not really usable in mobile app scenarios (how do you prevent an unauthorized user from using the app?). So in the follow-up of this article I will explain how to extend the sample with ADFS-integrated authentication. That article will also contain a full code sample.

Update: and here’s part two.