[Azure] Setting up continuous integration for (existing) Azure Functions

In a previous blog I showed how you might replace your existing Azure WebJobs with Azure Functions. If you’re creating these functions as part of a project, you’re bound to have some sort of source control solution in place, and you might want to deploy your code from it automatically (a.k.a. continuous integration). Luckily, Azure Functions supports this; I’ll detail how to set it up in this blog. Read More

[IoT] Aquarium monitor; Sending commands back to your hardware

Let’s start with some awesome news first! All of the sources for this project are now live on GitHub! That doesn’t mean I’m close to being finished, by the way, but it does allow you to take a look and maybe even contribute should you want to. This includes all of the code from the previous blog posts as well, so go over and take a look:

http://github.com/jsiegmund/submerged

Alright, so let’s get down to what this post is all about: sending commands back to your hardware. Read More

Web API controller hosted in Azure not respecting [AllowAnonymous]

Working on a project, I encountered a situation I couldn’t wrap my head around. The project includes a (rather simple) ASP.NET Web API project which is published to an Azure App Service instance. Up to now, all of the endpoints I was calling were secured using Azure AD authentication, which is a breeze to set up. But now I wanted to make one specific controller available for unauthenticated calls as well. Normally that’s rather simple: you just add the [AllowAnonymous] attribute to the controller (or a specific action) and voila, authentication no longer applies to it. So I did, published this to Azure, and got nothing but 401 Unauthorized responses back. Hmmm. Read More
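
For reference, this is roughly what that normally looks like in ASP.NET Web API; the controller and action names below are hypothetical, made up purely for illustration, not taken from the actual project:

    using System.Web.Http;

    // Hypothetical controller, for illustration only.
    [Authorize] // every action on this controller requires an authenticated caller...
    public class StatusController : ApiController
    {
        // ...except this one, which explicitly allows anonymous calls.
        [AllowAnonymous]
        [HttpGet]
        public IHttpActionResult Ping()
        {
            return Ok("pong");
        }
    }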

[IoT] Replacing webjob with Azure function

This entry is part 8 of 9 in the series Azure Aquarium Monitor

In this blog post I will not be adding any new functionality to my aquarium monitor project. Instead, I’m going to replace already existing functionality. Not because that’s needed, but just because I can 😉 We’ll be looking at replacing our webjob instance with something new called Azure Functions.

If you didn’t read it or don’t recall, check out the post I wrote on how I implemented notifications. I made use of a webjob to monitor the event hub for incoming notifications (generated by Azure Stream Analytics) and send those to the notification hub. A few lines of code also constructed the message to send. Read More
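
For context, here’s a minimal sketch of what such a webjob roughly looks like, assuming the WebJobs SDK event hub trigger and the Notification Hubs client; the event hub name, notification hub name and connection string below are placeholders, not the project’s actual values:

    using System.Collections.Generic;
    using System.Threading.Tasks;
    using Microsoft.Azure.NotificationHubs;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.ServiceBus;

    public static class NotificationFunctions
    {
        // Placeholder connection string and notification hub name.
        private static readonly NotificationHubClient Hub =
            NotificationHubClient.CreateClientFromConnectionString(
                "<notification-hub-connection-string>", "<hub-name>");

        // Runs for every alert message Stream Analytics writes to the event hub.
        public static async Task ProcessAlert([EventHubTrigger("notifications")] string message)
        {
            // Construct the push message and forward it as a template notification.
            await Hub.SendTemplateNotificationAsync(
                new Dictionary<string, string> { { "message", message } });
        }
    }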

[IoT] Stream Analytics reference data updates

If you’ve read my post on Azure Stream Analytics, you’ve seen how you can configure a reference data blob to be used to compare incoming IoT data with certain thresholds. The reference data is stored in Azure blob storage, within a certain structure of folders.

Now what about updating that file? I found that updates I made to the blob weren’t picked up by the ASA job when I published them. The folder structure I was using looks like this:

devicerules/{date}/{time}/devicerules.json

ASA monitors that pattern and picks up the blob whose {date} and {time} values match the current date and time. This way you can change the reference data effective right now (using the current date and time), but you can also schedule a change in the future. Also, when you start a job with a start date in the past, ASA will use the correct reference data based on the date and time of the incoming stream data. More information about this can be found here: https://azure.microsoft.com/en-us/documentation/articles/stream-analytics-use-reference-data/. Read More
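
To make that concrete, here’s a minimal sketch of publishing an updated rules file so ASA picks it up at a given moment. The container name, connection string and date/time formats below are placeholders and have to match whatever is configured on the ASA reference data input:

    using System;
    using System.IO;
    using System.Threading.Tasks;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    public static class ReferenceDataPublisher
    {
        public static async Task PublishAsync(string newRulesFile)
        {
            // Placeholder connection string and container name.
            var account = CloudStorageAccount.Parse("<storage-connection-string>");
            var container = account.CreateCloudBlobClient().GetContainerReference("<container>");

            // The moment the new rules should take effect, formatted to match {date} and {time}.
            var effective = DateTime.UtcNow;
            var path = $"devicerules/{effective:yyyy-MM-dd}/{effective:HH-mm}/devicerules.json";

            // ASA switches to this blob once the effective date/time is reached.
            await container.GetBlockBlobReference(path)
                           .UploadTextAsync(File.ReadAllText(newRulesFile));
        }
    }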

[IoT] Aquarium monitor; the Azure notification hub

This entry is part 7 of 9 in the series Azure Aquarium Monitor

Hey there! Welcome back again to post #7 in my Internet of Things aquarium monitor series. In this series I’m explaining how to use Windows 10 IoT and Azure to read out sensor data, process it and act on it. And that “act on it” part is what we’ll take a look at in this post! Read More

[IoT] Aquarium monitor; mobile app, Cordova style

This entry is part 6 of 9 in the series Azure Aquarium Monitor

Finally! Post #6 in my blog series on building an aquarium monitor solution using Azure! In my last post we created a Web API project which provides an API for any application to use. In this post, we’ll take a look at creating an Apache Cordova application for use on mobile phones. This app will consume the API and voila, we’ll have our readings displayed in a smartphone app. That completes the journey from reading data, to sending it to the cloud, to consuming it on a mobile phone (or any other app for that matter). In a future post, I’ll describe how to build out this scenario further, for instance by adding notifications to alert you when certain readings appear to be off. Read More

[IoT] Aquarium monitor; WebAPI for mobile use

This entry is part 5 of 9 in the series Azure Aquarium Monitor

This is post #5 in my blog series about the subject. In the previous post I explained how Azure Stream Analytics processes all of the incoming messages and places them in the sinks you configure. In my case, messages are now being written to Azure storage blobs, where they reside as CSV files. In this post, we’ll take a look at the other side of things: getting the data out again for display purposes.

Read More

[Azure] Setting up the Azure billing alert preview feature

Those of you who follow me on Twitter (@jsiegmund) might have seen this tweet about my Azure credits running out fast:


This was caused by the Azure IoT Suite remote monitoring sample, which eats up your credits very fast when you leave it running. Now wouldn’t it be nice if there were some way to get alerted when your credits start running low, before you run out completely? Well, it turns out Azure has such a feature in preview. It’s called the "billing alert service" and you can find instructions on how to activate it here.

Note: I received the confirmation e-mail quite fast, but had to wait longer before the option actually started working in the portal. It took a couple of days before the “ALERTS” option appeared, so be patient.

Great feature for folks like me who regularly try out all kinds of features and might not keep daily track of how many credits they’re spending!

One option I am missing, though, is setting an alert based on your spending cap. For instance, I currently have a spending cap of 130 euros and would like an alert as soon as 75% of those credits have been used. Of course I can set an alert at 97.50 instead, but in my opinion it would be cleaner if you could also enter a percentage. If you agree, you can vote on this feature request here: https://feedback.azure.com/forums/34192--general-feedback/suggestions/11586345-set-billing-alerts-based-on-spending-cap.

[SP2013] Recipe: one SP2013 development machine in Azure

If you’re a developer who’s looking for an easy way to create a development machine, this is the post you want to read. Especially if you also have an MSDN subscription lying around somewhere. What we’re going to do is set up a ready-to-use developer box in Azure. This recipe will require approximately an hour of your time.

Ingredients:

  • A valid Azure subscription. We’re going to create an A4 (extra large) virtual machine. As long as you turn it off when you’re not using it, that should fit into any MSDN subscription level.
  • An MSDN subscription.
  • Some scripts, download.
  • The Azure PowerShell environment, download.
  • Your Azure publishing profile, download.

Recipe:

  • Download all the bits and install the Azure PowerShell package.
  • Start the Windows Azure PowerShell prompt.
  • First, make sure your Azure account is connected. You can do this by running Add-AzureAccount.
  • Then, run Get-AzureSubscription to find the name of your subscription; you’ll need it in the next command.
  • Navigate to the folder where you downloaded the scripts.
  • Run the following command:
    .\CreateSharePointDeveloperMachine.ps1 -imageName "03f55de797f546a1b29d1b8d66be687a__Visual-Studio-2013-Premium-Update2-AzureSDK-2.3-WS2012" -azurePublishSettingsFile C:\users\repsaj\Downloads\my.publishsettings -subscriptionName "Windows Azure MSDN - Visual Studio Premium" -storageAccountName "repsaj" -vmName "repsajdev" -vmSize "ExtraLarge" -adminUserName "jasper" -adminUserPassword "Pass@word1" -localSPFarmAccountName "sp_farm" -localSPFarmAccountPassword "Pass@word1"
  • Things to review:
    • The image name includes the type of Visual Studio instance (Premium in the example); you might want to change that to match your subscription.
    • The publish settings file; this is the one you downloaded as a prerequisite.
    • Your subscription name; as stated above.
    • You should obviously change the passwords.

That’s all! Run the command and it will spin up a brand new VM in Azure. That VM will have SharePoint and Visual Studio installed. The installation will be in a vanilla state, so you can configure it the way you like. My advice: start by copying a spautoinstaller folder to the machine and using that to configure the SharePoint instance. That way you have a repeatable result for the next time you want to spin up a machine with a similar config.

For more information about the scripts, along with instructions on how to set up a box with AD or other types of machines (like Web and SQL), check out this link: http://visualstudio2013msdngalleryimage.azurewebsites.net/.

Bon appétit!