[Azure] Using multiple accounts side-by-side with Chrome

If you’re in the Microsoft Azure or Office365 space, chances are you have a couple of accounts you use to access these services: your company account, an MSDN subscription, maybe some customer accounts, a couple of demo tenants, etc. You’ll also know that switching between accounts is a real pain. You need to log out, log in again, lose all of your session cookies and get automatically logged out from other services as well. Pretty annoying.

If you’ve got Chrome installed, you’ve got the answer sitting right there on your desktop already. Open up the settings window and find the “People” section. This allows you to create multiple profiles for different users. The nice thing is that these users do not share any cookies or other session data. So a Chrome instance for user A does not interfere with a second instance for user B. It’s like running InPrivate mode, but without the need to log in again each time you fire it up.
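As an aside, each profile can also get its own desktop shortcut by launching Chrome with command-line switches. A sketch (the profile directory names are examples, and the executable path depends on your install):

```shell
# Start Chrome with a named profile from your user data folder
chrome.exe --profile-directory="Profile 1"

# Or isolate an account completely in its own user data directory
chrome.exe --user-data-dir="C:\ChromeProfiles\DemoTenant"
```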


Super simple trick, but a real time saver so I thought I’d share. Enjoy!

[O365] Using SharePoint boolean fields with Microsoft Flow

My previous post just now was on the topic of Microsoft Flow, the workflow-style application that allows you to perform “if this, then that” style logic linking different applications together. Basically, Flow provides a way of automating actions by having a set of triggers, some logic and APIs to perform actions. It wraps all of this in a nice, easy-to-use interface, making this functionality something pretty much everyone can leverage. Power to the business!

In this post I want to show how I created a real-life flow to automate a process for expense declarations. The process is a really simple one:

  • We’ve created an Expense Declarations library on SharePoint.
  • We added a new Expense Declaration content type which has an Excel template for the declaration.
  • We also added a boolean field “Ready” which signals the expense form is ready for processing.
  • The form should now be sent to the person handling the declarations. Of course it would be even better to send it directly into an API, but unfortunately that’s not available for us.

As said, the basic elements of a flow are a trigger, some logic (conditions) and actions. Let’s go!


Defining the trigger

To create the flow, we head over to flow.microsoft.com and, after signing in (or up), begin with a blank flow. The first thing we add is the trigger “When an existing item is modified”. This is because:

  1. The “created” trigger will fire off immediately after the form is created, when it is probably still empty.
  2. The trigger for a modified document will not contain the correct information: our Ready field will not be present. This is supposed to change in the future, though.

So we set up the existing item modified trigger:


Note: because your library is a library (and not a list), it might not show up in the suggestions. That doesn’t mean you cannot use it, though: just type in the name and you should be good to go.


Creating a condition

Next, we need to set up the condition. We want the declaration to be sent only when the Ready field is set to Yes. Because the value is stored as a boolean, the field value sent to Flow will be “true”. You can check this by running your flow (trigger it from SharePoint after saving) and clicking the trigger to inspect the values coming in:


Check out the value for Ready:
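The trigger body is just JSON; roughly this shape (my illustration, not the connector’s exact output — the fields besides Ready will differ per library):

```json
{
  "ID": 12,
  "Title": "Expense declaration March",
  "Ready": true,
  "Modified": "2016-10-03T09:41:00Z"
}
```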


So now the most straightforward thing to do would be to set up the condition like this:


But this does not work. I suspect the engine handles “true” as a string, which would give a comparison of "true == 'true'", which is false. To fix this, put the editor in advanced mode and use the following expression: @equals(triggerBody()?['Ready'], bool(1)).


bool(1) will convert to true, so our comparison should now be "true == true" whenever the Ready field is set to Yes in SharePoint.


Setting up the action

Lastly, I created a simple e-mail action to send a notification to the correct user. Ideally I wanted to attach the file contents to that e-mail, but that isn’t possible (yet) because the “item modified” trigger is not aware of a file. I tried several ways to work around this but didn’t succeed. You can probably get there with something custom like an Azure Function, but for now the plain old e-mail will do. Simply set up an Office365 e-mail action to inform the correct user that a new declaration has been added, paste in the link to the library and you’re set.


When I find a way to attach the file to the e-mail or send a direct link to the file, I’ll update this post!

[IoT] Aquarium monitor; controlling LED from C#

It’s been a few months now since I posted the source code of Submerged on GitHub and started making some noise about it on hobbyist forums like UKAPS. My main goal in doing so was to gather feedback about which features would convince people to use a solution like Submerged. The number one requested feature by far: controlling LED lighting. Most aquascape tanks nowadays are lit by LED fixtures. Depending on your budget you can buy cheap or expensive ones, but they basically all do the same thing: control the output on a fixed number of channels.

I personally own a TC420 controller. It features 5 outputs, which I use to control RGB + warm white + cold white LED strips. The controller is programmed by plugging in a USB cable and using a pretty shitty piece of software to create time-bound programs. There’s room for improvement.
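To make the “time-bound programs” idea concrete: in code, each channel is essentially a list of (time, brightness) points that the controller interpolates between. A minimal JavaScript sketch of that idea (my own illustration, nothing to do with the TC420’s actual protocol):

```javascript
// Schedule points: minutes since midnight -> brightness (0-100) for one channel.
// channelLevel linearly interpolates the brightness for any moment of the day.
function channelLevel(schedule, minuteOfDay) {
  // Walk the points in time order.
  var points = schedule.slice().sort(function (a, b) { return a.time - b.time; });
  if (minuteOfDay <= points[0].time) return points[0].level;
  var last = points[points.length - 1];
  if (minuteOfDay >= last.time) return last.level;
  for (var i = 0; i < points.length - 1; i++) {
    var p0 = points[i], p1 = points[i + 1];
    if (minuteOfDay >= p0.time && minuteOfDay <= p1.time) {
      // Linear interpolation between the two surrounding points.
      var t = (minuteOfDay - p0.time) / (p1.time - p0.time);
      return Math.round(p0.level + t * (p1.level - p0.level));
    }
  }
}

// Example: warm white ramps up between 08:00 and 10:00, down between 20:00 and 22:00.
var warmWhite = [
  { time: 8 * 60, level: 0 },
  { time: 10 * 60, level: 80 },
  { time: 20 * 60, level: 80 },
  { time: 22 * 60, level: 0 }
];
```

Running this once a minute and writing the result to a PWM output per channel is basically all such a controller does.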


Speaking at Experts Live 2016

Thrilled to announce that my session has been selected for Experts Live 2016. As the website says: “Experts live is THE event covering Microsoft Azure, Office365, Enterprise mobility suite, Operations management suite, Hyper-V and Windows”. I will be bringing a little IoT to the mix with my session on how I used Azure IoT components to build submerged. Join me November 22nd, live!


How I built Submerged with Azure Functions, IoT and Stream Analytics


Tickets are on sale now, so get yours while you can.

Preliminary session planning: 14:45 – 15:45, Room 3




Changing jobs!

Having worked at Atos for four years now, early this year I began feeling it might be time for something new. In these past four years I’ve learned a lot about how large companies work, having some of Holland’s largest as my customers. It’s an intriguing world with its own problems, completely different from the small companies I used to work for before this job. Atos also gave me the chance to develop myself, shifting from being a hardcore developer to having more soft skills aimed at advising customers and guiding them in today’s and tomorrow’s world of technology. For this I’m very thankful; I really appreciated all of it.

But as they say, all good things must come to an end, and so I’ve decided it was time to move on. Next to saying goodbye to a job, I’ll also partly be saying goodbye to the product I’ve worked with for so many years. Yes, it’s time to let go of the “SharePoint Architect” title I was given 4 years ago. I never liked the architect part, by the way, but it came with the job… Many projects with many customers and probably even more colleagues later, focusing purely on SharePoint just doesn’t do it for me any more. If you’ve kept track of my previous blog posts, you’ve probably noticed a lot more emphasis on Microsoft Azure, and this is exactly what I’ll be moving to. I love the pace at which the Microsoft cloud platform is progressing and how analysts like Gartner increasingly confirm that Microsoft is a leader in this space. I’m not going to abandon Office365 completely though, as I feel it’s a very important part of the Microsoft cloud offering, especially when combined with all the goodies Azure has to offer. It’s the combination that makes it work, and that allows me to still leverage part of my existing skill set.

So in my next role I’m going to shift focus a bit: developing solutions for and based on Microsoft Azure, with Office365 when applicable. How exactly this will pan out, I’ll see in the coming months. I’d love to help customers find their way in all the things the MS cloud has to offer, making sure that solutions are future-ready and leverage the cloud the way they should, instead of simply shifting VMs over. I’m pretty excited about that, and as you might imagine I can’t wait to start!

In the next few weeks I’ve still got some project handovers to do and there’s a little break coming up. So that new start will be all fresh and spirited! Keep track of my blog or LinkedIn profile for more info! Talk later!

[Azure] Custom Function bindings + notification tags in Cordova apps

Previously I explained how I am using an Azure Notification Hub to send out notifications to a mobile application made with Cordova (read it here). This is cool, but in that scenario every notification was sent out to every client. This is fine for some situations, but in most cases you probably want some mechanism to send notifications to specific devices or a group of people. The classic example is news: you subscribe to a couple of subjects and receive notifications only for messages linked to one of those subjects. This post details how you can achieve this.

[Azure] Adding more intelligence to Stream Analytics queries

If you’ve read my previous blog on Azure Stream Analytics, you know how Stream Analytics can be used to process all sorts of incoming data and send the end result to one or multiple outputs. This is particularly useful for ensuring the right data is saved, manipulating the data before saving it, or filtering out only the data you’re interested in. And that last category is what I used it for: notifications! The query I used previously is not very dynamic; here’s a snippet:
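The snippet amounted to one hard-coded rule per sensor value, along these lines (an illustrative reconstruction; sensor names and thresholds are made up):

```sql
-- One hard-coded condition per sensor reading
SELECT
    deviceId,
    temperature,
    pH
INTO
    [notifications]
FROM
    [sensordata]
WHERE
    temperature > 30 OR pH < 6.5
```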

Works, but what if we start adding more sensor values? We’d need to change the query each time, and that’s not really what we want to do, right? Time for a better solution.


[Azure] Setting up continuous integration for (existing) Azure Functions

In a previous blog I showed how you might replace your existing Azure WebJobs with Azure Functions. If you’re creating these functions as part of a project, you’re bound to have some sort of source control solution in place, and you might want to deploy your code automatically (aka continuous integration). Luckily, Azure Functions supports this; I’ll detail how to set it up in this blog.

[IoT] Aquarium monitor; Sending commands back to your hardware

Let’s start with some awesome news first: all of the sources for this project are now live on GitHub! That doesn’t mean I’m close to being finished, by the way, but it does allow you to take a look and maybe even contribute should you want to. This includes all of the code from the previous blog posts as well, so head over and take a look:


Alright, so let’s get down to what this post is all about: sending commands back to your hardware.

[O365] Deploying a SharePoint theme / branding using JavaScript only

With provider hosted add-ins being introduced in SharePoint 2013, the world of SharePoint devs shifted to using provisioning schemes to get their stuff in SharePoint sites. And this worked, quite well i might add. You might have read my post on SPMeta2 vs PnP (which is a bit outdated I must add). These provisioning engines allow your to provision “stuff” (files, folders, lists, contenttypes, whatever) to SharePoint. Amongst other things, they have one thing in common: they’re built on top of the CSOM (Client Side Object Model) C# SDK. This means that you are forced to run them as a stand-alone task, or deploy a provider hosted app which includes having a server up and running somewhere. So what if you do not want that?  (more…)