Using the On-Premises Data Gateway to process data with Logic Apps


This post outlines how to:
  • Set up the on-premises data gateway on an on-premises machine (I will be using my laptop.
    Note: You shouldn’t install a gateway on a computer, such as a laptop, that may be turned off, asleep, or not connected to the Internet, because the gateway can’t run under any of those circumstances. In addition, gateway performance might suffer over a wireless network.)
  • Build a Logic App that consumes Azure Service Bus queue messages and sends them as text files, through the On-Premises Data Gateway, to a folder on my local machine.

Setup the On-Premises Data Gateway

  • The gateway runs as a Windows service and, as with any other Windows service, you can start and stop it in multiple ways. For example, you can open a command prompt with elevated permissions on the machine where the gateway is running, and then run either of these commands:
    • To stop the service, run this command:
    • To start the service, run this command:
  • Here is the Windows service that was installed:
  • Note that I changed the Windows service to run as a newly created Windows account (Administrator), as this was the only way I could get the Logic App to authenticate with my local service:
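The stop/start commands themselves are the standard service-control pair; assuming the gateway service kept its default name of PBIEgwService (check services.msc if yours differs), they would be something like:

```bat
rem assuming the default gateway service name PBIEgwService
net stop PBIEgwService
net start PBIEgwService
```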

  • You may need to ensure your firewall is open. Run the following in PowerShell in administrator mode:
    Test-NetConnection -ComputerName -Port 9350

    And here is the outcome:

  • See this link for other firewall options and whitelisting:

  • Log onto Azure with the same account you used to set up the on-premises gateway, and create an on-premises data gateway resource in Azure, connecting it to the one on your machine using the drop-down at the bottom of the next picture:
  • Here is my newly created on-prem-gateway:
  • Now go and create a local Windows share. I just opened it up to “everyone” for the purposes of my exercise:
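If you would rather script the share than click through Explorer, something like this from an elevated command prompt should do it (the folder path and share name below are made up for this sketch):

```bat
rem hypothetical path and share name; "everyone" gets full access, as in my demo
mkdir C:\GatewayDrop
net share GatewayDrop=C:\GatewayDrop /GRANT:Everyone,FULL
```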

Setup the Logic App

  • Create the logic app in Azure:
  • I used the peek-lock and complete template to make it easier:
  • Note that I have already created a service bus namespace and queue for this demo.
  • Now add the file connector in and tell it to use the on-premises-data-gateway:

    NOTE: The above gateway name 'MondayOnPremGW' does not match the
    name of the gateway created at the top of this demo. I had hours of trouble
    getting this setting to connect. In the end, even though I had uninstalled
    the on-premises gateway many times and cleaned up the resources in Azure
    many times, I could still see a list of on-premises data gateways in Azure:
    So, finally, I just selected one that had worked for me a few weeks ago.
    In theory, you should not have the same issue if you create one gateway and then use that one.
    I suspect there are some remnants from previous installs left locally.

Send some messages to the service bus

  • I have some C# code to send a number of messages to the service bus.
  • For the purposes of this post I will disable the Logic App and send 10 messages to the service bus so they are visible:
  • And the content of 1 message:
  • After enabling the Logic App, the messages in the service bus queue are consumed:
  • Here is a sample of 1 successful Logic App run:
  • And the resulting files on my local file system:
  • And here is the content of a file created from a service:


As mentioned above, I had hours of trouble getting the Logic App on-premises file connector to connect to my machine.
In the end, even though I had uninstalled the on-premises gateway many times and cleaned up the resources in Azure many times, I could still see a list of on-premises data gateways in Azure. So, finally, I just selected one that had worked for me a few weeks ago.
In theory, you should not have the same issue if you create one gateway and then use only that one.
I suspect there are some remnants from previous installs left locally.

We also saw quite a lot of throttling, and have been informed by Microsoft that a less restrictive throttling model is on the way soon:

The average run time was not too good, but this might be due to me being on a Wi-Fi network:

Final Thoughts

Overall it was easy to set up and work with, apart from the issue mentioned above. I think more work may need to be done on the on-premises gateway installer, as it seems to hold onto data between uninstalling and reinstalling.

How to auto-heal your web app so you hopefully get your weekends back


This document outlines how to set up auto-healing for your web app. At times you may be enjoying your weekend and get a call telling you an app is constantly running at 100% CPU. 
Autoheal is a way to automatically recycle your app pool based on certain triggers.
I have created a POC web app with 3 instance VMs. I have turned on all diagnostic logs and set them to verbose.
Here is the Event Log before enabling auto-heal in the web.config (there were more records but I removed them as they were all similar):

And here is the new web.config. This will recycle the app pool when we get 20 or more requests in a period of 30 seconds:
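For reference, the auto-heal section of that web.config looks roughly like this (a sketch based on the documented auto-heal schema, matching the 20-requests-in-30-seconds trigger described above):

```xml
<configuration>
  <system.webServer>
    <!-- recycle the process when 20 or more requests arrive within 30 seconds -->
    <monitoring>
      <triggers>
        <requests count="20" timeInterval="00:00:30" />
      </triggers>
      <actions value="Recycle" />
    </monitoring>
  </system.webServer>
</configuration>
```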

So, once this is deployed, I will call the endpoint 100 times in less than 30 seconds, which should cause the trigger to run its action – Recycle.
Here is the calling PowerShell code (this just simulates a client):

$URL = '';

$contentType = "application/json";
$headers = @{};

for ($i = 1; $i -le 100; $i++)
{
    Write-Host ('I is: ' + $i);
    $response = Invoke-RestMethod -Method Get -Uri $URL -Headers $headers -ContentType $contentType;
}

After I ran 100 requests in less than 30 seconds you can see in the event log that the instance RD0003FF857039 recycled:

When I really loaded it up, all 3 instances auto-healed:


Monitor your Azure API Management Instance with PowerBI



This document outlines the steps involved to monitor your Azure API management instance with PowerBI.

Note: I will refer to Azure API management as APIM in this document.

Steps will include:

  • Add a logger, using the APIM REST API, to your APIM instance to send events to an event hub
  • Set up a Stream Analytics job – it consists of one or more input data sources, a query expressing the data transformation, and one or more output targets that results are written to. Together these enable you to perform data analytics processing for streaming data scenarios
  • Build a PowerBI dashboard to see your APIM data in a format that suits your business requirements.

Adding a logger to APIM

The first thing you need to do is add a logger to your APIM instance using the APIM REST API. I will use Postman to do this.

Firstly, in your APIM instance, enable the REST API:

Secondly, go to the bottom of the security page where you enabled the REST API and generate a shared access key:

You will use this later in Postman, after we first create an Azure Event Hub.

Create an Event Hub

Go into your Azure portal and create an event hub.

Ensure you create two Event Hub shared access policies: one for sending to the event hub and one for receiving. This gives you more granular control over your hub.
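If you prefer scripting to clicking through the portal, the (newer) Azure CLI can create the hub and both policies; the resource names here are made up for illustration:

```shell
# hypothetical resource group, namespace and hub names
az eventhubs eventhub create --resource-group my-rg --namespace-name my-ns --name apim-events
# one policy for sending (used by the APIM logger) ...
az eventhubs eventhub authorization-rule create --resource-group my-rg --namespace-name my-ns \
    --eventhub-name apim-events --name send-policy --rights Send
# ... and one for receiving (used by listeners such as Stream Analytics)
az eventhubs eventhub authorization-rule create --resource-group my-rg --namespace-name my-ns \
    --eventhub-name apim-events --name listen-policy --rights Listen
```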

Creating the logger in Postman

Now that you have an Event Hub, the next step is to configure a Logger in your API Management service so that it can log events to the Event Hub.
API Management loggers are configured using the API Management REST API.
To create a logger, make an HTTP PUT request using the following URL template.

https://{your service}.management.azure-api.net/loggers/{new logger name}?api-version=2014-02-14-preview

Replace {your service} with the name of your API Management service instance.
Replace {new logger name} with the desired name for your new logger. You will reference this name when you configure the log-to-eventhub policy.

Add the following headers to the request.
Specify the request body using the following template.

"type" : "AzureEventHub",
"description" : "Sample logger description",
"credentials" : {
"name" : "Name of the Event Hub from the Azure Classic Portal",
"connectionString" : "Endpoint=Event Hub Sender connection string"

 Here is mine:

You will see that it returned 201 Created, which means we now have a logger.
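If you are not a Postman user, the equivalent request can be made with curl; the service name, logger name, hub name and SAS token below are all placeholders (the Authorization header value is the shared access key generated on the APIM security page earlier):

```shell
# placeholders: myservice, my-logger, my-event-hub, <sas-token>
curl -X PUT \
  "https://myservice.management.azure-api.net/loggers/my-logger?api-version=2014-02-14-preview" \
  -H "Authorization: SharedAccessSignature <sas-token>" \
  -H "Content-Type: application/json" \
  -d '{
        "type": "AzureEventHub",
        "description": "Sample logger description",
        "credentials": {
          "name": "my-event-hub",
          "connectionString": "Endpoint=<Event Hub Sender connection string>"
        }
      }'
```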

Now we go into APIM and add a policy on the built-in Echo API.

This will send an event to our event hub whenever the 'Retrieve resource' operation of the API is hit.
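The policy itself is a log-to-eventhub element that references the logger created earlier; a minimal sketch (the logger-id and the fields being logged are illustrative):

```xml
<policies>
  <inbound>
    <base />
    <!-- "my-logger" must match the logger name created via the REST API -->
    <log-to-eventhub logger-id="my-logger">
      @( string.Join(",", DateTime.UtcNow, context.RequestId, context.Request.IpAddress, context.Operation.Name) )
    </log-to-eventhub>
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
  </outbound>
</policies>
```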

View the Event Hub Events

If you want to view the event hub events for your own sanity check then download Service Bus Explorer and listen to your event hub:

You will notice that the event hub data shown in the listener is the same data written by our APIM policy.

Create Stream Analytics Job to send data to PowerBI

Next, go to your Azure portal and create a new stream analytics job.

Once it is created, create an input:

And a PowerBI output. Note that you will be prompted to authorize your PowerBI account.

Also create a query to transform the data. My query doesn't do anything special:
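A simple pass-through query is all that is needed here; assuming the input alias is eventhub-input and the output alias is powerbi-output (names made up for this sketch):

```sql
-- pass every event hub record straight through to the PowerBI output
SELECT *
INTO [powerbi-output]
FROM [eventhub-input]
```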

Then start your stream analytics job.

Now go back to your APIM instance and hit the API endpoint a few times. Make sure it is the API operation with the policy on it.

Also, change param1 and param2 on the operation to a few different values so we get somewhat useful data:

Now look at the trace for that operation in APIM and you will see the log to event hub event has fired:

Now log into PowerBI on the web and you will see (hopefully) a new streaming dataset:

Create a dashboard

We will create a PowerBI dashboard to visualise our APIM data using param1 and param2.

I will drag a pie chart onto the workspace and set the following: all I did was add param2 as a count. As you can see, we get a great visualisation of the number of times param2 was used on the APIM operation.

So you can see that I sent a request with param2=5 a lot more times than I did for other calls.

Obviously you can use your imagination as to what you can use this for.

Here we see a Tree Map, Pie Chart and Funnel displaying data from my APIM. The funnel shows distinct calls by IP address.


Error WAT200: No default service configuration "ServiceConfiguration.cscfg" could be found in the project.


When deploying a cloud service / web role you may experience the following message:

2016-04-27T08:16:41.2709354Z ##[error]C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v14.0\Windows Azure Tools\2.9\Microsoft.WindowsAzure.targets(373,5): Error WAT200: No default service configuration “ServiceConfiguration.cscfg” could be found in the project.

On your VSTS build screen:

It means the build is basically looking for a default .cscfg file because we haven't specified an alternative.

In my build I went and set the Target Profile to match the build configuration, and this fixed the issue.

So it will be looking for one of the files shown below that matches the currently building BuildConfiguration:

And after those changes it all builds perfectly:


Moving the NuGet packages folder to a different location (1 level higher)


Note: thanks to this post by Sebastian Belczyk for some help:

This morning I needed to relocate my NuGet packages folder one level higher to allow an offshore team to work in parallel with us. Basically, the original solution was cloaked and branched. In the original solution we realised we had two places for NuGet packages, so we had to merge them into one. This meant that the branched solution needed to match the original, otherwise we would have had lots of reintegration issues.

Firstly, I needed to update my nuget.config file. We had our packages located here:
But we really needed them here:
../../packages/ – up another level.
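The actual change is the repositoryPath setting in nuget.config, roughly:

```xml
<configuration>
  <config>
    <!-- was ../packages/ ; now one level higher -->
    <add key="repositoryPath" value="../../packages" />
  </config>
</configuration>
```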

Then I needed to close the solution and reopen it for this change to take effect. I found this out because changing the file without restarting meant that nothing changed and my packages were still restoring to the old ../packages/ directory.

Then right-click and choose Manage NuGet Packages for Solution…

Then Visual Studio will detect that there is no NuGet packages folder on the file system at the location specified:

It will load them when you click restore:

So in summary, I had this file structure:
— Project
—- packages Folder

And now I have this structure:
— packages Folder
— Project


Move your IIS App pools from one machine to another


Today I got a new machine and didn’t want to have to set my IIS up again.

So I ran the following from a command prompt run as administrator:

%windir%\system32\inetsrv\appcmd list apppool /config /xml > c:\apppools.xml

This created an XML file with all my app pools in it. I deleted the 6 nodes that held the default IIS app pools:

  • DefaultAppPool
  • Classic .NET AppPool
  • .NET v2.0 Classic
  • .NET v2.0
  • .NET v4.5 Classic
  • .NET v4.5
Then I went to my new machine after copying over the edited apppools.xml file and ran this command:

%windir%\system32\inetsrv\appcmd add apppool /in < c:\apppools.xml

And all my app pools were added:

To export all my sites I ran this:

%windir%\system32\inetsrv\appcmd list site /config /xml > c:\sites.xml

It exported all my sites to an XML file. I edited this and removed the Default Web Site, as it was already present on the destination machine.

Then I ran this on the destination machine:

 %windir%\system32\inetsrv\appcmd add site /in < c:\sites.xml

And it imported all my sites into IIS:

That is all,


Fix already installed NuGet packages


Sometimes I create a branch and for some reason one or two projects' packages, and hence DLLs, are wrong. I even try using “Restore NuGet Packages” at solution level from the context menu, but this doesn't always fix it.

The best fix I have found is this:
Update-Package PackageName -ProjectName MyProject -reinstall

So if you have Entity Framework 6.1.2 installed and it got broken in a branch, then you could run:

Update-Package EntityFramework -ProjectName MyProject -reinstall

And it will reinstall EF 6.1.2.

That is all,