Using the On-Premises Data Gateway to process data with Logic Apps


This post outlines how to:
  • Set up the on-premises data gateway on an on-premises machine (I will be using my laptop.
    Note: You shouldn’t install a gateway on a computer, such as a laptop, that may be turned off, asleep, or not connected to the Internet, because the gateway can’t run under any of those circumstances. In addition, gateway performance might suffer over a wireless network.)
  • Build a Logic App that consumes Azure Service Bus queue messages and sends them as text files, through the On-Premises Data Gateway, to a folder on my local machine.

Set up the On-Premises Data Gateway

  • The gateway runs as a Windows service and, as with any other Windows service, you can start and stop it in multiple ways. For example, you can open a command prompt with elevated permissions on the machine where the gateway is running, and then run either of these commands:
    • To stop the service, run this command:
    • To start the service, run this command:
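    Assuming the default service name from a standard gateway install (PBIEgwService is my assumption here; check services.msc if yours differs), the two commands would be:

    ```
    net stop PBIEgwService
    net start PBIEgwService
    ```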
  • Here is the Windows service that was installed:
  • Note that I changed the Windows service to run as a newly created Windows account (Administrator), as this was the only way I could get the Logic App to authenticate with my local service:

  • You may need to ensure your firewall is open. Run the following in PowerShell in administrator mode, substituting the host you want to test against (port 9350 is one of the Service Bus relay ports the gateway uses):
    Test-NetConnection -ComputerName <servicebus-host> -Port 9350

    And here is the outcome:

  • See this link for other firewall options and whitelisting:

  • Log onto Azure with the same account you used to set up the on-premises gateway and create an on-premises data gateway resource in Azure, connecting it to the one on your machine using the drop-down at the bottom of the next picture:
  • Here is my newly created on-prem-gateway:
  • Now go and create a local Windows share. I just opened it up to “everyone” for the purposes of my exercise:

Set up the Logic App

  • Create the logic app in Azure:
  • I used the peek-lock and complete template to make it easier:
  • Note that I have already created a service bus namespace and queue for this demo.
  • Now add the file connector and tell it to use the on-premises data gateway:

    NOTE: The above gateway name 'MondayOnPremGW' does not match the
    name of the gateway created at the top of this demo; the reason is
    explained at the end of this post.

Send some messages to the Service Bus

  • I have some C# code to send a number of messages to the service bus.
  • For the purposes of this post I will disable the LogicApp and send 10 messages to the service bus so they are visible:
  • And the content of 1 message:
  • After enabling the Logic App, the messages in the service bus queue are consumed:
  • Here is a sample of 1 successful Logic App run:
  • And the resulting files on my local file system:
  • And here is the content of a file created from a service bus message:
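The sender mentioned above is essentially a loop over QueueClient.Send. Here is a minimal sketch, assuming the classic Microsoft.ServiceBus.Messaging client; the queue name and connection string are placeholders:

```csharp
using Microsoft.ServiceBus.Messaging;

class SendDemoMessages
{
    static void Main()
    {
        // Placeholders: use your own namespace connection string and queue name
        string connectionString = "<service bus connection string>";
        QueueClient client = QueueClient.CreateFromConnectionString(connectionString, "demoqueue");

        // Send 10 small text messages, matching the demo above
        for (int i = 1; i <= 10; i++)
        {
            client.Send(new BrokeredMessage("Message number " + i));
        }

        client.Close();
    }
}
```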


As mentioned above, I had hours of trouble getting the Logic App on-premises file connector to connect to my machine.
In the end, even though I had uninstalled the on-premises gateway many times and cleaned up the resources in Azure many times, I could still see a list of on-premises data gateways in Azure. So, finally, I just selected one that had worked for me a few weeks ago.
In theory, you should not have the same issue if you create one gateway and use only that one.
I suspect there are some remnants from previous installs left locally.

We also saw quite a lot of throttling and have been informed by Microsoft that a less limited model is on the way soon:

The average run time was not great, but this might be due to me being on a Wi-Fi network:

Final Thoughts

Overall it was easy to set up and work with, apart from the issue mentioned above. I think more work may need to be done on the on-premises gateway installer, as it seems to hold onto data between uninstalling and reinstalling.

Error WAT200: No default service configuration "ServiceConfiguration.cscfg" could be found in the project.


When deploying a cloud service / web role you may experience the following message:

2016-04-27T08:16:41.2709354Z ##[error]C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v14.0\Windows Azure Tools\2.9\Microsoft.WindowsAzure.targets(373,5): Error WAT200: No default service configuration “ServiceConfiguration.cscfg” could be found in the project.

On your VSTS build screen:

It basically means the build is looking for a default .cscfg file, as we haven’t specified an alternative.

In my build I went and set the Target Profile to match the build configuration and this fixed the issue.

So it will be looking for one of the files shown below that matches the currently building BuildConfiguration.
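For a standard cloud service project, those per-profile configuration files follow a naming convention like this (these are the usual default profile names; yours may differ):

```
ServiceConfiguration.Cloud.cscfg
ServiceConfiguration.Local.cscfg
```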

And after those changes, everything builds perfectly:


Generate Entity Framework update scripts from migrations

This is how you generate Entity Framework update scripts from migrations.

Note: this is a very simplified post that doesn’t generate a very complicated database script.

So you already have an initial database migration in your project. If you don’t, go and Google how to get started.

I’ll start by generating an SQL script for my initial migration.

Here is part of my initial migration in C#:

I will now generate the script for this by running this command in the Package Manager Console:

Update-Database -Script -SourceMigration: $InitialDatabase -TargetMigration: Initial

Make sure you select the correct Default Project in the dropdown shown in the above picture.

Here is the SQL script:
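Generated initial-migration scripts follow a predictable shape, roughly like this (the table and column names here are illustrative, not the ones from my project):

```sql
CREATE TABLE [dbo].[Users] (
    [Id] [int] NOT NULL IDENTITY,
    [Name] [nvarchar](max),
    CONSTRAINT [PK_dbo.Users] PRIMARY KEY ([Id])
)
-- followed by an INSERT into [dbo].[__MigrationHistory] recording the migration
```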

Now I will update my model with a new property:

I then ran the following to create my new C# migration:

Add-Migration AddedAProperty -StartUpProjectName User.DbResourceAccess

Which created this new C# file:
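For context, a migration generated for a single new property typically has this shape (the class, table, and column names here are illustrative assumptions, not the ones from my project):

```csharp
using System.Data.Entity.Migrations;

public partial class AddedAProperty : DbMigration
{
    public override void Up()
    {
        // Adds the column backing the new model property
        AddColumn("dbo.Users", "NewProperty", c => c.String());
    }

    public override void Down()
    {
        // Reverses the migration
        DropColumn("dbo.Users", "NewProperty");
    }
}
```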

Next I will run this:

Update-Database -Script -SourceMigration: $InitialDatabase -TargetMigration: AddedAProperty

Which created the following script:

You could then apply this to a production database for example.
I’m not sure you would want to insert into a __MigrationHistory table on production though.


See your Azure VM deployment succeed or fail!


Yesterday we were having deployment issues due to an Azure WebRole startup task (more on that in my next post.)

We rolled back the changes and all was fine, but I wanted to find the information that was logged on the server so I can troubleshoot in the future if it happens again.

I just did a fresh deployment as a baseline to prove that I was working with a successful deployment.

As it was deploying, I could see log records appearing here when logged onto my Cloud Service VM:

As this was a successful deployment I could see messages in the above mentioned Windows Azure Event log showing that nothing went wrong.

I could see the log message stating that the web site installed into IIS successfully.
I could see the successful OnStart() and Run() events.

Here are some screen shots:

Note that if we had diagnostics turned on, we could probably see the same information inside the Visual Studio Server Explorer for our cloud service.

Not very useful when everything goes well. I’ll post more when and if I get a failed deployment.


An Alternative Way to Remotely Debug Azure Web Sites


I have been having trouble connecting to my Azure instances with the normal attach to debugger method:

This never works for me, even when debugging is enabled.

Here is a link to a way that works, and it worked the first time:

I only tested with an Azure Web App, so I’m not sure about WCF and services yet.
(More on this later)


Stop an AzureVM using Azure Automation with a schedule


UPDATE: The script mentioned in this post is now here:

In this blog I will show you how to use Azure Automation to schedule a PowerShell script to stop and deallocate a VM running in Azure.

The reason I am blogging this is that I have spent a couple of days looking at other people’s blogs and the information seems to be not quite correct. In particular, a self-signed certificate from your Azure box is no longer required.

The reason you might want to do this is to save some money: when your Azure VM is stopped and deallocated, you will not be charged for it.

Firstly, I created a VM to play with called tempVMToStop as follows:

It required a username and password so I used my name. 

Once you have the VM you can remote desktop to it using the link at the bottom of the Azure portal and the username and password created in the previous step.

The next step is to add our automation script.

Now we go to automation in Azure:

Remember, the goal of this blog is to automatically stop the following VM:
First we will need to create a user that is allowed to run our automation in Azure Active Directory, as shown here:

Create the user to be used for automation:

Then go back into the automation section and choose Assets:

and add the automation user you just created here:

This is reasonably new: previously you needed to create a self-signed certificate on your VM and import the .pfx file into an Asset => Credential, but this is no longer needed.

Now go to the automationDemo and then choose Runbooks:

Click to create a new runbook:

Once it is created click on Author and write your script as follows:
workflow tempVMToStopRunBook
{
    param (
        [string]$cloudServiceName,
        [string]$vmName
    )

    # Specify Azure Subscription Name
    $subName = 'XXX - Base Visual Studio Premium with MSDN'

    # Authenticate using the automation user credential asset
    $cred = Get-AutomationPSCredential -Name "automationuser"
    Add-AzureAccount -Credential $cred
    Select-AzureSubscription -SubscriptionName $subName

    # Stop and deallocate the VM only if it is currently running
    $vm = Get-AzureVM -ServiceName $cloudServiceName -Name $vmName
    Write-Output "VM NAME: $vm"
    Write-Output "vm.InstanceStatus: $($vm.InstanceStatus)"
    if ($vm.InstanceStatus -eq 'ReadyRole') {
        Stop-AzureVM -ServiceName $vm.ServiceName -Name $vm.Name -Force
    }
}

Note that the subscription name shown as 'XXX - Base Visual Studio Premium with MSDN' will need to be replaced by your own subscription name.

Also, the workflow name must be the same as the runbook name.

Save it and then you can choose to test it or just publish it. 
I will skip to publish as I have already tested it.

Once it is published you can click Start and enter the two parameters that the script is expecting ($cloudServiceName and $vmName):



Now we want to see that our VM stops so here was mine before:

Once you run it you will see some output when you click Jobs in the runbook:

And then if you look back at your VM it should be stopped:

Note that as we are totally deallocating the resources, the next time you start it up it will get a new IP address, but this will all be shown to you in the VM section of your portal.

The next step is obviously to schedule what we just did, and also to schedule a start script, so we could, for example, stop our VM at the end of a business day and start it in the morning at 8am so it is ready for us to use.

This will save some money as the VM will not be using resources overnight.
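A start runbook would mirror the stop script above. Here is a sketch of what it might look like (the runbook name and the 'StoppedDeallocated' status check are my assumptions; test before relying on it):

```powershell
workflow tempVMToStartRunBook
{
    param (
        [string]$cloudServiceName,
        [string]$vmName
    )

    # Same authentication pattern as the stop runbook
    $subName = '<your subscription name>'
    $cred = Get-AutomationPSCredential -Name "automationuser"
    Add-AzureAccount -Credential $cred
    Select-AzureSubscription -SubscriptionName $subName

    # Start the VM only if it is currently stopped and deallocated
    $vm = Get-AzureVM -ServiceName $cloudServiceName -Name $vmName
    if ($vm.InstanceStatus -eq 'StoppedDeallocated') {
        Start-AzureVM -ServiceName $vm.ServiceName -Name $vm.Name
    }
}
```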

Go back to the root of your automation and add a new asset for your schedule:

Here’s one I created that will run the PowerShell script we created every day:

That’s all there is to it. 

Note that I am no expert on Azure automation so all comments and constructive criticism are welcome.


ASP.NET Membership System.Web.Providers: Can't log in to my site on Azure but can locally.

Hi there,
I had an interesting issue last night. I deployed my local Azure emulated MVC 3 role and worker role to my Windows Azure instance. All worked perfectly locally; in particular, the logon page I built allows me to log onto the secure area of my MVC 3 site using System.Web.Providers (ASP.NET Membership).
I found an issue when I tried to log onto my remote Azure site using my deployed logon view.
The logon on the server failed and told me my user name or password was wrong.
Ok, so I then pointed my local ASP.NET configuration tool (the built-in tool for setting up members and roles) at my Azure SQL database and tried to manage my members from there, and it worked.
Even when I changed my local connection string to point to my Azure SQL database and tried logging on, it worked. WEIRD!!! (i.e. local browser, remote Azure database).
So then today I found this blog:
and then this one:
The part of the post in particular that caught my attention was this:

I tracked it down, thanks to some info in this article by David Hoerster. The problem is that the default password hashing algorithm on Azure is different from the .NET 4.0 defaults. It is set to SHA1 on Azure, and HMACSHA256 is the new standard setting on 4.0.

The rest of the post is here:
This can be fixed by specifying the hash type explicitly in web.config. If you decide to use a method like HMACSHA256, make sure you also specify a machine key – otherwise you will run into similar problems as the autogenerated machine key will differ from server to server.
The configuration element you need to change is under <system.web>:
<machineKey decryptionKey="PUT_DECRYPTION_KEY_HERE" 
validation="HMACSHA256" />
You can use this machine key generator to generate random keys in the proper format.
I will go home tonight and try this and update this post in the morning… time passes… ok, so I went home last night and tried a few things. My password hashing algorithm was different between my local machine and Azure. I just used the Azure machine config setting on my local machine and it all works fine now. Good times!!!
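For reference, pinning the hash algorithm in web.config looks something like this (SHA1 here matches the Azure default mentioned above; treat this as a sketch of the setting rather than a recommendation):

```xml
<system.web>
  <!-- Force the membership password hash algorithm so local and Azure agree -->
  <membership hashAlgorithmType="SHA1">
    <!-- existing providers unchanged -->
  </membership>
</system.web>
```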

My weekend (I’m a geek) of deploying my new Kite Boarding site to Windows Azure with an Azure SQL database, membership and blob storage

Good Morning,
I just thought I’d post to my blog after a long period of being too lazy to do it and also being too busy at work to get around to it.
I spent part of the weekend deploying my latest side project to Azure. I won’t tell you what it’s all about until I have it all polished and shiny as right now it’s a definite work in progress.
What I have deployed:
  • MVC 3, Html5, Razor website – I’m thinking about moving to MVC 4 as I think the beta just came out.
  • The site runs off Entity Framework 4.3, and I used code-first POCO classes with DbContext.
  • Users and Roles are managed by 
  • My custom data is stored in an Azure SQL relational database.
I posted the following questions on MSDN and have now got the answers I need, so I’ll update the questions with answers. As I was an Azure newbie two weeks ago, these questions now seem a bit lame to me!!


  • I have used an Azure SQL relational database for my database. They talk a lot about table storage. I assume I have done the right thing in using a relational database?
    Ok, so this one was easy. Whilst I could probably use table storage to store my data, I am not ready to ditch my relational database way of thinking. My site will have many tables with relationships, so for now I am playing it safe and using SQL Azure. I have, however, used blob storage to store the gallery images I need for my site. So, and I may be wrong here, if I had a list of people who each had a picture attached to their relational database record, I think I could store the blob URL in a field in their record and, when rendering, iterate the records and retrieve the blob image from my storage account.
  • I have used the membership provider to manage users and roles. Once I deployed to Azure, using the same database in my web.release.config for my custom data AND the membership tables, it all works great. I can register a new user using the standard MVC 3 register user view and I can see this new user in my Users table inside the ASP.NET membership tables.
    My question here then (and I have Googled this but can’t find it) is: how do I manage my users and roles on the Azure server as I would when I use the asp.netwebadminfiles/default.aspx tool in Visual Studio? Do I write my own custom membership management code to do this (I plan to do this anyway down the line), or can I use an instance of asp.netwebadminfiles/default.aspx on the server somewhere?
    I figured this one out too. Although there are ways to use the netwebadminfiles tool to manage users, and there is also third-party code that can do this for you, ultimately I want to manage these users myself. I will build all my user and role code into the site.
  • When I deployed for the first time, it created my web service role on Azure AND my SQL database mentioned above. Let’s say I have the site live for a few months and want to redeploy: will my membership and custom data get wiped out? What if I deploy schema changes to my database, will this be taken care of without affecting my data? I know I can use a one-way data sync to get the data onto local, but I’m sure I would not have to worry about this?
    Not 100% sure on this one, but I now know that, firstly, I don’t have to redeploy / publish the whole site, storage and database every time I make a change. I can use Visual Studio WebDeploy to push new changes to my site. If I needed to sync data from my local machine, I think I could use the Data Sync in the Azure portal, and I assume I could manage my users via SQL membership stored procs until I get all my membership screens written.
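The blob-URL-in-a-record idea from the first answer could be modelled like this (the class, property, and example values are my own illustrations):

```csharp
// A relational record that points at its image in blob storage.
public class Person
{
    public int Id { get; set; }
    public string Name { get; set; }

    // Full URL of the picture blob (e.g. taken from the blob's Uri after
    // upload), stored alongside the relational data and used at render time.
    public string PictureBlobUrl { get; set; }
}
```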