Stop an AzureVM using Azure Automation with a schedule

Hello,

UPDATE: The script mentioned in this post is now here:
https://gallery.technet.microsoft.com/scriptcenter/Stop-Azure-VM-with-OrgID-41a79d91

In this blog I will show you how to use Azure Automation to schedule a PowerShell script that stops and deallocates a VM running in Azure.


The reason I am blogging this is that I spent a couple of days looking at other people's blogs and the information was not quite right. In particular, you no longer need to use a self-signed certificate from your Azure box.


The reason you might want to do this is to save some money: while your Azure VM is stopped and deallocated, you are not charged for it.


Firstly, I created a VM to play with called tempVMToStop as follows:



It required a username and password so I used my name. 

Once you have the VM you can remote desktop to it using the link at the bottom of the Azure portal and the username and password created in the previous step.


The next step is to add our automation script.

Now we go to automation in Azure:



Remember, the goal of this blog is to automatically stop the following VM.
First we will need to create a user in Azure Active Directory that is allowed to run our automation, as shown here:

Create the user to be used for automation:

Then go back into the automation section and choose Assets:


and add the automation user you just created here:
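
If you prefer scripting, the credential asset can also be created with Azure PowerShell instead of the portal. This is just a sketch, assuming an automation account called automationDemo and the automationuser account created above:

# Hypothetical alternative to the portal: create the credential asset from Azure PowerShell
$cred = Get-Credential   # enter the automation user's name (e.g. automationuser@yourtenant.onmicrosoft.com) and password
New-AzureAutomationCredential -AutomationAccountName "automationDemo" `
                              -Name "automationuser" `
                              -Value $cred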

This is reasonably new: previously you needed to create a self-signed certificate on your VM and import the .pfx file into an Asset => Credential, but this is no longer needed.

Now go to the automationDemo account and then choose Runbooks:

Click to create a new runbook:

Once it is created click on Author and write your script as follows:
workflow tempVMToStopRunBook
{
    Param
    (
        [Parameter(Mandatory=$true)]
        [String]
        $vmName,

        [Parameter(Mandatory=$true)]
        [String]
        $cloudServiceName
    )

    # Specify the Azure subscription name
    $subName = 'XXX - Base Visual Studio Premium with MSDN'

    # Use the credential asset we created earlier
    $cred = Get-AutomationPSCredential -Name "automationuser"

    Add-AzureAccount -Credential $cred

    Select-AzureSubscription -SubscriptionName $subName

    $vm = Get-AzureVM -ServiceName $cloudServiceName -Name $vmName

    Write-Output "VM NAME: $($vm.Name)"
    Write-Output "vm.InstanceStatus: $($vm.InstanceStatus)"

    # Only stop the VM if it is currently running
    if ($vm.InstanceStatus -eq 'ReadyRole') {
        Stop-AzureVM -ServiceName $vm.ServiceName -Name $vm.Name -Force
    }
}

Note that the subscription name shown as XXX - Base Visual Studio Premium with MSDN will need to be replaced with your own subscription name.

Also, the workflow name must be the same as the runbook name.

Save it and then you can choose to test it or just publish it. 
I will skip to publish as I have already tested it.

Once it is published you can click Start and enter values for the two parameters that the script is expecting:

    Param
    (
        [Parameter(Mandatory=$true)]
        [String]
        $vmName,

        [Parameter(Mandatory=$true)]
        [String]
        $cloudServiceName
    )
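
You can also start the published runbook from Azure PowerShell rather than from the portal, passing the two parameters as a hashtable. This is only a sketch; the cloud service name is an assumption based on my example VM:

# Sketch: start the published runbook with its two parameters
$params = @{
    vmName           = "tempVMToStop"
    cloudServiceName = "tempVMToStop"   # assumed to match the VM name in my example
}

Start-AzureAutomationRunbook -AutomationAccountName "automationDemo" `
                             -Name "tempVMToStopRunBook" `
                             -Parameters $params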

Now we want to see that our VM actually stops, so here was mine before:

Once you run it you will see some output when you click Jobs in the runbook:

And then if you look back at your VM it should be stopped:

Note that as we are totally deallocating the resources, the next time you start it up it will get a new IP address, but this will all be shown to you in the VM section of your portal.
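
If you would rather check the address from PowerShell than from the portal, the object returned by Get-AzureVM exposes it. A quick sketch using my example names:

# Check the address the VM came back with after restarting
# (the cloud service name is assumed to match the VM name here)
$vm = Get-AzureVM -ServiceName "tempVMToStop" -Name "tempVMToStop"
$vm.IpAddress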

The next step is obviously to schedule what we just did, and also to schedule a start script so we could, for example, stop our VM at the end of a business day and start it at 8am so it is ready for us to use in the morning.

This will save some money as the VM will not be using resources overnight.
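
The start runbook can mirror the stop runbook almost exactly. Here is an untested sketch that simply swaps the stop logic for Start-AzureVM, assuming the same credential asset and subscription name:

workflow tempVMToStartRunBook
{
    Param
    (
        [Parameter(Mandatory=$true)]
        [String]
        $vmName,

        [Parameter(Mandatory=$true)]
        [String]
        $cloudServiceName
    )

    # Same subscription and credential asset as the stop runbook
    $subName = 'XXX - Base Visual Studio Premium with MSDN'
    $cred = Get-AutomationPSCredential -Name "automationuser"

    Add-AzureAccount -Credential $cred
    Select-AzureSubscription -SubscriptionName $subName

    $vm = Get-AzureVM -ServiceName $cloudServiceName -Name $vmName

    # Only start the VM if it is currently stopped and deallocated
    if ($vm.InstanceStatus -eq 'StoppedDeallocated') {
        Start-AzureVM -ServiceName $vm.ServiceName -Name $vm.Name
    }
}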

Go back to the root of your automation and add a new asset for your schedule:

Here's one I created that will run the PowerShell runbook we created every day:
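
If you prefer to script the schedule rather than clicking through the portal, something along these lines should work with the Azure Automation cmdlets (the schedule name, start time and parameter values below are just my examples):

# Sketch: create a daily schedule and attach it to the stop runbook
New-AzureAutomationSchedule -AutomationAccountName "automationDemo" `
                            -Name "StopVmDaily" `
                            -StartTime (Get-Date).Date.AddDays(1).AddHours(18) `
                            -DayInterval 1

Register-AzureAutomationScheduledRunbook -AutomationAccountName "automationDemo" `
                                         -Name "tempVMToStopRunBook" `
                                         -ScheduleName "StopVmDaily" `
                                         -Parameters @{ vmName = "tempVMToStop"; cloudServiceName = "tempVMToStop" }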

That’s all there is to it. 

Note that I am no expert on Azure automation so all comments and constructive criticism are welcome.

thanks
Russ

Debugging a custom object using PowerShell in the Package Manager Console window

All,

I was just trying to find out what data my collection of custom objects was being hydrated with. The context doesn't matter, but for the record, I was hitting a Sitecore index and duplicates were being rendered in my UI.

I started using the immediate window but it had limitations that were preventing me from getting what I needed.

I wanted to see all of the hydrated objects whose names contained the word "test", hence the "test*" pattern.

Anyway, here is what I came up with:

# 1000 is a bit high so adjust it for your needs
for ($i = 0; $i -lt 1000; $i++) {
    $a = $dte.Debugger.GetExpression("results[$i].Name")

    if ($a.Value -match "test*") {
        Write-Host $a.Value
    }
}
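
Rather than hard-coding 1000, you could first ask the debugger for the size of the collection (assuming it exposes a Count property) and loop over that instead:

# Hypothetical variant: read the collection size from the debugger first
$count = [int]$dte.Debugger.GetExpression("results.Count").Value

for ($i = 0; $i -lt $count; $i++) {
    $a = $dte.Debugger.GetExpression("results[$i].Name")
    if ($a.Value -match "test*") {
        Write-Host $a.Value
    }
}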

Here are the results:

  • “testcampaignspeed73” 
  • “testcampaignspeed74”
  • “testcampaignspeed2”
  • “testcampaignspeed23”


And here is the PMC window:

I just wanted to put this out there for myself to remember and for anyone else who needs it.

thanks
Russ 

Add SendGrid email to SQL database mail

How to configure database mail with SendGrid and use it for SSIS agent jobs.

Note: This is more for me as a reminder for the future.

USE master;
EXECUTE sp_configure 'show advanced options', 1;
RECONFIGURE WITH OVERRIDE;
EXECUTE sp_configure 'Database Mail XPs', 1;
RECONFIGURE;

USE msdb;
EXECUTE msdb.dbo.sysmail_add_profile_sp
    @profile_name = 'EmailAdmin',
    @description = 'Profile for sending Automated DBA Notifications';

EXECUTE msdb.dbo.sysmail_add_account_sp
    @account_name = 'SendGridSQLAlerts',
    @description = 'Account for Automated DBA Notifications',
    @email_address = '',
    @display_name = 'SendGrid SQL Alerts',
    @mailserver_name = 'smtp.sendgrid.net',
    @username = '',
    @password = '',
    @port = 25;

EXECUTE msdb.dbo.sysmail_add_profileaccount_sp
    @profile_name = 'EmailAdmin',
    @account_name = 'SendGridSQLAlerts',
    @sequence_number = 1;

USE [msdb];
EXEC msdb.dbo.sp_set_sqlagent_properties @databasemail_profile = N'EmailAdmin';

EXEC msdb.dbo.sp_add_operator @name = N'EmailOperator',
    @enabled = 1,
    @pager_days = 0,
    @email_address = N'';
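
Before relying on this from Agent jobs, it is worth a quick sanity check that the SendGrid SMTP credentials actually work. A minimal PowerShell smoke test, with placeholder addresses, might look like this:

# Hypothetical smoke test of the SendGrid SMTP account used above
$cred = Get-Credential   # enter the SendGrid username and password
Send-MailMessage -SmtpServer "smtp.sendgrid.net" -Port 25 `
                 -Credential $cred `
                 -From "alerts@example.com" -To "you@example.com" `
                 -Subject "SendGrid SMTP test" -Body "Database Mail credential check"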

thanks
Russ

Using Azure PowerShell to attach to your Azure subscription and create a new website

Hello,

As I haven't posted for a while I wanted to just quickly post my findings on using Azure PowerShell.

My aim here, using only PowerShell, is to:

  • Connect to my Azure subscription
  • Create a website in Azure
  • Stop the website
  • Remove the website

Ok here goes.

Firstly you will need to install Azure PowerShell, so go here and work it out:

Then fire it up. It should look like this, and you can find it in your Windows Start menu.


Then we need to tell PowerShell about our Azure subscription:


We just type Add-AzureAccount and you will be prompted to log into Azure.
Look here for info on this command:

After that you will see some confirmation that it found your Azure account:


You may need to change the PowerShell execution policy before you can run things against Azure from your local machine:
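
For reference, a typical way to do that (run from an elevated PowerShell prompt; RemoteSigned is just one common choice) is:

# Allow locally created scripts to run; downloaded scripts must be signed
Set-ExecutionPolicy RemoteSigned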


So here is my Azure website list before I add a new site:


So next I run New-AzureWebsite -Name MyPowerShellTest:

You can see it looks like it did something and returned info about my new site. If you look in Azure you can see the new site:



Nice!

Ok so for fun, let's stop the site:
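
The cmdlet for this is Stop-AzureWebsite, so presumably something like the following:

# Stop (but do not delete) the website created above
Stop-AzureWebsite -Name MyPowerShellTest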



So it worked.

Now I'll delete the website completely using:
Remove-AzureWebsite -Name MyPowerShellTest

And it’s gone:


Well, you will have to trust me as I can't show you anything!

thanks
Russ

What I found in 4 minutes in Visual Studio 2013 in an MVC5 project

I have installed Visual Studio this morning and here are a few initial things I have found.

I used the following version:

Firstly, after installation it asked me a couple of questions, including what colour scheme I would like to use for Visual Studio. As you can see, I chose DARK.
I thought this was a nice little welcome surprise instead of me burning my eyes out for weeks until I realised I could change the colour scheme.

Secondly, and I'm not sure if this is a good thing, I am logged into MSDN through Visual Studio:

I suppose it feels more personalised, but let's see what happens down the track!

Another thing I noticed straight away is a little hint above methods that shows you how many times that method is referenced:

When you click on this it shows you the referenced code:

Another thing I think I like is that the old ASP.NET membership provider has been replaced with claims-based identity. I suppose it's still a custom database supplied by Microsoft but, as I remember from having used claims identity before (https://github.com/brockallen/BrockAllen.MembershipReboot), it should be more loosely coupled and allow you to write cleaner code without being totally locked inside the membership provider. As far as I remember, I was using dependency injection and the OLD membership provider kept making me write ugly code.

They have also added some controller example tests by default. This is great, as I remember when I started writing controller tests I did not have any examples.
This should motivate developers to write more tests from the start, which is good.

I have always hated the Visual Studio test GUI, but now it looks a little bit better and is more like ReSharper:

I am not sure if this was here before, but there seems to be a new notification window to give you updates on what is new; for example, an update to the NuGet package manager.

Code coverage is also included which is something I would have needed to pay for before:

The final thing I found was that when I was in a controller, for example, I could right-click and select Code Map:

And then I can drag classes onto the canvas to see their relationships:

That’s my 2 cents worth and that’s all for now.

thanks
RuSs